End Users Should Define the CEP Market.

My friend Opher mistakenly thought I was referring to him when I related the story of the fish, as he replied with CEP and the Story of the Captured Traveller.

I must not have related the fish story very well, because to understand the story of the fish is to know that we are all like the fish in certain aspects of life, and there is nothing negative to be gleaned from the story.

However, on Opher’s point about CEP, I disagree. Just because the marketing people (not the market) have misdefined CEP, and the vendors have therefore drifted from the technology described in Dr. Luckham’s original CEP work, including his CEP book, we should not change the meaning of CEP. So I don’t agree that we should redefine CEP, as David envisioned it, as Intelligent Event Processing (IEP) simply because CEP, as today’s software vendors sell it, is really SEP (or whatever!). Please recall that David’s background at Stanford was AI, and he did not define CEP the way the software vendors have defined it either!

The fact of the matter is that the software marketing folks have decided to use Dr. Luckham’s book to sell software that does not perform as Dr. Luckham described or envisioned! I make no apologies for being on the side of end users who actually need to solve complex problems, not on the side of those selling software that underperforms.

As I mentioned, this positioning and repositioning does not help solve complex problems. At the end of the day, we have problems to solve, and the software community is not very helpful when it consistently places form over substance.

Furthermore, most customers are saying, time and time again: “So what? These COTS event processing platforms with simple joins, selects, and rules do not solve my complex event processing problems. We already have similar approaches, on which we have spent millions of dollars, and they do not work well.”

In other words, the market is crying out for true COTS CEP solutions, but the software community is not yet delivering. By the way, this is nothing new. In my first briefing to the EP community, in January 2006, I mentioned that CEP requires stating the business problem, or domain problem, and then selecting the method or methods that best solve it.

To date, the CEP community has not done this, because it has no COTS tool set other than SEP engines (marketed as either ESP engines or CEP engines; at least ESP was closer to being technically accurate).
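To make concrete what “simple joins, selects, and rules” means in practice, here is a minimal sketch of the kind of SEP-style query these engines offer. The event fields, streams, and the 10,000 threshold are invented purely for illustration; they do not come from any particular vendor’s product:

```python
# Toy sketch of SEP-style event processing: a select (filter),
# a keyed join of two event streams, and a simple threshold rule.
# All field names and the threshold are hypothetical.

def select(events, predicate):
    """SELECT: keep only events matching a predicate."""
    return [e for e in events if predicate(e)]

def join(left, right, key):
    """JOIN: pair up events from two streams that share a key value."""
    index = {}
    for e in right:
        index.setdefault(e[key], []).append(e)
    return [(l, r) for l in left for r in index.get(l[key], [])]

# Two hypothetical event streams keyed by account id.
trades = [{"acct": "A1", "amount": 15_000}, {"acct": "A2", "amount": 200}]
logins = [{"acct": "A1", "country": "TH"}, {"acct": "A3", "country": "US"}]

# Rule: flag large trades on accounts that also produced a login event.
large = select(trades, lambda e: e["amount"] > 10_000)
alerts = join(large, logins, "acct")
print(alerts)
```

Queries like this handle well-defined, static patterns perfectly well; the argument above is that noisy, drifting detection problems need considerably more than this.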

Experienced end users are very intelligent. 

These end users know the complex event processing problems they need to solve, and they know the limitations of the current COTS approaches marketed by the CEP community. Even in Thailand, a country many of you might mistakenly think is not very advanced technologically, there are telecommunications experts (who run large networks) working on very difficult fraud detection applications; they use neural networks and say the results are very good. However, there is not one CEP vendor, that I know of, who offers true CEP capability in the form of neural nets.
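As a rough illustration of the learning-based detection described above, here is a toy single-neuron classifier trained on labelled event features. To be clear, the features, labels, and data are all assumptions made up for this sketch; a real fraud system would use full neural networks trained on real call records:

```python
# Toy sketch of learned event classification: a single-neuron
# perceptron trained on hand-made, linearly separable "event
# feature" vectors. Features and labels are invented for
# illustration only.

def train_perceptron(data, epochs=50, lr=0.1):
    """Train weights and bias with the classic perceptron update rule."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, label in data:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = label - pred          # 0 if correct, +/-1 if wrong
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

def classify(w, b, x):
    """Return 1 (suspicious) or 0 (normal) for a feature vector."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# (normalized call duration, calls per hour) -> 1 = suspicious, 0 = normal
events = [((0.1, 0.2), 0), ((0.2, 0.1), 0),
          ((0.9, 0.8), 1), ((0.8, 0.9), 1)]
w, b = train_perceptron(events)
print(classify(w, b, (0.85, 0.9)))  # 1 (suspicious)
```

The point of the contrast is not that a perceptron is sufficient, but that the detector’s behavior is learned from data rather than hand-written as joins and rules, so it can be retrained as the patterns drift.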

Almost every major bank, telco, and similar enterprise has the same opinion, and the same problem: they need much more capability than streaming joins, selects, and rules to solve the complex event processing problems Dr. Luckham outlined in his book. The software vendors are attempting to define the CEP market to match their capabilities; unfortunately, those capabilities do not meet the requirements of the vast majority of end users who have CEP problems to solve.

If the current CEP platforms were truly solving complex event processing problems, annual sales would be orders of magnitude higher. Hence, the users have already voted. The problem is that the CEP community is not listening.


4 Responses to End Users Should Define the CEP Market.

  1. peter lin says:

    Using neural nets sounds fascinating and quite powerful. I wonder if the algorithms and techniques are similar to what researchers used in the grand challenge. I remember reading about learning machines when I was a young boy and thought, “how do you make a machine learn?”

    This might be daydreaming on my part, but I wonder if the techniques described in Stanford’s paper on Stanley are applicable to network intrusion detection? It seems like in both cases there’s a lot of noise and dynamic patterns the system has to recognize. For example, say I teach a car to drive in CA, where all the lanes are painted with white lines. What happens if I take the car to France? Would the system be able to adapt, or would it drive off the road? I remember reading some old papers about early experiments to teach a system to drive a car. Getting the car to steer by itself wasn’t so hard, but only on one type of road. Once the roads changed, the system wasn’t able to adapt fast enough.

    I imagine network intrusion detection must adapt in a similar way. If the attackers are using one approach this month, but switch to a new technique next month, can the system figure that out? Fascinating stuff.

  2. Tim Bass says:

    Dear Peter,

    Yes, almost all “real and practical” event processing problems must deal with noise, uncertainty, and change. Much like the real world, only a tiny subset of event processing problems are “noiseless” or “static”.

    For example, when I used to work for TIBCO, many of our field personnel were asked, “Can your CEP software learn?” The reason is that most people with operational experience in detection-oriented problems have “been down that road before.”

    I recently met with a senior VP of a bank, in the fraud and risk management area, who was very skeptical of CEP because his current “65 systems” do not work very well, if at all. Of course, they must have these “not good detection systems” to satisfy regulators, so there is a market, but the systems don’t work very well.

    Also, I met with a CTO of a top telecommunications company recently. He almost laughed (well, he did laugh, but politely) about rule-based approaches to detecting threats in his large network. His team has implemented a NN-based approach that processes events. He would never consider any of the current CEP vendors, because the underlying methods, algorithms, etc. of the existing CEP software have already been proven not to work, many times over.

    Creating a declarative programming model for engines that only support a form (or subset) of simple event processing does not solve complex problems. The literature is quite mature on the challenges in detection-oriented IT solutions – there is no need, nor value, for the CEP community to redefine complexity and complex systems.

    Since CEP has been positioned to “detect both opportunities and threats” in real-time, this is quite a statement to make; especially considering that most of the CEP vendors have little track record or experience in detection-oriented problems that involve complexity.

    Complexity is a term that existed long before the CEP software vendors jumped on the bandwagon. I’ll continue in a post when I have more free time. There are IT guys in my room!

    Yours faithfully, Tim

  3. PatternStorm says:

    Hi Tim,

    “…Just because the marketing people (not the market) has misdefined CEP and therefore the vendors are drifting from the technology described in Dr. Luckham’s original CEP work, including his CEP book, we should not change the context of CEP…”

    Couldn’t agree more! continuous queries != CEP



  4. Tim Bass says:

    Hi Claudi,

    Thanks for visiting!

    BTW, Opher and I seem to have “agreed to disagree” over at David’s forum.


    Yours faithfully, Tim
