The Infant, the Elephant and the Intelligent Event

June 27, 2008

Fellow blogger Opher Etzion replies to On Elephants and Analytics with On Unicorn, Professor and Infant. Opher kindly gives us another metaphor to consider, the Infant and the Professor, since we are both big fans of big gentle elephants, babies and our universities.

Opher and I agree that Infants are not Professors, and we also agree that CEP is in its Infancy and there is overhype by folks often implying CEP is a Professor.     So it seems we all have a huge elephant in the room with an Infant Professor hanging on the end of a wildly swinging Elephant’s trunk!

To keep the blogopoints interesting, I should point out that with all this agreement and Kumbaya campfire singing, there are a couple of things I do disagree with in Opher’s amusing counterpoint. 

First of all, Opher uses the well-known debate technique of falsely attributing an easily refuted discussion point to an opponent and then offering a slam-dunk counterpoint. He does this in this clever, but completely inaccurate, quote:

 “I [Opher] respectfully disagree with Tim … in his claim that what has been done until today is just hype and hence totally worthless…”

Folks reading my blog know that I have never said “what has been done until today is … totally worthless.” This is an unfortunate misquote. Shame on you, Opher!

What I said, as anyone can easily read in the blog, was that CEP is overhyped and that most of the self-described CEP software on the market today does not live up to the inflated claims we read and hear from CEP software vendors and the analysts and reporters they influence.

The second counterpoint that I find interesting is Opher’s consistent attempt to redress the dramatic lack of capability and analytics in current-generation self-described CEP software by repositioning CEP as “intelligent event processing” (IEP), as he continues in On Intelligent Event Processing.

Perhaps Opher will be successful in repositioning the vast majority of the original CEP problem space as IEP. This is an interesting slippery slope, in my opinion. The new positioning that Opher is offering is that when “event processing” has advanced analytics, it is not CEP anymore, it becomes IEP, because CEP is really “Simple Event Processing” (SEP): event processing with little to no analytical capability.

I don’t know about most of our readers, but all this positioning and repositioning to match the capabilities, or lack of capabilities, in the current portfolio of self-described CEP software vendors is fascinating.

Here is the next logical question:

What is the difference between a “Complex Event” and an “Intelligent Event”?

This could get quite interesting, so stay tuned!


The Predictive Battlespace

June 11, 2008

Friend and colleague Don Adams, CTO, World Wide Public Sector, TIBCO Software, explains how CEP can be used to sense, adapt and respond to complex situations in The “Predictive” Battlespace: Leveraging the Power of Event-Driven Architecture in Defense.


Clouding and Confusing the CEP Community

April 20, 2008

Ironically, our favorite software vendors have decided, in a nutshell, to redefine Dr. David Luckham’s definition of “event cloud” to match the lack of capabilities in their products.

This is really funny, if you think about it. 

The definition of “event cloud” was coordinated over a long period (more than two years) with the leading vendors in the event processing community and is based on the same concepts in David’s book, The Power of Events.

But, since the stream-processing-oriented vendors do not yet have the analytical capability to discover unknown causal relationships in contextually complex data sets, they have chosen to reduce and redefine the term “event cloud” to match their products’ lack of capability. Why not simply admit they can only process a subdomain of the CEP space as defined by both Dr. Luckham and the CEP community-at-large?

What’s the big deal?   Stream processing is a perfectly respectable profession!

David, along with the “event processing community,” defined the term “event cloud” as follows:

Event cloud: a partially ordered set of events (poset), either bounded or unbounded, where the partial orderings are imposed by the causal, timing and other relationships between the events.

Notes: Typically an event cloud is created by the events produced by one or more distributed systems. An event cloud may contain many event types, event streams and event channels. The difference between a cloud and a stream is that there is no event relationship that totally orders the events in a cloud. A stream is a cloud, but the converse is not necessarily true.

Note: CEP usually refers to event processing that assumes an event cloud as input, and thereby can make no assumptions about the arrival order of events.
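
To make the poset distinction concrete, here is a minimal sketch in Python (the event names and the explicit causes field are invented for illustration; no vendor API is implied). In a stream, every pair of events is comparable; in a cloud, some pairs simply are not:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Event:
        id: str
        causes: frozenset = frozenset()  # ids of events that causally precede this one

    def precedes(a, b, by_id):
        # True if event a is an ancestor of event b in the causal partial order.
        stack = list(b.causes)
        while stack:
            cid = stack.pop()
            if cid == a.id:
                return True
            stack.extend(by_id[cid].causes)
        return False

    def is_stream(events, by_id):
        # A stream is a cloud whose events are totally ordered: every pair
        # must be comparable under the causal/timing relationships.
        return all(precedes(a, b, by_id) or precedes(b, a, by_id)
                   for i, a in enumerate(events) for b in events[i + 1:])

    # e1 causes both e2 and e3, but e2 and e3 are unrelated to each other,
    # so the set {e1, e2, e3} is an event cloud that is NOT a stream.
    e1 = Event("e1")
    e2 = Event("e2", frozenset({"e1"}))
    e3 = Event("e3", frozenset({"e1"}))
    by_id = {e.id: e for e in (e1, e2, e3)}
    print(is_stream([e1, e2, e3], by_id))  # False: a cloud, not a stream
    print(is_stream([e1, e2], by_id))      # True: totally ordered, hence a stream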

Oddly enough, quite a few event processing vendors seem to have succeeded at confusing their customers, as is evident in this post, Abstracting the CEP Engine, where a customer has seemingly been convinced by the (disinformational) marketing pitch that “there are no clouds of events, only ordered streams.”

I think the problem is that folks are not comfortable with uncertainty and hidden causal relationships, so they give the standard “let’s run a calculation over a stream” example and state “that is all there is…”, confusing the customers who know there is more to solving complex event processing problems.

So, let’s make this simple (we hope), referencing the invited keynote at DEBS 2007, Mythbusters: Event Stream Processing Versus Complex Event Processing.

In a nutshell…. (these examples are in the PDF above, BTW)

The set of market data from Citigroup (C) is an example of multiple “event streams.”

The set of all events that influence the NASDAQ is an “event cloud”.

Why?

Because a stream of market data is a linearly ordered set of data, related by the timestamp of each transaction and linked (relatively speaking) in context because it is Citigroup market data. So, event processing software can process a stream of market data, compute a VWAP if it chooses, and estimate a good time to enter and exit the market. This is “good”.
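
As an aside, here is a minimal sketch of such a sliding-window VWAP, assuming a made-up tick format of (timestamp, price, volume); a real engine would express this as a declarative continuous query rather than hand-rolled Python:

    from collections import deque

    def make_sliding_vwap(window_seconds):
        # Continuous VWAP over a sliding time window of (timestamp, price, volume)
        # ticks; the sums are maintained incrementally, so each tick is O(1) amortized.
        window, pv_sum, vol_sum = deque(), 0.0, 0.0

        def on_tick(ts, price, volume):
            nonlocal pv_sum, vol_sum
            window.append((ts, price, volume))
            pv_sum += price * volume
            vol_sum += volume
            while window and window[0][0] <= ts - window_seconds:
                old_ts, old_price, old_vol = window.popleft()  # slide the window
                pv_sum -= old_price * old_vol
                vol_sum -= old_vol
            return pv_sum / vol_sum if vol_sum else None

        return on_tick

    vwap = make_sliding_vwap(window_seconds=60)
    for ts, price, vol in [(0, 27.10, 500), (20, 27.15, 300), (90, 27.05, 200)]:
        print(ts, round(vwap(ts, price, vol), 4))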

However, the same software, at this point in time, cannot process the many market data feeds in the NASDAQ and provide a reasonable estimate of why the market moved in a certain direction, based on a statistical analysis of a large set of event data where the cause-and-effect features (in this case, relationships) are difficult to extract. (BTW, this is generally called “feature extraction” in the scientific community.)

Why?

Because the current state-of-the-art in stream-processing-oriented event processing software does not perform the backward chaining required to infer causality from large sets of data where causality is unknown, undiscovered and uncertain.

Forward chaining, continuous queries and time series analytics across sliding time windows of streaming data cover only a subset of the overall CEP domain as defined by Dr. Luckham et al.
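
To see how wide that gap is, here is a deliberately naive Python sketch of only the very first step of “discovery” over an unordered event log (the event types are invented). It merely scores candidate relationships by co-occurrence; genuine causal inference over an event cloud requires statistical machinery far beyond this, and that is exactly what is missing from the current engines:

    from collections import Counter
    from itertools import combinations

    def candidate_links(events, max_lag):
        # Deliberately naive: count how often type A is followed by type B
        # within max_lag seconds, relative to how often A occurs at all.
        # A high score flags a CANDIDATE relationship for deeper analysis;
        # it is nowhere near an inferred causal relationship.
        singles = Counter(etype for _, etype in events)
        pairs = Counter()
        for (ts_a, a), (ts_b, b) in combinations(sorted(events), 2):
            if a != b and ts_b - ts_a <= max_lag:
                pairs[(a, b)] += 1
        return {(a, b): n / singles[a] for (a, b), n in pairs.items()}

    log = [(1, "FeedSpike"), (2, "SellBurst"), (10, "FeedSpike"),
           (11, "SellBurst"), (30, "Quote")]
    print(candidate_links(log, max_lag=3))
    # {('FeedSpike', 'SellBurst'): 1.0} -- a hint to investigate, not a cause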

It is really that simple.   Why cloud and confuse the community?

We like forward chaining using continuous queries and time series analysis across sliding time windows of streaming data. 

  • There is nothing dishonorable about forward chaining using continuous queries and time series analysis across sliding time windows of streaming data.   
  • There is nothing wrong with forward chaining using continuous queries and time series analysis across sliding time windows of streaming data. 
  • There is nothing embarrassing about forward chaining using continuous queries and time series analysis across sliding time windows of streaming data. 

Forward chaining using continuous queries and time series analysis across sliding time windows of streaming data is a subset of the CEP space, just like the definition above, repeated below:

The difference between a cloud and a stream is that there is no event relationship that totally orders the events in a cloud. A stream is a cloud, but the converse is not necessarily true.

It is really simple.   Why cloud a concept so simple and so accurate?


Scheduling Agents with Rules Engines

April 5, 2008

Paul Vincent of TIBCO talks about agents in his post, CEP and Agents…

At the core, TIBCO’s BusinessEvents is a RETE-based rules engine, and rules engines are well suited for scheduling problems. This makes perfect sense, since many of TIBCO’s customers deploy BusinessEvents in scheduling-oriented, not detection-oriented, solutions.

It bears pointing out, however, that scheduling is only one component of a CEP architecture.

Normally, the scheduling component of a distributed event processing architecture manages the intelligent scheduling of data sharing among distributed agents that are running a variety of analytics.

Simply stated, not all agents are rules engines; however, rules engines are often used to schedule the cooperation between analytical agents in a distributed agent-based architecture.
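
For the curious, here is a toy forward-chaining sketch in plain Python (this is not TIBCO BusinessEvents code) of a rule that schedules an idle analytical agent onto a feed:

    # Facts are simple tuples; rules fire on patterns in the fact base and
    # dispatch work to analytical agents. A real RETE engine matches rules
    # incrementally instead of rescanning all facts on every cycle, as this
    # naive loop does.
    facts = {("feed", "NASDAQ", "online"),
             ("agent", "vwap", "idle"),
             ("agent", "classifier", "idle")}

    def assign_vwap_agent(facts):
        # IF the NASDAQ feed is online AND the vwap agent is idle
        # THEN mark the agent busy and schedule it onto the feed.
        if ("feed", "NASDAQ", "online") in facts and ("agent", "vwap", "idle") in facts:
            adds = {("agent", "vwap", "busy"), ("task", "vwap", "NASDAQ")}
            removes = {("agent", "vwap", "idle")}
            return adds, removes
        return None

    rules = [assign_vwap_agent]
    changed = True
    while changed:  # run the rules to a fixpoint
        changed = False
        for rule in rules:
            fired = rule(facts)
            if fired:
                adds, removes = fired
                facts = (facts - removes) | adds
                changed = True
    print(sorted(facts))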


Military Event Processing Requirements and COTS CEP Software

March 8, 2008

In Q&A from BCS SPA meeting on CEP,  friend and colleague Paul Vincent says:

 “AFAIK there are no current military systems (as opposed to government intelligence systems) using Commercial Off The Shelf CEP systems, although I recall one commercial product being developed with US military money (your tax $ at work, etc etc).”

Actually, Paul’s statement is slightly misleading. Companies like StreamBase and AgentLogic have their roots in supporting the military. In addition, IBM has a number of event-processing-related solutions in the military. (There are also others, we suspect.)

It is true, however, that current-generation COTS CEP engines do not have the advanced event processing capabilities required for most CEP applications in the military; but as CEP engines advance, this should change.


The 2007 CEP Blog Awards

January 20, 2008

Here are the CEP Blog Awards for 2007, based on the three categories outlined in An Overture to the 2007 CEP Blog Awards.

The CEP Blog Award for Rule-Based Event Processing

Winner: TIBCO Software

TIBCO has a very robust and sophisticated process-oriented event processing product, TIBCO BusinessEvents, with a proven event processing customer base. TIBCO has a rich complementary software suite for business process and enterprise integration, management, visualization, personalization and optimization. TIBCO has been in business for many years and has a global reach for both sales and professional services.

The CEP Blog Award for Event Stream Processing

Winner: Progress Apama

Similar to TIBCO in middleware status, Progress Apama has a strong event stream processing product with a proven customer base. Progress has complementary software suites for business process and enterprise integration. Progress also has been in business for many years and has a global reach for both sales and professional services.

The CEP Blog Award for Advanced Event Processing

Winner: Reserved.

None of the software companies currently marketing themselves as event processing platforms meet our criteria for the Advanced Event Processing award in 2007.


An Overture to the 2007 CEP Blog Awards

January 9, 2008

Before announcing the winners of the 2007 CEP Blog Awards, I thought it would be helpful to introduce the award categories to our readers.

I have given considerable thought to how to structure The CEP Blog Awards. This was not an easy task, as you might imagine, given the confusion in the event processing marketspace. So here goes.

For the 2007 CEP Blog Awards I have created three event processing categories. Here are the categories and a brief description of each one:

The CEP Blog Award for Rule-Based Event Processing

Preface: I was also inclined to call this category “process-based event processing” or “control-based event processing” and might actually do so in the future. As always, your comments and feedback are important and appreciated.

Rule-based (or process-based) event processing is a major subcategory of event processing. Rule-based approaches to event processing are very useful for stateful event-driven process control, track and trace, dynamic resource management and basic pattern detection (see slide 12 of this presentation). Rule-based approaches are optimal for a wide range of production-related event processing systems.

However, just like any system, there are engineering trade-offs in this approach. Rule-based systems tend not to scale well when the number of rules (and facts) is large. Rule-based approaches can also be difficult to manage in a distributed, multi-designer environment. Moreover, rule-based approaches are suboptimal for self-learning and tend not to process uncertainty very well. Nevertheless, rule-based event processing is a very important CEP category.

The CEP Blog Award for Event Stream Processing

Stream-centric approaches to event processing are also a very important overall category of event processing. Unlike a stateful, process-driven, rule-based approach, event stream processing is optimized for high-performance continuous queries over sliding time windows. High-performance, low-latency event processing is one of the main design goals for many stream processing engines.

Continuous queries over event streams are generally designed to be executed over milliseconds, seconds and perhaps slightly longer time intervals. Process-driven event processing, on the other hand, can manage processes, resources, states and patterns over long time intervals, for example, hours and days, not just milliseconds and seconds.

Therefore, event stream processing tends to be optimized for a different set of problems than process-based (which I am calling rule-based this year) event processing. Similar to rule- or process-based approaches, most current stream processing engines do not manage or deal with probability, likelihood and uncertainty very well (if at all).
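
To illustrate the long end of that time-horizon contrast, here is a minimal Python sketch of a process-style tracker (the shipment events are invented for illustration) that holds state for days and escalates when an expected event never arrives; compare it with the seconds-wide sliding-window VWAP sketch earlier:

    # A process-style tracker holds state over hours or days -- the opposite
    # end of the time horizon from a seconds-wide sliding window.
    ESCALATE_AFTER = 48 * 3600  # escalate if a shipment is silent for two days

    class ShipmentTracker:
        def __init__(self):
            self.last_seen = {}  # shipment id -> timestamp of its last event

        def on_event(self, ts, shipment_id, status):
            if status == "delivered":
                self.last_seen.pop(shipment_id, None)  # process complete
            else:
                self.last_seen[shipment_id] = ts

        def overdue(self, now):
            return [sid for sid, ts in self.last_seen.items()
                    if now - ts > ESCALATE_AFTER]

    tracker = ShipmentTracker()
    tracker.on_event(0, "S-100", "picked_up")
    tracker.on_event(3600, "S-200", "picked_up")
    tracker.on_event(7200, "S-200", "delivered")
    print(tracker.overdue(now=3 * 24 * 3600))  # ['S-100'] after three days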

The CEP Blog Award for Advanced Event Processing

For lack of a better term, I call this category advanced event processing. Advanced event processing will more than likely have a rule-based and/or a stream-based event processing component. However, to be categorized as advanced event processing software, the software platform must also be able to perform more advanced event processing that can deal with probability, fuzzy logic and/or uncertainty. Event processing software in this category should also have the capability to automatically learn, or be trained, similar to artificial neural networks (ANNs).

Some of my good colleagues might prefer to call this category AI-capable event processing (or intelligent event processing), but I prefer to call this award category advanced event processing for the 2007 awards. If you like the term intelligent event processing, let’s talk about this in 2008!

Ideally, advanced event processing software should have plug-in modules that permit the event processing architect, or systems programmer, to select and configure one or more different analytical methods at design time. The results from one method should be available to other methods; for example, the output of a stream processing module might be the input to a neural network (NN) or Bayesian belief network (BBN) module. In another example pipeline operation, the output of a Bayesian classifier could be the input to a process- or rule-based event processing module within the same run-time environment.
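
Here is a minimal Python sketch of that plug-in pipeline idea (the module internals are simple stand-ins, not real analytics): each module exposes the same process(event) interface, so the output of a stream statistics module feeds a classifier stub, whose output feeds a rule module:

    class StreamStats:
        def __init__(self):
            self.n, self.mean = 0, 0.0
        def process(self, event):
            self.n += 1
            self.mean += (event["value"] - self.mean) / self.n  # running mean
            event["deviation"] = event["value"] - self.mean
            return event

    class BayesianClassifierStub:
        def process(self, event):
            # Stand-in for a trained Bayesian classifier's posterior probability.
            event["anomaly_prob"] = 0.9 if abs(event["deviation"]) > 10 else 0.1
            return event

    class RuleModule:
        def process(self, event):
            if event["anomaly_prob"] > 0.5:  # IF likely anomaly THEN alert
                event["action"] = "alert"
            return event

    pipeline = [StreamStats(), BayesianClassifierStub(), RuleModule()]
    for value in (100, 101, 250):
        event = {"value": value}
        for module in pipeline:
            event = module.process(event)
        print(event.get("action", "ok"), event)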

For all three categories for 2007, there should be a graphical user interface for design-time construction and modeling. There should also be a robust run-time environment and most, if not all, of the other “goodies” that we expect from event processing platforms.

Most importantly, there should be reference customers for the software and the company. The CEP Blog Awards will be only given to companies with a proven and public customer base.

In my next post on this topic, I’ll name the Awardees for 2007. Thank you for standing by. If you have any questions or comments, please contact me directly.