CEP Blogosphere – Mid May Roundup

May 15, 2007

Posted by: Tim Bass

May has been an interesting month around the CEP blogosphere. Mark Palmer and the Apama team continue to lead the field with informative, interesting, and provocative posts. I would like to personally commend Mark for his leadership in the community-at-large.

TIBCO’s CEP blog has been active thanks to posts by Alan Lundberg, Paul Vincent and (formerly) me. Brenda Michelson’s post, Event Processing Conversation Shifts from Research to Practitioners, did an excellent job listing the major event processing blogs in the ole’ blogosphere. David Luckham and Roy Schulte published the draft event processing glossary as part of ongoing work in the EPST.

Khanderao, an Architect at Oracle Corp, entered the field with an inaccurate blog post related to CEP. In this post, Khanderao falls into the same old “database-centric trap” of misrepresenting CEP as just another high-speed, continuous-query event (data) stream processing technology. This mistaken view appears to be an ongoing “database versus distributed messaging” memetic remnant.

Marco of RuleCore has been unusually quiet lately after an excellent post back in March, My Two Types of Event Processing, where he did a nice job of describing his views on two event processing styles: time-series filtering, and track and trace. At the end of April, Mark Tsimelzon of Coral8 blogged a bit of memetic frustration over similar technical discussions of ESP versus CEP.

The normally lively CEP-Interest discussion group has been quiet as activity shifts from email forums to the CEP blogosphere. Perhaps we will see more event processing blogging activity from the Dagstuhl seminar on Event Processing (May 6–11, 2007), like the post just in from Marco (an update to an earlier post) or the wiki by Claudi?

Copyright © 2007 by Tim Bass, All Rights Reserved.


CEP History – CEP Engines Continue to Evolve

May 15, 2007

Posted by: Tim Bass


From a commercialization perspective, Mark Palmer and John Bates of Progress Apama have done an excellent job of beginning to document the history of CEP.

One of the biggest problems prior to the commercialization of event processing engines was that each event processing application was yet another hand-made, custom implementation.

For example, in the military, event processing history goes back many years before the (commercial) history that Mark and John kindly offer. There are a number of books on the topic, particularly in the sensor fusion area, where processing distributed sensor data is the same as processing events; the events just happen to be sensor data. The implementations were (and still are) expensive, custom-built event processing applications.

Event processing was also a big part of the heritage of large-scale modelling and simulation, and much of the genesis of David Luckham’s DARPA-funded Rapide and CEP work follows that school of thought. Similarly, prior to the commercialization of CEP and ESP engines, these applications were (and still are) very expensive custom implementations. In many cases, today’s CEP engines are not yet advanced enough to be “white boxes” for extremely challenging command-and-control (C2) event processing applications.

One of the reasons is that all of these “extremely complex” event processing applications face a very large challenge: for example, task scheduling between distributed, heterogeneous event sources. Most commercial CEP and ESP engines are still relatively immature in their ability to manage the complexity of distributed computing scheduling tasks.

In addition, when event processing requires specialization, cooperation, distributed object caching, and distribution between distributed CEP engines (event processing agents), another layer of scheduling complexity arises.
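To make the scheduling challenge a bit more concrete, here is a toy, single-process sketch in Python — my own illustration, not any vendor’s engine. It merges events from heterogeneous sources through a priority queue; the source names and priority values are invented. Even this trivial version hints at the decisions (priorities, tie-breaking, ordering) that become far harder once the event sources and the processing agents are distributed across machines with clock skew and failures:

```python
import heapq
import time

class Event:
    """A minimal event carrying its source, urgency, and payload."""
    def __init__(self, source, priority, payload):
        self.source = source        # e.g., "sensor-net", "order-feed" (hypothetical)
        self.priority = priority    # lower number = more urgent
        self.timestamp = time.time()
        self.payload = payload

class Scheduler:
    """Single-process priority scheduler over heterogeneous event sources."""
    def __init__(self):
        self._queue = []
        self._counter = 0           # tie-breaker keeps heap ordering stable

    def submit(self, event):
        heapq.heappush(self._queue, (event.priority, self._counter, event))
        self._counter += 1

    def next_event(self):
        if not self._queue:
            return None
        return heapq.heappop(self._queue)[2]

scheduler = Scheduler()
scheduler.submit(Event("order-feed", priority=5, payload={"qty": 100}))
scheduler.submit(Event("sensor-net", priority=1, payload={"alarm": True}))
print(scheduler.next_event().source)   # "sensor-net" is handled first
```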

So, while we would all agree that we have come a long way in the commercialization of CEP and ESP engines, we would also agree that we have a long way to go as well.

Copyright © 2007 by Tim Bass, All Rights Reserved.


What is Complex Event Processing? (Part 6)

May 15, 2007

Posted by: Tim Bass

In What is Complex Event Processing? (Part 5), we discussed situation refinement, the functional component of event processing that describes refining multiple event objects in order to estimate and identify business situations and scenarios in real-time. Today, in Part 6 of What is Complex Event Processing?, we discuss impact assessment, where detected business situations are compared, correlated, and/or analyzed in “what if” scenarios to determine and predict business consequences.

Event Processing Reference Architecture

After event detection and situation refinement, businesses are very concerned with ascertaining or predicting outcomes and financial gains or losses if a detected situational threat or opportunity materializes. Impact assessment is the functional component of event processing that is focused on the estimation and prediction of the priority, utility or cost of an estimated business situation, complex event or scenario.

At this stage of the CEP reference model (above), we estimate the impact of an assessed situation, which includes likelihood and/or cost/utility measures associated with potential outcomes. From this inference, loss projections and liabilities (or gains) may be estimated. In addition, resource allocation and processing priorities may be determined.
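As a minimal sketch of that idea — my own illustration with hypothetical situations and figures, not part of the reference model — impact assessment can be treated as likelihood-weighted cost/utility scoring, with assessed situations ranked by expected impact to set processing priority:

```python
def expected_impact(outcomes):
    """Sum of likelihood-weighted costs (negative values are gains)."""
    return sum(likelihood * cost for likelihood, cost in outcomes)

# Hypothetical assessed situations: [(likelihood, cost), ...]
situations = {
    "suspected-fraud":   [(0.30, 250_000.0), (0.70, 0.0)],
    "component-failure": [(0.05, 900_000.0), (0.95, 0.0)],
    "algo-trade":        [(0.60, -40_000.0), (0.40, 15_000.0)],  # gain vs. loss
}

# Rank detected situations by expected impact to drive prioritization.
for name, outcomes in sorted(situations.items(),
                             key=lambda kv: expected_impact(kv[1]),
                             reverse=True):
    print(f"{name}: expected impact {expected_impact(outcomes):,.0f}")
```

Real impact models would be far richer — utility curves, resource constraints, time horizons — but even this toy version shows how likelihood and cost/utility combine into a single number that can drive priorities.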

Opportunities and threats in business generally need to be predicted based upon an estimate of the current situation, known plans, and predicted reactions. Examples of real-time predictive business use cases are:

– determining the expected consequences of a fraudster’s actions in an ecommerce scenario, given the current estimated threat to the business;

– estimating the consequences of a failure in a distributed computing application and the effects on other systems that interact with the failed component;

– estimating the potential profit if an algorithmic trade is executed on a tracked equity or basket of equities;

– predicting how delays in shipping affect the supply chain, including consumer choices and behavior;

– predicting network congestion and outages based on specific patterns of anomalous network behavior in real-time;

– assessing risk and losses in a potential aircraft collision based on information about the planes, their location, and their cargo or passengers;

– predicting the impact of a viral epidemic on different geographic areas and populations;

– predicting cost savings based on optimizing network resources in a transportation or supply chain network; or,

– predicting potential losses if an identified class of missile reaches its projected target.

Impact assessment generally requires real-time correlation with historical data, which resides in databases. This is represented by the Database Management component of the event processing reference architecture.
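To illustrate that correlation step, here is a hedged Python sketch using the standard library’s sqlite3 module. The table, column names, event fields, and the “twice the historical norm” threshold are all invented for illustration; they are not part of the reference architecture:

```python
import sqlite3

# Hypothetical historical store: average shipping delays per route.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE shipment_history (route TEXT, avg_delay_hours REAL)")
conn.execute("INSERT INTO shipment_history VALUES ('SIN-LAX', 6.5), ('LHR-JFK', 2.0)")

def assess_delay_impact(event):
    """Correlate a live shipping-delay event with historical averages."""
    row = conn.execute(
        "SELECT avg_delay_hours FROM shipment_history WHERE route = ?",
        (event["route"],),
    ).fetchone()
    historical = row[0] if row else 0.0
    # A delay well beyond the historical norm suggests downstream impact.
    return event["delay_hours"] > 2 * historical

print(assess_delay_impact({"route": "SIN-LAX", "delay_hours": 20.0}))  # True
```

In a production engine, the historical store would be a full DBMS and the enrichment query would be executed (or cached) at event rates, but the shape of the correlation is the same.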

In my next post, What Is Complex Event Processing? (Part 7), we will discuss another important area of CEP, process refinement: actions taken, parameters adjusted, and resources allocated (for example) based on detected (and/or predicted) business situations and scenarios.

Copyright © 2007 by Tim Bass, All Rights Reserved.