BEA Enters the CEP Market with Weblogic Event Server

May 29, 2007

Posted By Tim Bass

At the last meeting of the Event Processing Technical Society I enjoyed excellent discussions about event processing with engineers from BEA, who were very interested in CEP. Now, it comes as no surprise that BEA has entered the CEP space by unveiling the BEA Weblogic Event Server:

“Churchward said WebLogic Event Server fits perfectly because it uses Java, which doesn’t require compilations every time the customer wants to change a rule. The software handles 50,000 complex events per second and applies 10,000 rules against those events.

This lets customers build their applications on the platform without integrating a CEP engine with a separate general purpose platform; in this regard, the server saves businesses the acquisition costs of having to buy a separate CEP engine.

WebLogic Event Server also supports simple Java (POJO) programming and the Spring Framework, as well as an Event Processing Language (EPL) that augments and extends SQL for event processing.”
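As an illustration only (this is plain Python, not BEA's EPL syntax), the kind of query an event processing language adds to SQL is an aggregate computed over a sliding time window of a stream, rather than over a stored table:

```python
# Hedged sketch: a time-windowed aggregate over a stream of ticks.
# All field names ("ts", "price") are invented for illustration.

def avg_over_window(events, now, window_seconds):
    """Average the 'price' of events whose timestamp falls within the window."""
    in_window = [e["price"] for e in events if now - e["ts"] <= window_seconds]
    return sum(in_window) / len(in_window) if in_window else None

ticks = [
    {"ts": 100, "price": 10.0},
    {"ts": 128, "price": 12.0},
    {"ts": 130, "price": 14.0},
]
print(avg_over_window(ticks, now=130, window_seconds=30))  # → 12.0
```

A SQL engine would answer this over rows at rest; an event server evaluates it continuously as each new event arrives.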

Soon, we will know more details, including what flavor of rules engine BEA is using. Here is what BEA says on their web site:

BEA WebLogic Event Server is the first and only Java container for high-performance event-driven applications. These applications have one or more of the following characteristics:

  • Some or all application inputs are “events” – meaningful state changes that are unpredictable and that potentially trigger other activities.
  • Large volumes of streaming events.
  • Need to discern complex event patterns or correlations across different event sources.
  • Need to discern complex event patterns or correlations across time.
  • Need to respond in real time.
  • Highly predictable response times.
  • Must be developed in a standard Java environment to decrease total cost of adoption/ownership and increase portability and modularity.

It has not been revealed (yet) if BEA is using an evolution of the Rete algorithm. Maybe BEA will start contributing to the lively engine-related discussions in this area, now that they have unveiled their entry into the CEP space!

BTW: From this link, we can see that BEA is using SL for a part of their CEP dashboarding strategy.


IBM CEP Patent Application 20070118545 – Visible May 24, 2007

May 25, 2007

Posted by Tim Bass

IBM filed (corrected: I earlier misstated the application's status) a CEP patent application, 20070118545, on November 21, 2005, which became publicly viewable on May 24, 2007, titled Dynamic business process integration using complex event processing. The abstract states:

An enterprise application integration broker for managing a number of applications. The enterprise application integration broker includes a complex event processing engine. The complex event processing engine is adapted to monitor and analyze a first set of events in at least one of the plurality of applications. In addition, the enterprise application integration broker includes an integration engine. The integration engine is connected to the complex event processing engine and is connected to each of the applications. The integration engine is adapted to cause at least one application to react to a first set of events occurring in one or more of the plurality of applications. The integration engine is further adapted to cause at least one application to react to a second set of events generated by the complex event processing engine. The second set of events is correlated with the first set of events.

Not being an expert on US patent law, I don’t understand how IBM could be granted a patent for an EAI technology that was documented openly in the academic literature a number of years ago, and has since been implemented by numerous commercial companies.

How does this new patent application affect the emerging CEP market?

PS: A special word of thanks to Brian Connell of WestGlobal for pointing out the correct status of this CEP patent application. Thanks, Brian!


What is Complex Event Processing? (Part 7)

May 19, 2007

Posted by Tim Bass

In What is Complex Event Processing? (Part 6), we discussed impact assessment in event processing applications. Today, we introduce process refinement: the feedback loop, resource management and workflow components of event processing architectures. Business process management (BPM) is also a part of process refinement, as depicted in slide 13 of my March 14, 2007 IDC presentation in Lisbon, Portugal.

This overarching event-correlation-assessment-decision-action concept in CEP is also discussed very nicely by John Bates and Giles Nelson, Progress Apama, in their on-demand webinar, Using CEP and BAM for Fraud & Compliance Monitoring. John and Giles do an excellent job covering a number of fraud detection, risk and compliance use cases that illustrate how raw events, like RFID tags in poker chips, are correlated in real-time to effect actionable processes. In addition, Paul Vincent, TIBCO Software, does a nice job illustrating another application of direct feedback based on real-time analytics in his blog post, CEP and the alphabet soup (Part 2): BI.

Event Processing Reference Architecture

In a nutshell, all of the components of the CEP functional reference architecture we discussed earlier (events, event pre-processing, event refinement, situation refinement and impact assessment) add value only if they lead to high-confidence, resource-efficient actions. This is one of the motivations behind David Luckham’s recently posted white paper, SOA, EDA, BPM and CEP are all Complementary.

Process refinement is the functional component of event processing that takes action based on detected situations and predicted impacts.

Examples of process refinement in real-time are:

* executing a trade in equities based on a series of events that lead to a high yield opportunity;

* alerting security by initiating incident workflow in an on-line e-commerce application when the likelihood of fraudulent behaviour and loss potential is high;

* notifying downstream suppliers and customers when an actionable exceptional condition is detected in the supply chain;

* adding a new firewall rule when high-confidence anomalous behaviour is detected on the network;

* notifying airlines, the FAA and the media as early as possible when an air disaster appears imminent;

* automatically moving a camera in a casino to game tables where suspicious dealings have been detected while notifying security; or,

* turning on sensors ahead of the projected path, while turning off sensors behind the historical path, of a long-range missile in flight.
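As a rough sketch of the process-refinement idea behind these examples (all situation names, thresholds and actions below are invented for illustration, not taken from any product):

```python
# Hypothetical process-refinement step: detected situations carry a
# confidence score, and only high-confidence detections trigger an action.

ACTION_THRESHOLD = 0.9  # assumed cutoff for "actionable"

def refine_process(situation):
    """Dispatch an action for a detected situation, if confidence is high enough."""
    actions = {
        "fraud_suspected": lambda s: f"open incident workflow for {s['entity']}",
        "network_anomaly": lambda s: f"add firewall rule blocking {s['entity']}",
        "supply_chain_exception": lambda s: f"notify suppliers about {s['entity']}",
    }
    if situation["confidence"] < ACTION_THRESHOLD:
        return None  # not actionable yet; keep monitoring
    handler = actions.get(situation["kind"])
    return handler(situation) if handler else None

print(refine_process(
    {"kind": "network_anomaly", "entity": "10.0.0.5", "confidence": 0.95}))
# → add firewall rule blocking 10.0.0.5
```

The point is the shape of the loop: situation in, action (or deliberate inaction) out, with the confidence estimate deciding between the two.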

Alan Lundberg and I referred to this as event-decision architecture when we collaborated on my keynote, Processing Patterns for Predictive Business, at the First Workshop on Event Processing. Other folks on the net, Brenda Michelson for example, refer to the process we are describing as a business-decision architecture.

What is important to note is that the overall goal of processing events is to take raw events as input and process the events to detect actionable situations with high confidence; and then affect the right process, with the right action, at the right time, as James Taylor correctly points out in his post on business decisioning.

In my next post in this series, What Is Complex Event Processing? Part 8, we will review another important aspect of event processing, visualization and the user interfaces to the components of the CEP reference architecture.

Copyright © 2007 by Tim Bass, All Rights Reserved.


Event Streams and Event Clouds Revisited

May 17, 2007

Posted by: Tim Bass

Earlier on Yahoo! CEP-Interest we discussed event streams and event clouds in the context of POSETs. This discussion was in two parts: Part I, Clouds (Partially Ordered Sets) – Streams (Linearly Ordered Sets), and Part II, Clouds (Partially Ordered Sets) – Streams (Linearly Ordered Sets). I encourage you to review the posts if you are interested in understanding the underlying theory behind event streams and event clouds.

In addition, the draft event processing glossary I mentioned in an earlier post has definitions of event clouds and event streams, as follows:

Event stream: a linearly ordered sequence of events.

Notes: Usually, streams are ordered by time, e.g., arrival time. An event stream may be bounded by a certain time interval or other criteria (content, space, source), or be open ended and unbounded. A stream may contain events of many different types.

Event cloud: a partially ordered set of events (poset), either bounded or unbounded, where the partial orderings are imposed by the causal, timing and other relationships between the events.

Notes: Typically an event cloud is created by the events produced by one or more distributed systems. An event cloud may contain many event types, event streams and event channels. The difference between a cloud and a stream is that there is no event relationship that totally orders the events in a cloud. A stream is a cloud, but the converse is not necessarily true.

Note: CEP usually refers to event processing that assumes an event cloud as input, and thereby can make no assumptions about the arrival order of events.
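The stream/cloud distinction can be sketched in a few lines of Python (a toy model of my own, not part of the glossary): a stream's timestamps make every pair of events comparable, while a cloud's causal relation may leave some pairs incomparable.

```python
# Toy model: a stream is totally ordered (here, by timestamp); a cloud only
# has a partial order (here, explicit causal pairs), so some events may be
# incomparable. Event names and pairs below are invented for illustration.
from itertools import combinations

stream = [("e1", 1), ("e2", 2), ("e3", 3)]  # (event, time): all pairs comparable

cloud_events = {"a", "b", "c"}
causes = {("a", "b")}  # "a happened before b"; c is unrelated to either

def precedes(x, y, relation):
    # transitive closure omitted for brevity; pairs are given directly
    return (x, y) in relation

def totally_ordered(events, relation):
    """True iff every pair of distinct events is comparable under the relation."""
    return all(precedes(x, y, relation) or precedes(y, x, relation)
               for x, y in combinations(events, 2))

stream_relation = {(x[0], y[0]) for x in stream for y in stream if x[1] < y[1]}
print(totally_ordered({e for e, _ in stream}, stream_relation))  # stream: True
print(totally_ordered(cloud_events, causes))                     # cloud: False
```

This matches the glossary's point that a stream is a cloud, but a cloud with incomparable events (like a and c above) is not a stream.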

However, there is a minor detail missing from the above definition of an event cloud which might lead to confusion, and that missing detail relates to the note about “the arrival order of events”. From this note, the reader could mistakenly conclude that a set of events that was a TOSET, ordered by time when the events were created, for example, becomes a POSET if the events arrive at an event processing agent out of order.

For example, let us take the set of integers 1 through 10. This set is a TOSET {1, 2, 3, 4, 5, 6, 7, 8, 9, 10}. (This set also happens to be a POSET.) Let us place these integers in a box, shake them up a bit, and then take them out of the box one-by-one, blindfolded; the result is {5, 4, 1, 2, 8, 3, 10, 7, 6, 9}.
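For readers who want to experiment with the hint, the shaken-box example can be expressed in a few lines of Python; the code below only shows what the shuffle does and does not change:

```python
# The shaken-box example as code. Shuffling changes only the arrival order;
# the set itself, and the <= relation defined on its elements, are untouched.

original = set(range(1, 11))                 # {1, ..., 10}
arrivals = [5, 4, 1, 2, 8, 3, 10, 7, 6, 9]  # the order drawn from the box

print(set(arrivals) == original)               # True: same set of integers
print(sorted(arrivals) == list(range(1, 11)))  # True: <= still orders them
```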

Is this set of integers still a POSET? (hint)

Is the set still a TOSET?

The answer to this basic question addresses, fundamentally, one of the issues with the wording of the note in the draft definition of event cloud (above). I will address this in a later post. Comments? Paris, what do you think?

Copyright © 2007 by Tim Bass, All Rights Reserved.


CEP Blogosphere – Mid May Roundup

May 15, 2007

Posted by: Tim Bass

May has been an interesting month around the CEP blogosphere. Mark Palmer and the Apama team continue to lead the field with informative, interesting and provocative posts. I would like to personally commend Mark for his leadership in the community-at-large.

TIBCO’s CEP blog has been active thanks to posts by Alan Lundberg, Paul Vincent and (formerly) me. Brenda Michelson’s post, Event Processing Conversation Shifts from Research to Practitioners, did an excellent job listing the major event processing blogs in the ole’ blogosphere. David Luckham and Roy Schulte published the draft event processing glossary as part of ongoing work in the EPTS.

Khanderao, an Architect at Oracle Corp, entered the field with an inaccurate blog post related to CEP. In this post, Khanderao falls into the same old “database-centric trap” of misrepresenting CEP as just another high speed, continuous query event (data) stream processing technology. This mistaken view appears to be an ongoing “database versus distributed messaging” memetic remnant.

Marco of RuleCore has been unusually quiet lately after an excellent post back in March, My Two Types of Event Processing, where he did a nice job of describing his views on two event processing styles: time series filtering, and track and trace. At the end of April, Mark Tsimelzon of Coral8 blogged a bit of memetic frustration over similar technical discussions of ESP and CEP.

The normally lively CEP-Interest discussion group has been quiet as activity shifts from email forums to the CEP blogosphere. Maybe we will see more event processing blogging activity from the Dagstuhl seminar on Event Processing – May 6-11, 2007 like the one just in (updated earlier post) from Marco or the Wiki by Claudi?

Copyright © 2007 by Tim Bass, All Rights Reserved.


CEP History – CEP Engines Continue to Evolve

May 15, 2007

Posted by: Tim Bass


From a commercialization perspective, Mark Palmer and John Bates of Progress Apama have done an excellent job of beginning to document the history of CEP.

One of the biggest problems prior to commercialization of event processing engines was that each event processing application was yet-another hand-made custom implementation.

For example, in the military, event processing history goes back many years before the (commercial) history that Mark and John kindly offer. There are a number of books on the topic, in particular in the sensor fusion area, where processing distributed sensor data is the same as processing events; the events just happen to be sensor data. The implementations were (and still are) expensive, custom-built event processing applications.

Event processing was also a big part of the heritage of large-scale modelling and simulation, and much of the genesis of David Luckham’s DARPA-funded Rapide and CEP work follows that school of thought. Similarly, prior to the commercialization of CEP and ESP engines, these applications were (and still are) very expensive custom implementations. In many cases, today’s CEP engines are not yet advanced enough to be “white boxes” for extremely challenging command-and-control (C2) event processing applications.

One of the reasons is that all of these “extremely complex” event processing applications face a very large challenge, for example, task scheduling between distributed, heterogeneous event sources. Most commercial CEP and ESP engines are still relatively immature in their ability to manage the complexity of distributed scheduling tasks.

In addition, when event processing requires specialization, cooperation, and distributed object caching and distribution across CEP engines (event processing agents), another layer of scheduling complexity arises.

So, while we would all agree that we have come a long way in the commercialization of CEP and ESP engines, we would also agree that we have a long way to go as well.

Copyright © 2007 by Tim Bass, All Rights Reserved.


What is Complex Event Processing? (Part 6)

May 15, 2007

Posted by: Tim Bass

In What is Complex Event Processing? (Part 5), we discussed situation refinement, the functional component of event processing that describes refining multiple event objects in order to estimate and identify business situations and scenarios in real-time. Today, in Part 6 of What is Complex Event Processing, we discuss impact assessment – where detected business situations are compared, correlated, and/or analyzed in “what if” type of scenarios to determine and predict business consequences.

Event Processing Reference Architecture

After event detection and situation refinement, businesses are very concerned with ascertaining or predicting outcomes and financial gains or losses if a detected situational threat or opportunity materializes. Impact assessment is the functional component of event processing that is focused on the estimation and prediction of the priority, utility or cost of an estimated business situation, complex event or scenario.

At this stage of the CEP reference model (above), we estimate the impact of an assessed situation, which includes likelihood and/or cost/utility measures associated with potential outcomes. From this inference, loss projections and liabilities (or gains) may be estimated. In addition, resource allocation and processing priorities may be determined.

Opportunities and threats in business generally need to be predicted based upon an estimate of the current situation, known plans and predicted reactions. Examples of real-time predictive business use cases are:

– determining the expected consequences of a fraudster’s actions in an ecommerce scenario, given the current estimated threat to the business;

– estimating the consequences of a failure in a distributed computing application and the effects on other systems that interact with the failed component;

– estimating the potential profit if an algorithmic trade is executed on a tracked equity or basket of equities;

– predicting how delays in shipping affect the supply chain, including consumer choices and behavior;

– predicting network congestion and outages based on specific patterns of anomalous network behavior in real-time;

– assessing risk and losses in a potential aircraft collision based on information about the planes, their location and their cargo or passengers;

– predicting the impact of a viral epidemic on different geographic areas and populations;

– predicting cost savings based on optimizing network resources in a transportation or supply chain network; or,

– predicting potential losses if an identified class of missile reaches its projected target.

Impact assessment generally requires real-time correlation of historical data which resides in databases. This is represented by the Database Management component of the event processing reference architecture.
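The likelihood and cost/utility estimation described above can be sketched very roughly in Python (all numbers and situation names below are invented for illustration):

```python
# Toy impact assessment: combine the likelihood of an assessed situation
# with a cost estimate to get an expected impact, then rank situations
# for resource allocation and processing priority.

def expected_impact(likelihood, cost):
    """Expected loss (or gain): probability of the outcome times its cost/utility."""
    return likelihood * cost

situations = [
    {"name": "fraud_in_progress", "likelihood": 0.30, "cost": 100_000},
    {"name": "shipping_delay",    "likelihood": 0.80, "cost": 20_000},
    {"name": "component_failure", "likelihood": 0.05, "cost": 500_000},
]

# Highest expected impact first: this is the priority order for action.
ranked = sorted(situations,
                key=lambda s: expected_impact(s["likelihood"], s["cost"]),
                reverse=True)
for s in ranked:
    print(s["name"], expected_impact(s["likelihood"], s["cost"]))
```

Note how the ranking differs from ranking by likelihood alone: a rare but very costly failure can outrank a near-certain but cheap delay, which is precisely why impact assessment is a distinct step from situation detection.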

In my next post, What Is Complex Event Processing, Part 7, we will discuss another important area in CEP, process refinement – actions taken, parameters adjusted, resources allocated (for example) based on detected (and/or predicted) business situations and scenarios.

Copyright © 2007 by Tim Bass, All Rights Reserved.