BEA Publishes Event Server 2.0 Docs

July 29, 2007

BEA has made its WebLogic® Event Server 2.0 documentation available to the general public in both HTML and PDF format. This is good news for everyone. Currently there are six PDF files:

There is also a large ZIP file with docs:

I will update our Event Processing Language (EPL) Survey and include BEA’s EPL in the near future.

Clouds (Partially Ordered Sets) – Streams (Linearly Ordered Sets) – Part 2

July 28, 2007

Originally posted in Yahoo! CEP-Interest

Here is my follow-up note on posets (partially ordered sets) and tosets (totally or linearly ordered sets) as background set theory for event processing, and in particular CEP and ESP.

In my last note, we discussed posets and tosets in the context of ESP (tosets) and CEP (posets) and we confirmed, via set theory, that tosets (chains) are a special case of posets with the added property of comparability. Kindly refer to this link for a quick review and this link for my prior post on set theory:

On CEP-Interest we have enjoyed excellent discussions of how many ESP software vendors process tosets (like stock market data for a particular stock) and reorder out-of-order events/transactions. The idea to note here, for our discussion on set theory, is that market data for a particular stock is a toset, ordered by the time of trade execution. If these transactions are then sent to an event processing server, for example a nice ESP server by one of our friends here, the ESP server may order the out-of-order toset, but the stream of market data is a toset, by definition.
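The reordering that ESP servers perform on such a toset can be sketched in a few lines. This is a minimal, hypothetical buffer-and-flush approach (the event tuples and the `max_delay` bound are assumptions for illustration, not any vendor's actual algorithm): events are held until no earlier-stamped event can still arrive, then emitted in execution-time order, restoring the toset.

```python
import heapq

def reorder(events, max_delay):
    """Reorder a slightly out-of-order stream by execution timestamp.

    Events within `max_delay` of the newest timestamp seen so far are
    buffered; anything older is safe to emit in order.
    """
    heap, newest = [], float("-inf")
    for ts, trade in events:
        heapq.heappush(heap, (ts, trade))
        newest = max(newest, ts)
        # Flush everything old enough that no earlier event can still arrive.
        while heap and heap[0][0] <= newest - max_delay:
            yield heapq.heappop(heap)
    while heap:
        yield heapq.heappop(heap)

out_of_order = [(1, "A"), (3, "C"), (2, "B"), (5, "E"), (4, "D")]
print(list(reorder(out_of_order, max_delay=2)))
# the trades come out ordered by execution time
```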

Now, in tosets, what are the main properties that define the notion of “order” or “comparability”? Well, there can be many, for example: time, causality, association, taxonomy, ontology, etc. Market data, of course, is generally processed as a toset where the order of execution is the comparability property of the toset. Generally, ESP applications do not look, at this point in time, for the causality behind why one trade executed at $9.213 per share and the next, 20 ms later, executed at $9.222 per share. Instead, ESP applications today run continuous queries across defined sliding time windows of market data and calculate some interesting value, such as VWAP.
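As an illustration of such a continuous query, here is a small sketch of VWAP over a sliding time window. The tuple layout, window size, and tick values are assumptions for illustration, not any product's API:

```python
from collections import deque

def vwap_window(trades, window):
    """Continuous VWAP over a sliding time window.

    `trades` is an iterable of (timestamp, price, quantity) tuples;
    yields (timestamp, vwap) after each trade arrives.
    """
    buf = deque()
    for ts, price, qty in trades:
        buf.append((ts, price, qty))
        # Evict trades that have slid out of the time window.
        while buf and buf[0][0] < ts - window:
            buf.popleft()
        notional = sum(p * q for _, p, q in buf)
        volume = sum(q for _, _, q in buf)
        yield ts, notional / volume

ticks = [(0, 9.213, 100), (0.02, 9.222, 200), (0.05, 9.218, 100)]
for ts, v in vwap_window(ticks, window=0.1):
    print(ts, round(v, 5))
```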

On the other hand, CEP addresses the more general poset case of event processing, where, for example, the set of events may or may not be linearly ordered in time and causality is unknown. In this case, the comparability property of tosets is missing, because the events may or may not be related, or comparable, from a cause-and-effect perspective. One obvious example is the CEP application of fraud detection, where many distributed events happen and we want to determine the cause, and the effect, in real time. The set of seemingly unrelated events is a poset because the relationship, or comparability, between all the members of the set of events is (currently) unknown. Dr. Luckham’s seminal CEP work in this area, at Stanford University, was based on debugging distributed systems, that is, on problems in causality.

If we understood all the cause-and-effect relationships in a set of events, and we could determine the order, then we effectively have transformed a poset into a toset, or an “event cloud” into an “event stream”. This is precisely what many classes of CEP applications are designed to do.
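This poset-to-toset transformation is, in essence, a topological sort: given known cause-and-effect relationships, produce one linear order consistent with them. A small sketch using Python's standard `graphlib` module (the event names and causal edges below are invented for illustration):

```python
from graphlib import TopologicalSorter

# Hypothetical cause-and-effect edges among five events.  The mapping
# reads "B was caused by A", etc.  The relation is a partial order:
# C and D share a cause but are incomparable to each other.
causes = {
    "B": {"A"},
    "C": {"B"},
    "D": {"B"},
    "E": {"C", "D"},
}

# Any topological order is one linearization compatible with the known
# partial order: the "event cloud" flattened into an "event stream".
stream = list(TopologicalSorter(causes).static_order())
print(stream)  # e.g. ['A', 'B', 'C', 'D', 'E']
```

Note that when events are incomparable (here C and D), more than one valid stream exists; discovering additional causal edges is what narrows the poset toward a single toset.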

Now, to close this note, let us consider the universal set of all events. Let us assume (to make this easier) that the master clock of the universe is so granular, so precise, say to a trillion decimal places (maybe like calculating Pi), that all events in the universe can be ordered in time. If we were only processing all events based on time order, then we would have a stream of events, or a toset.

On the other hand, if we are interested in causality in the universal set of events, then we have a poset, because we do not possess the observation and computing power to deal with such a set of events. So, what do we do?

We create subsets of the universal set, and we process these subsets, because we possess this computing and observational capability. For example, we can work on sets of data called “market transactions” and run VWAP across a stream of stock transactions. That set is a toset. When we create subsets of events from the set of distributed network application events, like on-line banking, and look for causes and effects among seemingly incomparable events, we are operating on a poset. The first application, on totally ordered events, is what vendors are calling ESP. The second class of event processing applications, based on processing posets, is what Dr. Luckham (and I and others) refer to as CEP.

Thank you for reading.

Clouds (Partially Ordered Sets) – Streams (Linearly Ordered Sets) – Part 1

July 28, 2007

Originally posted in Yahoo! CEP-Interest

We read interesting discussions in the CEP and ESP market regarding terms like “clouds” and “streams”. Sometimes we observe folks talking about these terms in the context of “processing time”, for example, reordering events as part of computational event processing.

A closer examination of posets and linearly ordered sets leads to an understanding, or formality, which is independent of the computational requirements. Sets, formally, are finite or infinite collections of objects (in our case, events) in which order has no significance, and multiplicity is generally ignored. Relations, in set theory, are any subset of a Cartesian product.

A relation R on a set A is called a partial order if R is reflexive, antisymmetric, and transitive. The set A together with the partial order R is called a partially ordered set or poset, and is denoted (A,R). This is what Dr. Luckham kindly reminds us is the formal definition of an “event cloud”.

If (A, R) is a poset, we say A is totally (linearly) ordered if for all x, y in A, either xRy or yRx. In this case R is called a total order. A linearly ordered set has the same three properties of posets (reflexive, antisymmetric, and transitive) with the addition of a fourth, comparability (the trichotomy law). It is the addition of the property of comparability to posets that creates linearly ordered sets, what Dr. Luckham refers to as “event streams”.
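These definitions are easy to check mechanically. The following sketch tests a relation for the three poset properties and then for comparability, using divisibility (a poset that is not a toset) and the usual ≤ order (a toset) on a small set; all names here are illustrative:

```python
from itertools import product

def is_partial_order(A, R):
    """Check that relation R on set A is reflexive, antisymmetric, transitive."""
    reflexive = all((x, x) in R for x in A)
    antisymmetric = all(
        not ((x, y) in R and (y, x) in R and x != y)
        for x, y in product(A, A)
    )
    transitive = all(
        (x, z) in R
        for x, y in product(A, A) if (x, y) in R
        for z in A if (y, z) in R
    )
    return reflexive and antisymmetric and transitive

def is_total_order(A, R):
    """A total order is a partial order plus comparability: xRy or yRx."""
    comparable = all((x, y) in R or (y, x) in R for x, y in product(A, A))
    return is_partial_order(A, R) and comparable

A = {1, 2, 3}
divides = {(x, y) for x, y in product(A, A) if y % x == 0}  # 2, 3 incomparable
leq = {(x, y) for x, y in product(A, A) if x <= y}

print(is_partial_order(A, divides), is_total_order(A, divides))  # True False
print(is_partial_order(A, leq), is_total_order(A, leq))          # True True
```

Divisibility fails only the comparability test (neither 2 divides 3 nor 3 divides 2), which is exactly the property separating a “cloud” from a “stream”.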

The abstract properties of these events are based on the intrinsic relationships between events in the set, easily observed from the formal definitions of posets and linearly ordered sets. It follows, along this line of set-theoretic analysis, that the metaphor “event cloud” is a partially ordered set of events and an “event stream” is a linearly ordered set of events. The difference between the two is the fourth additional property of comparability (the trichotomy law) applied to posets, creating the special case of posets called linearly or totally ordered sets of events.

Kindly keep in mind that we are discussing the intrinsic nature of events and event relationships (from set theory). For example, the fact that events arrive out-of-order (from a temporal perspective) demonstrates that there is an underlying order (it is a linearly ordered set). Also, please keep in mind that the term “relation” in this context can be taxonomic, causal, partonomic, and/or temporal, for example. In other words, the temporal relationship often discussed in event stream processing is only one of a number of possible interesting relations in posets.

In a future post, I will endeavor to describe the significance of the set-relationship principles of “event comparability” and “event incomparability” in the context of ESP (event comparability, event streams) and the general case of CEP (event sets where events are partially ordered and not necessarily comparable).

Bending CEP for Rules

July 27, 2007

I think James Taylor and my good friend Paul Vincent should be mindful not to reduce CEP (accidentally or intentionally) to rule-based systems, and should broaden their perspectives and blog entries. In the original work on CEP by Dr. Luckham, the point of CEP is to solve complex problems in many problem domains, many of which require backward chaining, uncertainty principles, statistical methods and more. Rule-based systems are interesting and useful, congruent with expert systems, but they also have well-documented limitations (see the references below) in the classes of complex problems they can efficiently address.

Both James and Paul have excellent backgrounds in rule-based systems and have worked together in this area; on the other hand, CEP is not simply “rules and events” or “rules with EDA” etc.

Dr. Luckham’s background as a distinguished professor at Stanford was in AI, including debugging large-scale distributed systems and performing complex network security research for DARPA. In all of these application areas, there is a known limit to the usefulness of rule-based approaches for complex classes of decision support systems that require statistical methods to mitigate uncertainty. Rule-based systems are very useful, but they are suboptimal for the challenges of more complex decision support services that are better addressed by statistical and stochastic methods designed for systems with uncertainty – the problem set addressed by Dr. Luckham’s original CEP work.

I enjoy reading James and Paul supporting each other in the area of rules-based approaches to CEP; but I hope the “business rules folks” will keep in mind that CEP was designed to be significantly broader than rule-based decision support.


Reference 1: Rule-based systems are only feasible for problems for which any and all knowledge in the problem domain can be written in the form of if-then rules and for which this problem space is not large.

Reference 2: Abstract: “We demonstrate that classes of dependencies among beliefs held with uncertainty cannot be represented in rule-based systems in a natural or efficient manner. We trace these limitations to a fundamental difference between certain and uncertain reasoning. In particular, we show that beliefs held with certainty are more modular than uncertain beliefs. We argue that the limitations of the rule-based approach for expressing dependencies are a consequence of forcing non-modular knowledge into a representation scheme originally designed to represent modular beliefs….”

Reference 3: “Rule based systems have no ability to automatically learn from their mistakes, nor do they have any way of determining information from their environment. As such, their use is usually limited to very simple problems that have a finite, known set of possible states.”

Reference 4: Broad Google search on limitations of rule-based systems.

CEP Soapbox: Moving from Marketing to Modelling

July 27, 2007

Many of the folks in the event processing community are my dear friends, so I hesitate to stand up on my vulnerable soapbox and speak a piece of my mind. If you would be so kind to permit me to get on my soapbox without naming names or providing links, that would be much appreciated.

I read papers and blogs from friends and colleagues on topics such as CEP and EDA, CEP and BPM, CEP and SOA, CEP and business rules, and so forth, and so on; marketing and positioning CEP without substance. Making matters more noteworthy is the blogosphere marketing where folks trackback or reference each other’s positioning and marketing comments, positioning CEP as everything under the sun (EDA, SOA, BPM), and missing the point of what CEP actually is!

Whoa ….

If you happened to read my last post on an “event cloud generator”, or followed the discussion I started in the CEP-Interest group on Yahoo!, you can see that the community has yet to agree on what an event cloud is, much less generate one! So, we all read positioning statement after positioning statement on CEP as EDA, as SOA, as BPM, as BI, as decision support, as rules, and how great it is; but we can’t yet model and simulate these “complex events” everyone is talking about.


Perhaps this is why CEP is influenced by something I now call positioning (or marketing) reductionism: reducing the promise of CEP to something legacy, like BPM or business rules, or something trendy, like SOA. Maybe this is why we still have good and very capable people writing papers on why events are important, but we don’t have user-friendly modelling and simulation tools to help people understand how to use CEP.

Anyway, the sun is setting over the beautiful Chao Phraya river in Bangkok as I type to you from the 26th floor of the Royal Orchid Sheraton. Maybe you are thinking “why would anyone be standing on their CEP soapbox from such a beautiful place?”

An Event Cloud Generator for CEP Testing

July 24, 2007

Alexander Widder (Centrum für Informations-Technologie Transfer GmbH, Germany) and I have been discussing the need for an event cloud generator that could be used for generating CEP scenarios for testing and evaluation purposes. For example, integrating flat files from UNIX/Linux syslog generators (if one exists, need to check) with an open source ESB like Mule (from MuleSource) might be interesting.

Another possibility (there are many possibilities, naturally) would be to use open source ESP software, for example Esper, to generate a stream of events that would be published to an open ESB like Mule to model and simulate (or generate) an event cloud.
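To make the idea concrete, here is a minimal, vendor-neutral sketch of what such a generator might look like: several sources emit syslog-style events on independent local clocks, so the merged output carries only a partial (per-source) order. Every source name and field below is an invented assumption, not a proposal for a standard:

```python
import json
import random

SOURCES = ["auth", "router", "app-server"]
LEVELS = ["INFO", "WARN", "ERROR"]

def generate_cloud(n, seed=42):
    """Yield n JSON-encoded events from independently clocked sources.

    Events from one source are totally ordered by local_ts; across
    sources there is no shared clock, so the merged set is a poset.
    """
    rng = random.Random(seed)
    clocks = {s: 0.0 for s in SOURCES}  # independent per-source clocks
    for _ in range(n):
        src = rng.choice(SOURCES)
        clocks[src] += rng.uniform(0.001, 0.5)
        yield json.dumps({
            "source": src,
            "local_ts": round(clocks[src], 3),
            "level": rng.choice(LEVELS),
            "msg": f"event from {src}",
        })

for line in generate_cloud(5):
    print(line)
```

In a fuller prototype the JSON lines would be published to an ESB topic rather than printed, and the per-source clocks could be skewed or jittered to exercise a CEP engine's reordering and correlation logic.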

We all read a lot of posts, marketing material and blog entries about numerous CEP and ESP topics related to event processing; but we don’t have an open, standards-based CEP event generator that can be used to model and simulate CEP scenarios. I think it is time for a group of “vendor neutral” folks to work together in this important area. Anyone interested? Please comment.

CEP with 40 Gigabits Per Second?

July 20, 2007

In the early 90’s, while consulting at Sprint, I worked with Peter Lothberg, who is, without a doubt, one of the world’s top networking experts. I remember Peter as a super-genius who debugged Cisco router software live on the major Internet routing exchanges back then. Those days were quite controversial (and political), and some of the commercial routing policies of the good ole’ days were the motivation for my 1997 IEEE paper on exterior gateway routing protocols.

Today, I saw this article about Peter’s mother, Swedish woman gets superfast Internet – Peter has connected her to the Internet at 40 Gigabits per second, her first Internet connection! I know my Mom would like to connect to the Internet at this speed if she could filter out all the spam, pop-ups, and other malware that makes life on the net really exhausting at times. In the words of my dear sweet Mom, who connected to the Internet 14 years before Peter’s mother, “The Internet Ain’t No Fun Anymore!”

This is an area where CEP can help, and perhaps, someday, make the Internet a safer, cleaner, and more fun place for folks! Bayesian filtering now dominates anti-spam technology, and I am amazed at the accuracy of Google’s GMail spam filters. Bayesian techniques are also dominant in fraud detection and other CEP-related solution domains, including Bayesian diagnostic applications.
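For readers curious how Bayesian filtering works in miniature, here is a toy naive Bayes spam scorer with add-one smoothing; the training phrases are made up, and no real filter is anywhere near this simple:

```python
import math
from collections import Counter

# Tiny invented training corpora.
spam = ["win money now", "free money offer", "win free prize"]
ham = ["meeting at noon", "project status update", "lunch at noon"]

def train(docs):
    """Word counts and total token count for one class."""
    counts = Counter(w for d in docs for w in d.split())
    return counts, sum(counts.values())

spam_counts, spam_total = train(spam)
ham_counts, ham_total = train(ham)
vocab = set(spam_counts) | set(ham_counts)

def spam_score(message):
    """Log-odds that `message` is spam, with add-one smoothing."""
    score = 0.0
    for w in message.split():
        p_spam = (spam_counts[w] + 1) / (spam_total + len(vocab))
        p_ham = (ham_counts[w] + 1) / (ham_total + len(vocab))
        score += math.log(p_spam / p_ham)
    return score

print(spam_score("free money"))      # positive: looks like spam
print(spam_score("status meeting"))  # negative: looks like ham
```

The same accumulate-evidence-in-log-odds pattern carries over to fraud detection and diagnostics, which is why Bayesian methods show up across so many CEP-related domains.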

The state-of-the-art (and promise) of CEP is realized when distributed applications are processed by high speed, low latency networks. Imagine the complex processing we can do when event processing agents (EPAs) are cooperating at 40 gigabits per second!

On the other hand, think of the challenges of trying to filter out all the spam, malware, and other malicious code flying at your Mom, or your children, at 40 Gigabits per second! This is one of the biggest challenges in cyberspace, a virtual world where information is transmitted globally near lightspeed, creating both opportunities and threats; and one of the reasons I remain excited about evolving CEP to solve some of our most challenging event processing problems in the future.