The ART of Event Processing: Agility, Reuse, Transparency

January 18, 2008

The other day I discussed CEP in Layman’s Terms: Reuse and Agility. Today, our topic is CEP and transparency. One of the major benefits of “white box” event processing solutions is transparency, something not readily available or obvious in black-box solutions.

Friend and colleague John Bates, Progress Apama, often discusses the benefits of white-box algorithmic trading platforms in terms of faster time-to-market and other competitive advantages. I agree with John and would like to point out another key benefit, in simple layman’s terms: transparency.

For example, let’s say you have designed an event processing solution for operational risk management (ORM). It is time for your favorite auditors to come by and they wish to take a look at what is going on with that proprietary black-box ORM application running quietly in the server room.

The nice auditors ask you, “What does that application do?” and you reply “Well, it looks for evidence of insider trading,” and they ask “Do you mind if we ask how?” and you respond “Good question, do you mind waiting a moment while I get you the contact info for the vendor, because we don’t have access to the source code or the actual key indicators (KIs)?”

Now, let’s look at the white-box scenario:

Again, the nice auditors ask you, “What does that application do?” and you reply “Well, it looks for evidence of insider trading,” and they ask “Do you mind if we ask how?” and you respond “Not at all, sit down and we will pull up our insider trading key indicator models. These models are stored in XML format and viewable in our graphical KI design studio. We can print out the KI models for insider trading if you like!” and the smiling auditor says “Thank you, your system is much more transparent than the last place we visited!”

The same transparency helps when you are investigating why a KI that should have fired did not, or when performing root cause analysis to determine why a KI behind a wrong business decision was inaccurate.
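To make the white-box scenario concrete, here is a minimal sketch of what an auditor-readable KI model might look like. The XML schema, indicator names and fields below are hypothetical illustrations invented for this example, not any actual vendor’s format:

```python
# Hypothetical "white box" key indicator (KI) model: the detection logic is
# stored as data (XML) that can be inspected, printed, and audited.
import xml.etree.ElementTree as ET

KI_MODEL = """
<ki-model name="insider-trading">
  <indicator id="KI-1" description="Trade placed shortly before a material news release">
    <window>PT48H</window>
    <threshold>0.75</threshold>
  </indicator>
  <indicator id="KI-2" description="Unusual position size relative to trading history">
    <window>P30D</window>
    <threshold>0.90</threshold>
  </indicator>
</ki-model>
"""

def describe_model(xml_text: str) -> list[str]:
    """Render each indicator in a form you could hand to an auditor."""
    root = ET.fromstring(xml_text)
    return [
        f"{ind.get('id')}: {ind.get('description')} "
        f"(window={ind.findtext('window')}, threshold={ind.findtext('threshold')})"
        for ind in root.findall("indicator")
    ]

for line in describe_model(KI_MODEL):
    print(line)
```

The point is not the particular schema but that the detection logic is externalized and human-readable, rather than compiled into an opaque binary.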

So, CEP in layman’s terms is what we might refer to as the ART of event processing:

  • Agility
  • Reuse
  • Transparency

Please feel free to reuse these ideas, but please don’t forget to reference the author and this blog 🙂

Kindly share and reuse by reference, because all content in The CEP Blog is ©2007-2008 Tim Bass – All Rights Reserved. Thank you!


An Overture to the 2007 CEP Blog Awards

January 9, 2008

Before announcing the winners of the 2007 CEP Blog Awards I thought it would be helpful to introduce the award categories to our readers.

I have given considerable thought to how to structure The CEP Blog Awards. This was not an easy task, as you might imagine, given the confusion in the event processing marketspace. So here goes.

For the 2007 CEP Blog Awards I have created three event processing categories. Here are the categories and a brief description of each one:

The CEP Blog Award for Rule-Based Event Processing

Preface: I was also inclined to call this category “process-based event processing” or “control-based event processing” and might actually do so in the future. As always, your comments and feedback are important and appreciated.

Rule-based (or process-based) event processing is a major subcategory of event processing. Rule-based approaches to event processing are very useful for stateful event-driven process control, track and trace, dynamic resource management and basic pattern detection (see slide 12 of this presentation). Rule-based approaches are optimal for a wide range of production-related event processing systems.

However, as with any approach, there are engineering trade-offs. Rule-based systems tend not to scale well when the number of rules (facts) is large. Rule-based approaches can also be difficult to manage in a distributed, multi-designer environment. Moreover, rule-based approaches are suboptimal for self-learning and tend not to process uncertainty very well. Nevertheless, rule-based event processing is a very important CEP category.
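As an entirely illustrative sketch of the stateful pattern detection described above, consider a toy rule that fires when one event type follows another for the same tracked identifier. The event fields and the rule itself are invented here; production rule engines are, of course, far richer:

```python
# Toy stateful rule: fire when event type B follows event type A
# for the same tracked id (a minimal "followed by" pattern).
class FollowedByRule:
    def __init__(self, first: str, then: str):
        self.first, self.then = first, then
        self.pending = set()          # state: ids that have seen `first`

    def on_event(self, event: dict):
        eid, etype = event["id"], event["type"]
        if etype == self.first:
            self.pending.add(eid)
        elif etype == self.then and eid in self.pending:
            self.pending.discard(eid)
            return f"pattern matched for {eid}"
        return None

rule = FollowedByRule("order-placed", "order-cancelled")
events = [
    {"id": "A42", "type": "order-placed"},
    {"id": "B07", "type": "order-cancelled"},   # no prior placement: no match
    {"id": "A42", "type": "order-cancelled"},   # completes the pattern
]
alerts = [a for e in events if (a := rule.on_event(e))]
print(alerts)  # one alert, for A42
```

Note that the rule must keep state (`pending`) between events; that statefulness is exactly what makes large rule sets hard to scale and to coordinate across many designers.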

The CEP Blog Award for Event Stream Processing

Stream-centric approaches to event processing are also a very important overall category of event processing. Unlike a stateful, process-driven rule-based approach, event stream processing optimizes high performance continuous queries over sliding time windows. High performance, low latency event processing is one of the main design goals for many stream processing engines.

Continuous queries over event streams are generally designed to execute over intervals of milliseconds, seconds, or perhaps a bit longer. Process-driven event processing, on the other hand, can manage processes, resources, states and patterns over long time intervals, for example hours and days, not just milliseconds and seconds.
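To make the sliding-window idea concrete, here is a minimal sketch of a running average over a 10-second window. The mechanics are invented for illustration; real stream engines express this declaratively in a continuous query language rather than by hand:

```python
# Minimal sliding time-window aggregate: a continuous average over
# the last N seconds of events.
from collections import deque

class SlidingWindowAverage:
    def __init__(self, window_seconds: float):
        self.window = window_seconds
        self.events = deque()         # (timestamp, value) pairs in the window

    def on_event(self, ts: float, value: float) -> float:
        self.events.append((ts, value))
        # Evict events that have aged out of the window.
        while self.events and self.events[0][0] <= ts - self.window:
            self.events.popleft()
        return sum(v for _, v in self.events) / len(self.events)

q = SlidingWindowAverage(window_seconds=10.0)
print(q.on_event(0.0, 100.0))   # 100.0
print(q.on_event(5.0, 102.0))   # 101.0
print(q.on_event(12.0, 98.0))   # first event evicted: (102 + 98) / 2 = 100.0
```

Every arriving event both updates the state and produces a fresh query result, which is why low latency per event is the dominant design goal for these engines.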

Therefore, event stream processing tends to be optimized for a different set of problems than process-based (which I am calling rule-based this year) event processing. Similar to rule or process-based approaches, most current stream processing engines do not manage or deal with probability, likelihood and uncertainty very well (if at all).

The CEP Blog Award for Advanced Event Processing

For lack of a better term, I call this category advanced event processing. Advanced event processing will more than likely have a rule-based and/or a stream-based event processing component. However, to be categorized as advanced event processing, the software platform must also be able to perform more advanced processing that deals with probability, fuzzy logic and/or uncertainty. Event processing software in this category should also have the capability to automatically learn, or be trained, similar to artificial neural networks (ANNs).

Some of my good colleagues might prefer to call this category AI-capable event processing (or intelligent event processing), but I prefer to call this award category advanced event processing for the 2007 awards. If you like the term intelligent event processing, let’s talk about this in 2008!

Ideally, advanced event processing software should have plug-in modules that permit the event processing architect, or systems programmer, to select and configure one or more different analytical methods at design-time. The results from one method should be available to other methods; for example, the output of a stream processing module might be the input to a neural network (NN) or Bayesian belief network (BN) module. In another example of a pipeline operation, the output of a Bayesian classifier could be the input to a process- or rule-based event processing module within the same run-time environment.
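The pipeline idea can be sketched as follows. The toy Bayesian scorer and threshold rule below are stand-ins invented for illustration, not any vendor’s actual modules; the point is only that one module’s output feeds the next within the same run-time:

```python
# Toy analytic pipeline: a naive Bayesian scorer feeding a downstream
# threshold rule, composed at "design time" as a list of modules.
def bayes_score(event: dict) -> dict:
    """Toy Bayesian update: combine per-feature likelihood ratios
    into a posterior probability of 'suspicious', from a 1% prior."""
    prior_odds = 0.01 / 0.99
    likelihood_ratio = {"large": 20.0, "off-hours": 5.0}  # assumed values
    odds = prior_odds
    for feature in event["features"]:
        odds *= likelihood_ratio.get(feature, 1.0)
    event["p_suspicious"] = odds / (1.0 + odds)
    return event

def alert_rule(event: dict):
    """Downstream rule module consuming the classifier's output."""
    if event["p_suspicious"] > 0.5:
        return f"ALERT {event['id']} (p={event['p_suspicious']:.2f})"
    return None

pipeline = [bayes_score]            # modules selected at design time
events = [{"id": "T1", "features": ["large", "off-hours"]},
          {"id": "T2", "features": []}]
alerts = []
for ev in events:
    for module in pipeline:
        ev = module(ev)
    result = alert_rule(ev)
    if result:
        alerts.append(result)
print(alerts)  # only T1 crosses the 0.5 threshold
```

Because each module reads and writes plain event dictionaries, swapping the Bayesian scorer for, say, a trained neural network would not disturb the downstream rule module.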

For all three categories for 2007, there should be a graphical user interface for design-time construction and modeling. There should also be a robust run-time environment and most, if not all, of the other “goodies” that we expect from event processing platforms.

Most importantly, there should be reference customers for the software and the company. The CEP Blog Awards will be only given to companies with a proven and public customer base.

In my next post on this topic, I’ll name the Awardees for 2007. Thank you for standing by. If you have any questions or comments, please contact me directly.


Motor Vehicle Crashes and Complex Event Processing

December 30, 2007

The Research and Innovative Technology Administration (RITA) coordinates the Department of Transportation’s (DOT) research programs. RITA’s mission is to advance the deployment of multi-disciplinary technologies to improve the transportation system in the U.S.

Shaw-Pin Miaou, Joon Jin Song and Bani K. Mallick wrote a detailed paper, Roadway Traffic Crash Mapping: A Space-Time Modeling Approach, in RITA’s Journal of Transportation and Statistics. In their paper, the authors state that, “motor vehicle crashes are complex events involving the interactions of five major factors: drivers, traffic, roads, vehicles, and the environment.”

Miaou, Song and Mallick go on to say that “studies have shown that risk estimation using hierarchical Bayes models has several advantages over estimation using classical methods.” They also point out that “the overall strength of the Bayesian approach is its ability to structure complicated models, inferential goals, and analyses. Among the hierarchical Bayes methods, three are most popular in disease mapping studies: empirical Bayes (EB), linear Bayes (LB), and full Bayes methods.”
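To illustrate the flavor of empirical Bayes, the first of the three methods the authors list, here is a toy shrinkage estimator of site-level crash rates. The counts, exposures and crude method-of-moments weighting are invented for illustration and are not taken from the paper:

```python
# Toy empirical Bayes (EB) shrinkage: each site's raw crash rate is
# pulled toward the overall mean rate, with less shrinkage for sites
# that have more exposure (and thus more trustworthy data).
def eb_shrink(crashes: list[int], exposure: list[float]) -> list[float]:
    rates = [c / e for c, e in zip(crashes, exposure)]
    mean_rate = sum(crashes) / sum(exposure)
    # Crude between-site variance estimate (assumed, for illustration only).
    var = sum((r - mean_rate) ** 2 for r in rates) / len(rates)
    shrunk = []
    for r, e in zip(rates, exposure):
        # Weight: more exposure -> trust the site's own rate more.
        w = var / (var + mean_rate / e) if var > 0 else 0.0
        shrunk.append(w * r + (1 - w) * mean_rate)
    return shrunk

# A low-exposure site's high raw rate is pulled strongly toward the mean;
# a high-exposure site's rate barely moves.
print(eb_shrink(crashes=[5, 40], exposure=[1.0, 20.0]))
```

This is the sense in which hierarchical Bayes methods “structure complicated models”: weak, noisy evidence is stabilized by borrowing strength from the whole population of sites.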

Miaou, Song and Mallick directly reference two important problems that David Luckham recently mentioned during his keynote presentation at the 2007 Gartner Event Processing Symposium: traffic congestion management and global epidemic warning systems. In addition, Jean Bacon, professor of distributed systems in Cambridge University’s computer laboratory, was recently mentioned in the article Fusing data to manage traffic.

As Miaou, Song and Mallick point out, motor vehicle crashes are complex events requiring the correlation of five situational objects (drivers, traffic, roads, vehicles, and the environment). Each one of these five situational objects may itself be a complex event. The representation of each object requires complex event processing.

We often see these discussions and articles across the wire, for example, “Is BAM (Business Activity Monitoring) Dead?” or “Is it CEP or Operational BI (Business Intelligence)?” or “Is it Event-Driven SOA or Just Plain Old SOA?” Frankly speaking, these debates and discussions are red herrings.

What matters in solving real problems, as the paper by Miaou, Song and Mallick indicates, are real solutions, not buzzwords and three-letter acronyms. Please keep this in mind when using the term “complex event processing.”


OpenCourseWare: Get Smart for Complex Event Processing!

December 30, 2007

Ready to move beyond the basics of event processing? Perhaps you would like to beef up your Java skills? The Basics of Signal Processing? Or maybe you are interested in Advanced Complexity Theory? Artificial Intelligence? Computer Language Engineering? Queueing Theory?

Well then, put your feet up, relax and click on over to the Department of Electrical Engineering and Computer Science at MIT OpenCourseWare  (OCW) and enjoy their courses, freely available to anyone, anywhere. 

MIT’s OCW program freely shares lecture notes, exams, and other resources from more than 1800 courses spanning MIT’s entire curriculum, including many fields related to event processing. There are even RSS feeds on new courses as they hit the wire, so you don’t have to miss a thing! Also, check out the OCW Consortium.

Complex event processing is a multi-discipline approach for detecting both opportunities and threats in cyberspace in real time. Make it your New Year’s resolution to review a few OCW lectures and help advance the state-of-the-art of CEP!


End Users Should Define the CEP Market

December 17, 2007

My friend Opher mistakenly thought I was thinking of him when I related the story of the fish, as he replied, CEP and the Story of the Captured Traveller.

I must not have related the fish story very well, because to understand the story of the fish is to know that we are all like the fish in certain aspects of life, and there is nothing negative to be gleaned from the story.

However, on Opher’s point about CEP, I disagree. Just because the marketing people (not the market) have misdefined CEP, and the vendors are therefore drifting from the technology described in Dr. Luckham’s original CEP work, including his CEP book, does not mean we should change the context of CEP. So I don’t agree that we should redefine CEP, as David envisioned it, into Intelligent Event Processing (IEP) just because CEP, as today’s software vendors sell it, is really SEP (or whatever!). Please recall that David’s background at Stanford was AI, and he did not define CEP as the software vendors have defined it either!

The fact of the matter is that the software marketing folks have decided they are going to use Dr. Luckham’s book to sell software that does not perform as Dr. Luckham described or envisioned!   I make no apologies for being on the side of end users who actually need to solve complex problems, not sell software that underperforms.

As I mentioned, this positioning and repositioning does not help solve complex problems.   At the end of the day, we have problems to solve and the software community is not very helpful when they place form over substance, consistently. 

Furthermore, as most customers are saying, time and time again: “so what?” … “these COTS event processing platforms with simple joins, selects and rules do not solve my complex event processing problems.” “We already have similar approaches, on which we have spent millions of dollars, and they do not work well.”

In other words, the market is crying out for true COTS CEP solutions, but the software community is not yet delivering.  OBTW, this is nothing new.  In my first briefing to the EP community in January of 2006, I mentioned that CEP required stating the business problem, or domain problem, and then selecting the method or methods that best solve the problem or problems.

To date, the CEP community has not done this because they have no COTS tool set other than SEP engines (marketed as either ESP engines or CEP engines – and at least ESP was closer to being technically accurate.) 

Experienced end users are very intelligent. 

These end users know the complex event processing problems they need to solve, and they know the limitations of the current COTS approaches marketed by the CEP community. Even in Thailand, a country many of you might mistakenly think is not technologically advanced, there are telecommunications experts (who run large networks) working on very difficult fraud detection applications; they use neural networks and report very good results. Yet there is not one CEP vendor, that I know of, who offers true CEP capability in the form of neural nets.

Almost every major bank, telco, etc. has the same opinion, and the same problem. They need much more capability than streaming joins, selects and rules to solve their complex event processing problems that Dr. Luckham outlined in his book.   The software vendors are attempting to define the CEP market to match their capability; unfortunately, their capabilities do not meet the requirements of the vast majority of end users who have CEP problems to solve.

If the current CEP platforms were truly solving complex event processing problems, annual sales would be orders of magnitude higher. Hence, the users have already voted. The problem is that the CEP community is not listening.


CEP Center of Excellence for Cybersecurity at Software Park Thailand

December 16, 2007

In July 2007, at InformationSecurityAsia2007,  I unveiled an idea to create a cybersecurity CEP Center of Excellence (COE) in Thailand.  Under the collaborative guidance of Dr. Rom Hiranpruk, Deputy Director, Technology Management Center, National Science and Technology Development Agency (NSTDA), Dr. Prinya Hom-anek, President and Founder, ACIS Professional Center, and Dr. Komain Pipulyarojana, Chief National Security Section, National Electronics and Computer Technology Center (NECTEC), this idea continues to move forward.

Today, in a meeting with Mrs. Suwipa Wanasathop, Director, Software Park Thailand, and her executive team, we reached a tentative agreement to host the CEP COE at Software Park.   

The mission of Software Park Thailand is to be the region’s premier agency supporting entrepreneurs to help create a strong world-class software industry that will enhance the strength and competitiveness of the Thai economy.

Since 2001, Thailand’s software industry has experienced approximately 20% year-over-year (YOY) growth. Presently, Software Park Thailand supports a business-technology ecosystem with over 300 active participants employing over 40,000 qualified software engineers across a wide range of technology domains.

I am very pleased that Software Park Thailand is excited about the potential benefits of CEP in the area of cybersecurity and detection-oriented approaches to cyberdefense. The COE will be working with best-of-breed CEP vendors to build, test and refine rule-based (RBS), neural network (NN) based and Bayesian network (BN) based approaches (as well as other detection methods) for cybersecurity.

I will be announcing more details in the future, so stay tuned.  Please feel free to contact me if you have any questions.


The Asia Business Forum: Information Security Risk Assessment and Management (Day One)

December 11, 2007

Today is the opening day of the Information Security Risk Assessment and Management conference in Bangkok. Mr. Charoon Boonsanong, Lecturer, Faculty of Economics, Chulalongkorn University, opened the conference.

Dr. Komain Pipulyarojana, Chief National Security Section, National Electronics and Computer Technology Center, will lead off with a presentation on the Latest Trends, Standards and Threats for Information Security & Future Direction.   Dr. Komain also serves as the lead for ThaiCERT.    

Police General Yanaphon Youngyuen, Deputy Commissioner, Department of Special Investigation, Royal Thai Police, will present Legal Updates: Interacting with Law Enforcement After a Cyber Crime or Systems Intrusion & Its Impact on Today’s Business.

The last presentation before lunch is my presentation, CEP and SOA: An Event-Driven Architecture for Operational Risk Management.   

After lunch, Mr. Phillip Chong, Partner, Enterprise Risk Services, Deloitte Touche Tohmatsu Jaiyos Advisory Company Limited, will talk to us about Governance, Risk Management and Compliance (GRC) as a Model for the Management of Corporate Information.

The last presentation before I must rush off to fight the Bangkok traffic for a cross-town meeting is by Mr. David Old, Partner, Information Risk Management, KPMG Poomchai Business Advisory Limited.