Marc Adler: Analytics are an Integral Part of the CEP Stack

June 29, 2008

In Recent Buyouts, Marc Adler of Citigroup blogs: “Despite what the various pundits of the CEP world say, I still think that analytics are an integral part of the CEP stack.”

Marc also says something else I agree with: “… [TIBCO] Business Events [ … is …] a more workflow-oriented product, something that you would NOT use to pump Level2 quotes through and create algo apps.”

Kudos to Marc!  Very insightful. Keep on blogging!

TIBCO Leaps Ahead in CEP with Insightful Acquisition

June 24, 2008

TIBCO Software shows, yet again, why the team in Palo Alto far outpaces the rest of the field with their announced acquisition of Insightful.  

Everyone who follows The CEP Blog and my vision for the business use of CEP understands how much energy and passion I have put into explaining why the crude time-series analysis of streaming data cannot possibly solve the vast majority of complex business problems CEP must address. 

TIBCO’s acquisition of Insightful shows just how serious TIBCO is about working to make the vision of “Predictive Business” a reality. TIBCO means business, and a large part of what that means is helping customers solve their most challenging business integration problems, which can be summarized in CEP-speak as detecting opportunities and threats, in near real-time, as a core corporate competency.

If you spend a few moments on the Insightful web site, you will find a treasure trove of documentation discussing advanced statistical analytics that can be used in a number of mission-critical applications.

This is the class of analytics that forms the backbone of complex event processing. In fact, as I have often pointed out (to the dismay of some of my CEP colleagues), any software company that discusses CEP and does not support or advocate advanced analytics is selling snake oil. TIBCO obviously understands the difference between snake oil, smoke-and-mirrors marketing, and the technology it takes to solve real operational problems.

My hat is off, and warm congratulations, to the team in Palo Alto for demonstrating, yet again, why TIBCO is committed to solving real customer problems with realistic solutions.

Maybe TIBCO will evolve to mean “The Insightful Business Company” versus the tired and stale “The Information Bus Company” of yesteryear?

Disclaimer:  I have not been an employee of TIBCO for over a year. 

The Predictive Battlespace

June 11, 2008

Friend and colleague Don Adams, CTO World Wide Public Sector, TIBCO Software, explains how CEP can be used to sense, adapt and respond to complex situations in The “Predictive” Battlespace: Leveraging the Power of Event-Driven Architecture in Defense

Scheduling Agents with Rules Engines

April 5, 2008

Paul Vincent of TIBCO talks about agents in his post, CEP and Agents…

At the core, TIBCO’s BusinessEvents is a RETE-based rules engine, and rules engines are well suited to scheduling problems.  This makes perfect sense, since many of TIBCO’s customers deploy BusinessEvents in scheduling-oriented, not detection-oriented, solutions.

It should be pointed out, however, that scheduling is only one component of a CEP architecture.

Normally, the scheduling component of a distributed event processing architecture manages the intelligent scheduling of the sharing of data between distributed agents that are running a variety of analytics.

Simply stated, all agents are not rules engines; however, rules engines are often used to schedule the cooperation between analytical agents in a distributed agent-based architecture.
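To make the idea above concrete, here is a minimal Python sketch of a rule-driven scheduler routing events between analytical agents. All names (`Agent`, `run_rules`, the rule lambdas) are illustrative assumptions, not the TIBCO BusinessEvents API, and the "rules" are plain predicates rather than a real RETE network.

```python
# Sketch: rules schedule WHICH analytical agent sees WHICH event.
# The agents do the analytics; the rule loop only does scheduling.

class Agent:
    """A stand-in for an analytical agent; it just collects its inputs."""
    def __init__(self, name):
        self.name = name
        self.inbox = []

    def receive(self, data):
        self.inbox.append(data)

def run_rules(facts, rules, agents):
    """Fire every rule whose condition matches a fact; each rule's
    action routes the fact to an agent (cooperation scheduling)."""
    for fact in facts:
        for condition, action in rules:
            if condition(fact):
                action(fact, agents)

# Two analytical agents with different specialties.
agents = {"trend": Agent("trend"), "anomaly": Agent("anomaly")}

# Declarative routing rules: (condition, action) pairs.
rules = [
    (lambda f: f["type"] == "quote",
     lambda f, a: a["trend"].receive(f)),
    (lambda f: f.get("score", 0) > 0.9,
     lambda f, a: a["anomaly"].receive(f)),
]

facts = [{"type": "quote", "price": 101.5},
         {"type": "alert", "score": 0.95}]
run_rules(facts, rules, agents)
print(len(agents["trend"].inbox), len(agents["anomaly"].inbox))  # 1 1
```

A production rules engine would match these conditions incrementally (RETE) instead of rescanning every fact, but the division of labor is the same: rules decide the routing, agents do the analysis.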

Please Welcome Dr. Rainer von Ammon to The CEP Blog

February 12, 2008

Today is an especially joyful occasion on The CEP Blog. I am pleased to announce that one of the world’s top experts on CEP, Dr. Rainer von Ammon, has joined the blog.

Dr. Rainer von Ammon is managing director of the Centrum für Informations-Technology Transfer (CITT) in Regensburg. Until October 2005 he was Professor for Software Engineering, specializing in e-business infrastructures and distributed systems, at the University of Applied Sciences Upper Austria. Rainer still teaches there and at the University of Applied Sciences Regensburg. From 1998 to 2002, he worked as Principal Consultant and Manager for R+D Cooperations at BEA Systems (Central and Eastern Europe). Prior to this, he was Professor for Software Engineering in Dresden, with a focus on the development of applications with event-driven, object-oriented user interfaces and component-based application development. Before that, Rainer was manager of the Basic Systems group at Mummert + Partner Unternehmensberatung, Hamburg. After finishing his studies in Information Sciences at the University of Regensburg, he was project leader for Computer Based Office Systems (COBIS) from 1978 to 1983 and afterward founded a start-up company with some of his colleagues.

Some of you may recall my recent musings, A Bitter Pill To Swallow: First Generation CEP Software Needs To Evolve.   When you read Rainer’s excellent reply, you will quickly see why we are very pleased to have his thought leadership here at The CEP Blog.  Dr. von Ammon and his team are leading experts in CEP and related business integration domains.  Not only does he provide thought leadership, his team  researches, develops, implements and tests CEP solutions.   

In another example of his thought leadership, some of you might recall this post, Brandl and Guschakowski Deliver Excellent CEP/BAM Report, where Hans-Martin Brandl and David Guschakowski of the University of Applied Sciences Regensburg, Faculty of Information Technology/Mathematics, advised by Dr. von Ammon, completed an excellent CEP thesis, Complex Event Processing in the context of Business Activity Monitoring.

Please join me in extending a warm welcome for Dr. Rainer von Ammon to The CEP Blog.

A Bitter Pill To Swallow: First Generation CEP Software Needs To Evolve

February 8, 2008

Frankly speaking, the CEP market is now saturated with hype about all the great things CEP can do, detecting opportunities and threats in real time and supporting the decision cycle.  However, in my opinion, it is time for the software vendors and analysts to move beyond the marketing hype and demonstrate real operational value with strong end user success, something seriously lacking today.

I have advocated this evolution for two years, including the notion of expanding CEP capabilities with proven techniques for event processing that have worked well long before current “Not yet CEP but called CEP” software hit the marketplace and airwaves.

For example, in my first CEP/EP presentation in New York in 1Q 2006, I presented Processing Patterns for Predictive Business and talked about how the US military has implemented high performance detection-oriented systems for many years (in the art-and-science of multisensor data fusion, MSDF), and how every day, when we sit at home (or at work or in transit), we are comforted to know we are safe from missile attacks because of what I would also call “complex event processing.”   There is a very rich history of “CEP but not called CEP” behind the scenes keeping people safe and warm. (The same thing can be said with many similar examples of complex event processing in use today, but not called “CEP” by CEP software vendors.)

This is one reason why, when I read the “CEP history lessons,” I am amused at how, at times, the lessons appear self-serving rather than end-user serving.  There is so much rich event processing history and so many proven architectures in “CEP but not called CEP” (CEP that actually works, in practice, every day, long before it was called CEP).  It continues to puzzle me that a few people in the CEP/EP community continue to take the “we invented EP” view.  Quite frankly, the history we read is missing most, if not all, of the history and practice of MSDF.

When we take the current CEP COTS software offerings and apply them to these working “CEP but not called CEP” applications, the folks with real operational detection-oriented experience quickly cut through the hype because, based on their state of the practice, they are now seeking self-learning, self-healing “real CEP” systems.  They are not so excited about first-generation technologies full of promises from software vendors with only a few years of experience solving detection-oriented problems and very few real success stories.

The same is true for advanced fraud detection and other state-of-the-art detection-oriented processing of “complex events” and situations.  The state-of-the-art of complex event processing, in practice, is far beyond the first generation CEP engines on the market today. 

This is one of the reasons I have agreed with the IBM folks who are calling these first generation “CEP orchestration engines” BEP engines, because that view is closer to fact than fiction.  Frankly speaking again, process orchestration is much easier than complex detection with high situation detection confidence and also low false alarms.

Customers who are detection-savvy also know this, and I have blogged about a few of these meetings and customer concerns.  For example, please read my blog entry about a banker who was very sceptical at a recent wealth management conference in Bangkok.  I see this reaction all the time, in practice.

Complex problems are not new, and they still cry out for solutions.  Furthermore, many current-generation event processing solutions are already more advanced than the first-generation CEP engines on the “call it CEP” market today.  This is a very real inhibitor, in my opinion, to growth in the “call it CEP” software space today – and credibility may ultimately be “at risk.”  Caution is advised.

Candidly speaking again, there are too many red-herring CEP-related discussions and not enough solid results, given how long software vendors have been promoting CEP/EP (again, this is simply my opinion).  The market is in danger of eventually losing credibility, at least in the circles I travel and the complex problems I enjoy solving, because the capabilities of the (so-called) CEP technologies from software vendors in the (so-called) CEP space have been oversold; and, frankly speaking, I have yet to see tangible proof of “real CEP capabilities” in the road maps and plans of the current CEP software vendors.  This is disappointing.

This pill is bitter and difficult to swallow, but most of my life’s work has been advising, formulating and architecting real-time solutions for the end user (the C-level executives and the operational experts with the complex problems to solve).   CEP software must evolve and there needs to be more tangible results, not more marketing hype.

An Overture to the 2007 CEP Blog Awards

January 9, 2008

Before announcing the winners of the 2007 CEP Blog Awards I thought it would be helpful to introduce the award categories to our readers.

I have given considerable thought to how to structure The CEP Blog Awards. This was not an easy task, as you might imagine, given the confusion in the event processing marketspace. So here goes.

For the 2007 CEP Blog Awards I have created three event processing categories. Here are the categories and a brief description of each one:

The CEP Blog Award for Rule-Based Event Processing

Preface: I was also inclined to call this category “process-based event processing” or “control-based event processing” and might actually do so in the future. As always, your comments and feedback are important and appreciated.

Rule-based (or process-based) event processing is a major subcategory of event processing. Rule-based approaches to event processing are very useful for stateful event-driven process control, track and trace, dynamic resource management and basic pattern detection (see slide 12 of this presentation). Rule-based approaches are optimal for a wide range of production-related event processing systems.

However, just like any system, there are engineering trade-offs with this approach. Rule-based systems tend not to scale well when the number of rules (facts) is large. Rule-based approaches can also be difficult to manage in a distributed, multi-designer environment. Moreover, rule-based approaches are suboptimal for self-learning and tend not to process uncertainty very well. Nevertheless, rule-based event processing is a very important CEP category.
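The stateful, track-and-trace style of rule-based event processing described above can be sketched in a few lines of Python. This is an illustrative toy, not any vendor's engine: the "rule" is a hand-written predicate over accumulated per-entity state, whereas a real RETE engine would index and match conditions incrementally.

```python
# Hedged sketch of stateful, rule-based pattern detection (track and
# trace): flag any parcel that was "shipped" and is later "returned".

from collections import defaultdict

state = defaultdict(list)  # per-parcel event history (the working memory of "facts")

def on_event(event, alerts):
    parcel, status = event
    state[parcel].append(status)
    # Rule: IF parcel was shipped AND is now returned THEN alert.
    if "shipped" in state[parcel] and status == "returned":
        alerts.append(parcel)

alerts = []
for ev in [("p1", "shipped"), ("p2", "shipped"),
           ("p1", "delivered"), ("p2", "returned")]:
    on_event(ev, alerts)
print(alerts)  # ['p2']
```

Note how the state grows with every tracked entity and every rule consults it on each event; this is the scaling trade-off mentioned above when the number of rules and facts becomes large.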

The CEP Blog Award for Event Stream Processing

Stream-centric approaches to event processing are also a very important overall category of event processing. Unlike a stateful, process-driven rule-based approach, event stream processing optimizes high performance continuous queries over sliding time windows. High performance, low latency event processing is one of the main design goals for many stream processing engines.

Continuous queries over event streams are generally designed to execute over milliseconds, seconds, and perhaps somewhat longer time intervals. Process-driven event processing, on the other hand, can manage processes, resources, states and patterns over long time intervals, for example hours and days, not just milliseconds and seconds.

Therefore, event stream processing tends to be optimized for a different set of problems than process-based (which I am calling rule-based this year) event processing. Similar to rule or process-based approaches, most current stream processing engines do not manage or deal with probability, likelihood and uncertainty very well (if at all).
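A continuous query over a sliding time window, the core abstraction of this category, can be sketched as follows. The class name and units are illustrative assumptions; stream engines compile SQL-like window queries down to incremental state like this rather than recomputing the window from scratch.

```python
# Sketch of a continuous query: the running average price of
# quotes seen in the last 5 time units (a sliding time window).

from collections import deque

class SlidingAvg:
    def __init__(self, window):
        self.window = window
        self.events = deque()  # (timestamp, price) pairs inside the window
        self.total = 0.0

    def on_event(self, ts, price):
        self.events.append((ts, price))
        self.total += price
        # Evict events that fell out of the window (incremental update,
        # amortized O(1) per event -- this is the low-latency trick).
        while self.events and self.events[0][0] <= ts - self.window:
            _, old_price = self.events.popleft()
            self.total -= old_price
        return self.total / len(self.events)

q = SlidingAvg(window=5)
print(q.on_event(1, 10.0))  # 10.0
print(q.on_event(3, 20.0))  # 15.0
print(q.on_event(7, 30.0))  # 25.0  (the event at t=1 was evicted)
```

Each arriving event produces an updated answer immediately, which is exactly the "continuous query" contrast with a request-response database query.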

The CEP Blog Award for Advanced Event Processing

For lack of a better term, I call this category advanced event processing. Advanced event processing will more than likely have a rule-based and/or a stream-based event processing component. However, to be categorized as advanced event processing software, the software platform must also be able to perform more advanced event processing that can deal with probability, fuzzy logic and/or uncertainty. Event processing software in this category should also have the capability to automatically learn, or be trained, similar to artificial neural networks (ANNs).

Some of my good colleagues might prefer to call this category AI-capable event processing (or intelligent event processing), but I prefer to call this award category advanced event processing for the 2007 awards. If you like the term intelligent event processing, let’s talk about this in 2008!

Ideally, advanced event processing software should have plug-in modules that permit the event processing architect, or systems programmer, to select and configure one or more different analytical methods at design-time. The results from one method should be available to other methods; for example, the output of a stream processing module might be the input to a neural network (NN) or Bayesian belief network (BN) module. In another example of a pipeline operation, the output of a Bayesian classifier could be the input to a process- or rule-based event processing module within the same run-time environment.
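The pipeline idea above, stream module feeding a probabilistic module feeding a rule module, can be sketched with plain Python generators. Every name here is hypothetical, and the "probabilistic" stage is a deliberately trivial stand-in for a real Bayesian or neural model; only the plug-in composition pattern is the point.

```python
# Illustrative three-stage pipeline: windowing -> scoring -> rules.

def windowing_stage(events, size=3):
    """'Stream' module: emit the average of each trailing window."""
    for i in range(size, len(events) + 1):
        window = events[i - size:i]
        yield sum(window) / size

def classifier_stage(averages, threshold=50.0):
    """Toy probabilistic module: map each average to a pseudo P(anomaly).
    A real system would plug in a trained Bayesian or neural model here."""
    for avg in averages:
        p = min(1.0, avg / (2 * threshold))
        yield avg, p

def rule_stage(scored, p_alert=0.7):
    """Rule module: alert when the upstream probability is high enough."""
    return [avg for avg, p in scored if p >= p_alert]

events = [40, 45, 50, 90, 100, 110]
alerts = rule_stage(classifier_stage(windowing_stage(events)))
print(alerts)  # [80.0, 100.0]
```

Because each stage only consumes the previous stage's output, any module can be swapped at design-time without touching the others, which is the plug-in property described above.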

For all three categories for 2007, there should be a graphical user interface for design-time construction and modeling. There should also be a robust run-time environment and most, if not all, of the other “goodies” that we expect from event processing platforms.

Most importantly, there should be reference customers for the software and the company. The CEP Blog Awards will be only given to companies with a proven and public customer base.

In my next post on this topic, I’ll name the Awardees for 2007. Thank you for standing by. If you have any questions or comments, please contact me directly.