A Bitter Pill To Swallow: First Generation CEP Software Needs To Evolve

February 8, 2008

Frankly speaking, the CEP market is now saturated with hype about all the great things CEP can do, detecting opportunities and threats in real time and supporting the decision cycle.  However, in my opinion, it is time for the software vendors and analysts to move beyond the marketing hype and demonstrate real operational value with strong end user success, something seriously lacking today.

I have advocated this evolution for two years, including the notion of expanding CEP capabilities with proven event processing techniques that were working well long before the current “not yet CEP but called CEP” software hit the marketplace and the airwaves.

For example, in my first CEP/EP presentation in New York in 1Q 2006, I presented Processing Patterns for Predictive Business and talked about how the US military has implemented high-performance detection-oriented systems for many years (in the art and science of multisensor data fusion, MSDF), and how every day, whether we are at home, at work or in transit, we are comforted to know we are safe from missile attacks because of what I would also call “complex event processing.”   There is a very rich history of “CEP but not called CEP” behind the scenes keeping people safe and warm.  The same can be said of many similar examples of complex event processing in use today, but not called “CEP” by CEP software vendors.

This is one reason, when I read the “CEP history lessons,” I am amused at how, at times, the lessons appear self-serving, not end-user serving.  There is so much rich event processing history, and so many proven architectures, in “CEP but not called CEP” (CEP that actually works, in practice, every day, and did so long before it was called CEP).  It continues to puzzle me that a few people in the CEP/EP community continue to take the “we invented EP” view.  Quite frankly, the history we read is missing most, if not all, of the history and practice of MSDF.

When we take the current CEP COTS software offerings and apply them to these working “CEP but not called CEP” applications, the folks with real operational detection-oriented experience quickly cut through the hype because, given their state of the practice, they are now seeking self-learning, self-healing, “real CEP” systems.  They are not so excited about first-generation technologies full of promises from software vendors with only a few years of experience in solving detection-oriented problems and very few real success stories.

The same is true for advanced fraud detection and other state-of-the-art detection-oriented processing of “complex events” and situations.  The state-of-the-art of complex event processing, in practice, is far beyond the first generation CEP engines on the market today. 

This is one of the reasons I have agreed with the IBM folks who are calling these first-generation “CEP orchestration engines” BEP engines, because that view is closer to fact than fiction.  Frankly speaking again, process orchestration is much easier than complex detection with high situation-detection confidence and a low false-alarm rate.

Customers who are detection-savvy also know this, and I have blogged about a few of these meetings and customer concerns.  For example, please read my blog entry about a banker who was very skeptical at a recent wealth management conference in Bangkok.  I see this reaction all the time, in practice.

Complex problems are not new, and they still cry out for solutions.  Furthermore, many current-generation event processing solutions are already more advanced than the first-generation CEP engines on the “call it CEP” market today.  This is a very real inhibitor, in my opinion, to growth in the “call it CEP” software space – and credibility may ultimately be “at risk.”  Caution is advised.

Candidly speaking again, there are too many red-herring CEP-related discussions and not enough solid results, given the time software vendors have been promoting CEP/EP (again, this is simply my opinion).  The market is in danger of eventually losing credibility, at least in the circles I travel and the complex problems I enjoy solving, because the capabilities of the (so-called) CEP technologies from software vendors in the (so-called) CEP space have been oversold; and, frankly speaking, I have yet to see tangible proof of “real CEP capabilities” in the road maps and plans of the current CEP software vendors.  This is disappointing.

This pill is bitter and difficult to swallow, but most of my life’s work has been advising, formulating and architecting real-time solutions for the end user (the C-level executives and the operational experts with the complex problems to solve).   CEP software must evolve and there needs to be more tangible results, not more marketing hype.


BEP is BEP, CEP is CEP

January 24, 2008

Joe McKendrick, in Taking the ‘complex’ out of complex event processing, makes a case for renaming CEP as BEP.

Joe references IBM’s Sandy Carter, as I did in my post earlier today, IBM Says Business Event Processing is Not CEP.

Joe wants to change the word “complex” to “business” in CEP because he believes the word “complex” is not good for marketing.

The problem with Joe’s approach, as I see it, is that CEP is different from BEP.  However, I remain open-minded on the topic.

There is quite a difference between event-driven, orchestration-oriented processing (BEP) and situation-detection-oriented event processing (CEP).

BEP is, for the most part, about orchestrating event-driven business processes.

CEP is about detecting opportunities and threats (situations) in real time.
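
To make the contrast concrete, here is a minimal Python sketch of the two styles (the event names, thresholds and classes are invented for illustration, not taken from any vendor’s API).  The BEP-style handler deterministically advances a business process one step per event; the CEP-style detector must correlate many events over a time window before it can declare that a situation exists.

```python
from collections import deque

# BEP-style orchestration: a single event advances a business process one step.
def on_order_received(order, process_log):
    process_log.append(("reserve_inventory", order["id"]))
    process_log.append(("schedule_payment", order["id"]))

# CEP-style detection: correlate many events over a time window to infer a situation.
class FailedLoginDetector:
    """Flag a possible intrusion: N failed logins on one account within W seconds."""
    def __init__(self, threshold=5, window_secs=60):
        self.threshold = threshold
        self.window_secs = window_secs
        self.failures = {}  # account -> deque of event timestamps

    def on_failed_login(self, account, timestamp):
        q = self.failures.setdefault(account, deque())
        q.append(timestamp)
        while q and timestamp - q[0] > self.window_secs:
            q.popleft()  # expire events that fell outside the window
        return len(q) >= self.threshold

log = []
on_order_received({"id": 42}, log)  # BEP: the next steps are known in advance
detector = FailedLoginDetector()
alerts = [detector.on_failed_login("alice", t) for t in range(5)]
print(log, alerts[-1])  # alerts[-1] is True: a situation was detected
```

The orchestration code knows its outcome in advance; the detection code only reaches a conclusion after enough corroborating events arrive.  That, in a nutshell, is the difference.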

It is not clear to me that simply renaming CEP as BEP addresses the core technical and business differences.


Apama: Fraud Detection and Heat Maps

January 2, 2008

A few days ago in Visualization Reloaded I touched upon the subject of heat maps.  In that post the application context was monitoring a massively parallel online gaming platform using a combination of event processing technologies by StreamBase and SL.

Today, I was reminded of another heat map, this one created by Progress Apama, during a leisurely morning viewing of a Fox Business News video interview with John Bates.  This time the context is the detection of patterns of insider trading.

Apama Heat Map

In the graphic above (click the image for a larger view), Apama uses a heat map to visualize suspicious trading activity in real time.   Also, you might be interested to know that the cool heat map in this use case is based on the event processing visualization platform by SL Corporation, similar to the heat map in the gaming use case by StreamBase, Simutronics and SL.
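
If you have not seen one of these displays, the idea is easy to sketch.  The toy Python example below renders a heat map over simulated per-symbol anomaly scores; the data, scoring and axes are invented for illustration and have nothing to do with SL’s or Apama’s actual products.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical anomaly scores per (symbol, time bucket); simulated data only.
rng = np.random.default_rng(7)
scores = rng.random((10, 24)) * 0.4   # 10 symbols x 24 hourly buckets, mostly "cool"
scores[3, 14:17] = 0.95               # inject a "hot" cluster of suspicious trades

fig, ax = plt.subplots()
im = ax.imshow(scores, aspect="auto", cmap="hot", vmin=0.0, vmax=1.0)
ax.set_xlabel("trading-hour bucket")
ax.set_ylabel("symbol index")
ax.set_title("Suspicious-activity heat map (simulated)")
fig.colorbar(im, label="anomaly score")
plt.show()
```

The value of the display is that the “hot” cluster jumps out of hundreds of otherwise unremarkable cells, which is exactly what a human monitoring real-time trading activity needs.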

In the Fox Business interview, John mentions an amazing statistic: during certain business situations, like mergers and acquisitions, experts have estimated that up to 30 percent of trading activity can be linked to insider trading.   The event processing goal, of course, is to detect fraud sooner rather than later, minimizing fraudulent market transactions and their influence on the market.


Motor Vehicle Crashes and Complex Event Processing

December 30, 2007

The Research and Innovative Technology Administration (RITA) coordinates the Department of Transportation’s (DOT) research programs.  RITA’s mission is to advance the deployment of multidisciplinary technologies to improve the transportation system in the U.S.

Shaw-Pin Miaou, Joon Jin Song and Bani K. Mallick wrote a detailed paper, Roadway Traffic Crash Mapping: A Space-Time Modeling Approach, in RITA’s Journal of Transportation and Statistics.    In their paper, the authors state that, “motor vehicle crashes are complex events involving the interactions of five major factors: drivers, traffic, roads, vehicles, and the environment.”

Miaou, Song and Mallick go on to say that “studies have shown that risk estimation using hierarchical Bayes models has several advantages over estimation using classical methods.”    They also point out that “the overall strength of the Bayesian approach is its ability to structure complicated models, inferential goals, and analyses. Among the hierarchical Bayes methods, three are most popular in disease mapping studies: empirical Bayes (EB), linear Bayes (LB), and full Bayes methods.”
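
To give a flavor of what “hierarchical Bayes over classical methods” means in this setting, here is a toy empirical Bayes (EB) sketch under a gamma-Poisson model, the simplest of the three methods they list.  The crash counts, exposures and moment estimates below are invented for illustration; the paper’s actual models are much richer space-time formulations.

```python
import numpy as np

# Toy empirical Bayes (EB) shrinkage for crash rates under a gamma-Poisson model.
y = np.array([2, 18, 45, 8, 60])          # crash counts per road segment (invented)
e = np.array([5.0, 6.0, 9.0, 4.0, 10.0])  # exposure, e.g. million vehicle-miles

raw = y / e                               # classical per-segment rate estimates
m = raw.mean()
# Method-of-moments estimate of the between-segment (prior) variance:
prior_var = max(raw.var() - m * (1.0 / e).mean(), 1e-9)

beta = m / prior_var                      # prior: rate ~ Gamma(alpha, beta)
alpha = m * beta

eb = (alpha + y) / (beta + e)             # posterior means shrink noisy rates toward m
print(np.round(raw, 2), "->", np.round(eb, 2))
```

The segments with little exposure (and therefore noisy classical estimates) are pulled strongly toward the overall mean, while well-measured segments barely move, which is precisely the advantage the authors describe.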

Miaou, Song and Mallick directly reference two important problems that David Luckham recently mentioned during his keynote presentation at the 2007 Gartner Event Processing Symposium: traffic congestion management and global epidemic warning systems.  In addition, Jean Bacon, professor of distributed systems in Cambridge University’s Computer Laboratory, was recently mentioned in the article Fusing data to manage traffic.

As Miaou, Song and Mallick point out, motor vehicle crashes are complex events requiring the correlation of five situational objects (drivers, traffic, roads, vehicles, and the environment).  Each of these five situational objects may itself be a complex event.  The representation of each object requires complex event processing.

We often see these discussions and articles across the wire, for example, “Is BAM (Business Activity Monitoring) Dead?” or “Is it CEP or Operational BI (Business Intelligence)?” or “Is it Event-Driven SOA or Just Plain Old SOA?”  Frankly speaking, these debates and discussions are red herrings.

What is important to solving real problems, as the complex event processing paper by Miaou, Song and Mallick indicates, is real solutions, not buzzwords and three-letter acronyms.  Please keep this in mind when using the term “complex event processing.”


OpenCourseWare: Get Smart for Complex Event Processing!

December 30, 2007

Ready to move beyond the basics of event processing?   Perhaps you would like to beef up your Java skills?   The Basics of Signal Processing?   Or maybe you are interested in Advanced Complexity Theory?   Artificial Intelligence?   Computer Language Engineering?   Queueing Theory?

Well then, put your feet up, relax and click on over to the Department of Electrical Engineering and Computer Science at MIT OpenCourseWare  (OCW) and enjoy their courses, freely available to anyone, anywhere. 

MIT’s OCW program freely shares lecture notes, exams, and other resources from more than 1800 courses spanning MIT’s entire curriculum, including many fields related to event processing.     There are even RSS feeds on new courses as they hit the wire, so you don’t have to miss a thing!  Also, check out the OCW Consortium.

Complex event processing is a multi-discipline approach for detecting both opportunities and threats in cyberspace, in real time.   Make it your New Year’s resolution to review a few OCW lectures and help advance the state-of-the-art of CEP!


Adapters and Analytics: COTS? NOT!

December 26, 2007

Marc Adler shows why his musings are rapidly becoming one of my “must read” blogs in his post, CEP Vendors and the Ecosystem.

We have been making similar points in the event processing blogosphere, namely about the importance of adapters and analytics.   Today, event processing vendors are surprisingly weak in both areas.

For one thing, there was way too much emphasis on rules-based analytics in 2007.  Where are the rest of the plug-and-play commercial analytics end users need for event processing??

And another thing….. 🙂

Why are there so few choices of adapters and why do we have to write our own??

Sometimes I think that if I read another press release touting 500,000 events per second, I’m going to shout out: the event processing software on the market today cannot even connect to a simple UNIX domain socket out of the box, so how about ZERO events per second!
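
For the record, such an adapter is not hard to sketch.  Below is a minimal Python example that reads newline-delimited JSON events from a UNIX domain socket and hands each one to an engine callback; the socket path and event format are hypothetical, and a production adapter would obviously need reconnection, back-pressure and error handling.

```python
import json
import socket

SOCKET_PATH = "/tmp/events.sock"  # hypothetical path published by the event producer

def run_adapter(on_event):
    """Read newline-delimited JSON events from a UNIX domain socket."""
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as sock:
        sock.connect(SOCKET_PATH)
        buffer = b""
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                break                       # producer closed the connection
            buffer += chunk
            while b"\n" in buffer:
                line, buffer = buffer.split(b"\n", 1)
                on_event(json.loads(line))  # deliver one event to the engine

# Example usage: run_adapter(lambda event: print("engine received:", event))
```

Twenty-odd lines.  There is no good reason this kind of connectivity should be left as an exercise for every customer.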

The bottom line is that the market is still wide open for a software vendor to come to the party with a wide array of plug-and-play, grab-and-go, adapters and analytics.  

Folks are talking COTS, but more often it is NOTS.


Complex Event Processing with Esphion Neural Agents

December 19, 2007

Detection-oriented technologies generally fall into two broad areas: signature-based detection and anomaly-based detection.    Complex event processing (CEP) is also a detection-oriented technology, so we can readily understand that CEP applications must also fall within the same two general areas.

Signature-based detection is sometimes referred to as static detection because the technology relies on pre-defined rules, filters, and signatures to match known patterns.  At the most fundamental level, a virus-checking program is an example of a signature-based system.

On the other hand, anomaly-based detection systems strive to maintain a baseline of what is considered normal and then match patterns outside normal operating parameters, often using adaptive or artificial intelligence techniques.

Experts know that both anomaly- and signature-based detection methods are important, and each has its own challenges and engineering tradeoffs.  For example, signature-based systems tend to generate false negatives because it is not possible to write all possible rules and filters to match every pattern, especially in dynamic real-time environments. Anomaly-based detection, on the other hand, tends to generate false positives because it is quite difficult to create a perfect profile of normal behavior.

The challenge in most, if not all, detection-oriented systems is finding the right balance between false positives and false negatives.  In some situations, a system should err toward false positives.  In other applications, the system should err toward false negatives.
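
As a concrete illustration of the tradeoff, here is a toy Python sketch of both styles; the signatures, traffic figures and threshold are invented, and this is nothing like a production detection system.  The signature matcher misses anything novel (false negatives), while the baseline-driven detector will sooner or later flag a benign spike (false positives).

```python
import statistics

SIGNATURES = {"known-worm-payload-v1", "known-worm-payload-v2"}  # invented patterns

def signature_detect(payload):
    """Static detection: flags only what we already know about."""
    return payload in SIGNATURES

def anomaly_detect(baseline, value, z_threshold=3.0):
    """Dynamic detection: flags values far outside the learned baseline."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline) or 1.0  # guard against a flat baseline
    return abs(value - mean) / stdev > z_threshold

normal_load = [100, 103, 98, 101, 97, 102]        # requests/sec, normal conditions
print(signature_detect("known-worm-payload-v1"))  # True: matches a known signature
print(signature_detect("novel-worm-payload"))     # False: the zero-day slips through
print(anomaly_detect(normal_load, 250))           # True: far outside the baseline
```

Tightening the z-threshold trades false positives for false negatives, and vice versa; no single setting eliminates both.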

CEP is, by definition, a technology to detect both opportunities and threats in distributed networks, in real time, so it goes without saying that CEP is challenged by the same engineering tradeoffs that affect other detection-oriented systems.

A few weeks ago, I was discussing CEP with the CTO of one of Thailand’s largest telecommunications companies, and he was very bullish on neural-based anomaly detection from Esphion.

First-generation detection systems rely on determinism, which is generally rule-based and known to be insufficient for more complex real-time problems.  Esphion uses neural agents to gather information on network activity and then creates a unifying situational infrastructure to protect against previously unknown threats.   For example, a fast-spreading threat, such as the SQL/Slammer worm, will have reached all possible targets faster than any signature can be published or rule can be written, as mentioned in Worm detection – You need to do it yourself.

Since CEP is designed and marketed as a technology that brings real-time advantages to the detection of both opportunities and threats, we must ask ourselves why none of the current CEP software vendors provide non-deterministic methods that are proven to adapt to a rapidly changing world.

In Anomaly Detection 101, Esphion does a great job of describing how they do not rely on any pre-specified rules, baselines, models, signatures, or any other a priori knowledge.   They claim, and my highly respected telecommunications CTO colleague confirms, that no prior knowledge is required and that their customers are no longer adversely affected by zero-day anomalies or changing network conditions.

What the technology behind Esphion does is what I would call complex event processing.