Marc Adler: Analytics are an Integral Part of the CEP Stack

June 29, 2008

In his post Recent Buyouts, Marc Adler of Citigroup blogs: “Despite what the various pundits of the CEP world say, I still think that analytics are an integral part of the CEP stack.”

Marc also says something else I agree with: “… [TIBCO] Business Events [… is …] a more workflow-oriented product, something that you would NOT use to pump Level2 quotes through and create algo apps.”

Kudos to Marc!  Very insightful. Keep on blogging!


ICT Cmte: Thailand’s Cyber Law Compliance Seminar

June 12, 2008


American Chamber of Commerce in Thailand

Date & Time: 17-Jun-2008

Details: This month you are invited to attend a Computer Crime Act Compliance Seminar. Find out what the Thai “Cyber Law” requires, when it will start to be enforced, and how you can comply. If your business or hotel offers Internet access to customers, employees or end users, this will be a practical session for you to gain a better understanding of the Thai Computer Crime Act.


The Predictive Battlespace

June 11, 2008

Friend and colleague Don Adams, CTO World Wide Public Sector, TIBCO Software, explains how CEP can be used to sense, adapt and respond to complex situations in The “Predictive” Battlespace: Leveraging the Power of Event-Driven Architecture in Defense.


Clouding and Confusing the CEP Community

April 20, 2008

Ironically, our favorite software vendors have decided, in a nutshell, to redefine Dr. David Luckham’s definition of “event cloud” to match the lack of capabilities in their products.

This is really funny, if you think about it. 

The definition of “event cloud” was coordinated over a long period (more than two years) with the leading vendors in the event processing community and is based on the same concepts in David’s book, The Power of Events.

But, since the stream-processing-oriented vendors do not yet have the analytical capability to discover unknown causal relationships in contextually complex data sets, they have chosen to reduce and redefine the term “event cloud” to match their products’ lack of capability.  Why not simply admit they can only process a subdomain of the CEP space as defined by both Dr. Luckham and the CEP community-at-large?

What’s the big deal?   Stream processing is a perfectly respectable profession!

David, along with the “event processing community,” defined the term “event cloud” as follows:

Event cloud: a partially ordered set of events (poset), either bounded or unbounded, where the partial orderings are imposed by the causal, timing and other relationships between the events.

Notes: Typically an event cloud is created by the events produced by one or more distributed systems. An event cloud may contain many event types, event streams and event channels. The difference between a cloud and a stream is that there is no event relationship that totally orders the events in a cloud. A stream is a cloud, but the converse is not necessarily true.

Note: CEP usually refers to event processing that assumes an event cloud as input, and thereby can make no assumptions about the arrival order of events.
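
To make the poset distinction concrete, here is a minimal Python sketch; the Event records, tick values and ordering relations below are invented for illustration, not taken from any vendor’s product.  It tests whether an ordering relation relates every pair of events: if it does, the set is totally ordered and we have a stream; if some pair is incomparable, we have a cloud.

    from dataclasses import dataclass
    from itertools import combinations

    @dataclass(frozen=True)
    class Event:
        event_id: str
        payload: str

    def comparable(a, b, ordering):
        # Related, directly and in either direction, by the given ordering.
        return (a, b) in ordering or (b, a) in ordering

    def is_stream(events, ordering):
        # A stream is the special case of a cloud in which the ordering
        # (e.g., timestamps on a single feed) relates EVERY pair of events.
        return all(comparable(a, b, ordering)
                   for a, b in combinations(events, 2))

    # Two ticks on one feed, ordered by arrival time: a stream.
    e1 = Event("C-tick-1", "C 55.10")
    e2 = Event("C-tick-2", "C 55.12")
    print(is_stream([e1, e2], {(e1, e2)}))                # True

    # Add an event from another distributed source with no known causal
    # or timing relation to e2: only partially ordered, hence a cloud.
    e3 = Event("news-1", "regulatory announcement")
    print(is_stream([e1, e2, e3], {(e1, e2), (e1, e3)}))  # False

Note how this matches the definition above: the stream case is still a cloud, but the cloud with the incomparable pair (e2, e3) is not a stream.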

Oddly enough, quite a few event processing vendors seem to have succeeded at confusing their customers, as evidenced in this post, Abstracting the CEP Engine, where a customer has seemingly been convinced by the (disinformational) marketing pitch: “there are no clouds of events, only ordered streams.”

I think the problem is that folks are not comfortable with uncertainty and hidden causal relationships, so they give the standard “let’s run a calculation over a stream” example and state “that is all there is,” confusing the customers who know there is more to solving complex event processing problems.

So, let’s make this simple (we hope), referencing the invited keynote at DEBS 2007, Mythbusters: Event Stream Processing Versus Complex Event Processing.

In a nutshell (these examples are in the PDF above, BTW):

The set of market data from Citigroup (C) is an example of multiple “event streams.”

The set of all events that influence the NASDAQ is an “event cloud”.

Why?

Because a stream of market data is a linearly ordered set of data, related by the timestamp of each transaction and linked (relatively speaking) in context because it is Citigroup market data.  So, event processing software can process a stream of market data, compute a VWAP if it chooses, and estimate a good time to enter and exit the market.  This is “good”.
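
As a concrete illustration of that “good” case, a volume-weighted average price (VWAP) over a sliding window of a single time-ordered feed takes only a few lines.  This is a sketch with invented tick data, not any vendor’s engine or API:

    from collections import deque

    def vwap(trades):
        # Volume-weighted average price over (price, volume) pairs.
        total_pv = sum(price * volume for price, volume in trades)
        total_v = sum(volume for _, volume in trades)
        return total_pv / total_v if total_v else None

    # A sliding window over one linearly ordered feed of hypothetical
    # Citigroup (C) ticks: exactly the kind of computation that
    # stream-oriented engines handle well.
    window = deque(maxlen=3)  # keep the last 3 trades
    for tick in [(55.10, 200), (55.12, 100), (55.08, 400), (55.15, 300)]:
        window.append(tick)
        print(f"VWAP over last {len(window)} trades: {vwap(window):.4f}")

The point is that the timestamp ordering of the single feed does all the hard work; no causal relationship has to be discovered.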

However, the same software, at this point in time, cannot process the many market data feeds of the NASDAQ and provide a reasonable estimate of why the market moved in a certain direction, based on a statistical analysis of a large set of event data where the cause-and-effect features (in this case, relationships) are difficult to extract.  (BTW, this is generally called “feature extraction” in the scientific community.)

Why?

Because the current state of the art in stream-processing-oriented event processing software does not perform the backward chaining required to infer causality from large sets of data where causality is unknown, undiscovered and uncertain.

Forward chaining, continuous queries and time series analytics across sliding time windows of streaming data can address only a subset of the overall CEP domain as defined by Dr. Luckham et al.
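
To illustrate the difference in direction, here is a toy backward-chaining sketch in Python.  The rule names and events are hypothetical, and real causal inference over an event cloud also involves statistics and uncertainty that this deliberately omits; the sketch only shows reasoning from a hypothesized effect back to possible causes, rather than forward from arriving events to firing rules:

    # Each conclusion maps to alternative sets of premises that support it.
    RULES = {
        "index_drop":     [{"sector_selloff", "liquidity_thin"}],
        "sector_selloff": [{"negative_news"}],
    }

    def backward_chain(goal, facts, rules):
        # Work backward from a hypothesized effect to the observed facts
        # that could explain it; True if some rule's premises all hold.
        if goal in facts:
            return True
        return any(
            all(backward_chain(p, facts, rules) for p in premises)
            for premises in rules.get(goal, [])
        )

    # Forward chaining asks: "these events arrived; which rules fire?"
    # Backward chaining asks: "the index dropped; what chain of (possibly
    # unobserved) causes could explain it?"
    observed = {"negative_news", "liquidity_thin"}
    print(backward_chain("index_drop", observed, RULES))  # True

In this framing, a forward-chaining continuous query never poses the second question: it only reacts to the events arriving in its window.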

It is really that simple.   Why cloud and confuse the community?

We like forward chaining using continuous queries and time series analysis across sliding time windows of streaming data. 

  • There is nothing dishonorable about forward chaining using continuous queries and time series analysis across sliding time windows of streaming data.   
  • There is nothing wrong with forward chaining using continuous queries and time series analysis across sliding time windows of streaming data. 
  • There is nothing embarrassing about forward chaining using continuous queries and time series analysis across sliding time windows of streaming data. 

Forward chaining using continuous queries and time series analysis across sliding time windows of streaming data is a subset of the CEP space, just like the definition above, repeated below:

The difference between a cloud and a stream is that there is no event relationship that totally orders the events in a cloud. A stream is a cloud, but the converse is not necessarily true.

It is really simple.   Why cloud a concept so simple and so accurate?


Threats to the Democratic Process

April 13, 2008

Our readers might recall this post by Tim Bass, The Top Ten Cybersecurity Threats for 2008.  One of the top ten threats to cybersecurity in 2008, according to that post, was:

    — Subversion of democratic political processes.

Interestingly enough, Electoral-Vote.com, a site maintained by Dr. Andrew Tanenbaum, Professor of Computer Science at the Vrije Universiteit in Amsterdam, reports (with a link to a news story) that the US presidential election can be hacked.

This is not science fiction folks, it is simply the political and social realities of our brave new electronic world.


Please Welcome Dr. Rainer von Ammon to The CEP Blog

February 12, 2008

Today is an especially joyful occasion on The CEP Blog.    I am pleased to announce that one of the world’s top experts on CEP, Dr. Rainer von Ammon, has joined the blog.

Dr. Rainer von Ammon is managing director of the Centrum für Informations-Technology Transfer (CITT) in Regensburg. Until October 2005 he was Professor for Software Engineering, specializing in E-Business infrastructures and distributed systems, at the University of Applied Sciences Upper Austria; he still teaches there and at the University of Applied Sciences of Regensburg. From 1998 to 2002, he worked as Principal Consultant and Manager for R+D Cooperations at BEA Systems (Central and Eastern Europe). Prior to this, he was Professor for Software Engineering in Dresden, with a focus on the development of applications with event-driven, object-oriented user interfaces and component-based application development. Before that, Rainer was manager of the Basic Systems area at Mummert + Partner Unternehmensberatung, Hamburg. After finishing his studies of Information Sciences at the University of Regensburg, he was project leader for Computer Based Office Systems (COBIS) from 1978 to 1983 and afterward founded a start-up company with some of his colleagues.

Some of you may recall my recent musings, A Bitter Pill To Swallow: First Generation CEP Software Needs To Evolve.  When you read Rainer’s excellent reply, you will quickly see why we are very pleased to have his thought leadership here at The CEP Blog.  Dr. von Ammon and his team are leading experts in CEP and related business integration domains.  Not only does he provide thought leadership; his team also researches, develops, implements and tests CEP solutions.

In another example of his thought leadership, some of you might recall this post, Brandl and Guschakowski Deliver Excellent CEP/BAM Report, where Hans-Martin Brandl and David Guschakowski of the University of Applied Sciences Regensburg, Faculty of Information Technology/Mathematics, advised by Dr. von Ammon, completed an excellent CEP thesis, Complex Event Processing in the context of Business Activity Monitoring.

Please join me in extending a warm welcome for Dr. Rainer von Ammon to The CEP Blog.


A Bitter Pill To Swallow: First Generation CEP Software Needs To Evolve

February 8, 2008

Frankly speaking, the CEP market is now saturated with hype about all the great things CEP can do: detecting opportunities and threats in real time and supporting the decision cycle.  However, in my opinion, it is time for the software vendors and analysts to move beyond the marketing hype and demonstrate real operational value with strong end user success, something seriously lacking today.

I have advocated this evolution for two years, including the notion of expanding CEP capabilities with proven techniques for event processing that worked well long before the current “not yet CEP but called CEP” software hit the marketplace and airwaves.

For example, in my first CEP/EP presentation in New York in 1Q 2006, I presented Processing Patterns for Predictive Business and talked about how the US military has implemented high-performance detection-oriented systems for many years (in the art-and-science of multisensor data fusion, MSDF), and how every day, when we sit at home (or at work or in transit), we are comforted to know we are safe from missile attacks because of what I would also call “complex event processing.”   There is a very rich history of “CEP but not called CEP” behind the scenes keeping people safe and warm. (The same can be said of many similar examples of complex event processing in use today but not called “CEP” by CEP software vendors.)

This is one reason why, when I read the “CEP history lessons,” I am amused at how, at times, the lessons appear self-serving rather than end-user serving.  There is so much rich event processing history, and there are so many proven architectures, in “CEP but not called CEP” (CEP that actually works, in practice, every day, long before it was called CEP).  It continues to puzzle me that a few people in the CEP/EP community continue to take the “we invented EP” view.  Quite frankly, the history we read is missing most, if not all, of the history and practice of MSDF.

When we take the current CEP COTS software offerings and apply them to these working “CEP but not called CEP” applications, the folks with real operational “CEP but not called CEP” detection-oriented experience quickly cut through the hype because, based on their state of the practice, they are now seeking self-learning, self-healing “real CEP type” systems.  They are not so excited about first generation technologies full of promises from software vendors with only a few years of experience in solving detection-oriented problems and very few real success stories.

The same is true for advanced fraud detection and other state-of-the-art detection-oriented processing of “complex events” and situations.  The state-of-the-art of complex event processing, in practice, is far beyond the first generation CEP engines on the market today. 

This is one of the reasons I have agreed with the IBM folks who call these first generation “CEP orchestration engines” BEP engines, because that view is closer to fact than fiction.  Frankly speaking again, process orchestration is much easier than complex detection with high situation-detection confidence and low false alarm rates.

Customers who are detection-savvy also know this, and I have blogged about a few of these meetings and customer concerns.  For example, please read my blog entry about a banker who was very skeptical at a recent wealth management conference in Bangkok.  I see this reaction all the time, in practice.

Complex problems are not new, and they still cry out for solutions.  Furthermore, many current-generation event processing solutions are already more advanced than the first generation CEP engines on the “call it CEP” market today.  This is a very real inhibitor, in my opinion, to growth in the “call it CEP” software space today, and credibility may ultimately be “at risk.”  Caution is advised.

Candidly speaking again, there are too many red-herring CEP-related discussions and not enough solid results, given the time software vendors have been promoting CEP/EP (again, this is simply my opinion).  The market is in danger of eventually losing credibility, at least in the circles I travel and the complex problems I enjoy solving, because the capabilities of the (so-called) CEP technologies by software vendors in the (so-called) CEP space have been oversold; and, frankly speaking, I have yet to see tangible proof of “real CEP capabilities” in the road maps and plans of the current CEP software vendors.  This is disappointing.

This pill is bitter and difficult to swallow, but most of my life’s work has been advising on, formulating and architecting real-time solutions for the end user (the C-level executives and the operational experts with the complex problems to solve).  CEP software must evolve, and there need to be more tangible results, not more marketing hype.