TIBCO Leaps Ahead in CEP with Insightful Acquisition

June 24, 2008

With its announced acquisition of Insightful, TIBCO Software shows, yet again, why the team in Palo Alto far outpaces the rest of the field.

Everyone who follows The CEP Blog and my vision for the business use of CEP understands how much energy and passion I have put into explaining why the crude time-series analysis of streaming data cannot possibly solve the vast majority of complex business problems CEP must address. 

TIBCO’s acquisition of Insightful shows just how serious TIBCO is about working to make the vision of “Predictive Business” a reality.    TIBCO means business, and a large part of what that means is helping customers solve their most challenging business integration problems, which can be summarized in CEP-speak as detecting opportunities and threats, in near real-time, as a core corporate competency. 

If you spend a few moments on the Insightful web site, you will find a treasure trove of documentation describing a gold mine of advanced statistical analytics that can be used in a number of mission-critical applications.

This is the class of analytics that forms the backbone of complex event processing.  In fact, as I have often pointed out (to the dismay of some of my CEP colleagues), any software company that discusses CEP but does not support or advocate advanced analytics is selling snake oil.  TIBCO obviously understands the difference between snake oil, smoke-and-mirrors marketing, and the technology it takes to solve real operational problems.

My hat is off, and warm congratulations, to the team in Palo Alto for demonstrating, yet again, that TIBCO is committed to solving real customer problems with realistic solutions.

Maybe TIBCO will evolve to mean “The Insightful Business Company” versus the tired and stale “The Information Bus Company” of yesteryear?

Disclaimer:  I have not been an employee of TIBCO for over a year. 


Capital Market CEP Fantasy Land

June 23, 2008

In Tech Spending Hit by Subprime Mess, Jeffery Schwartz says,

“According to Tabb, spending on development is being refocused on projects that can help firms improve their margins and, not surprisingly, do a better job at risk management. As such, investments in capabilities such as algorithmic trading and complex event processing (CEP) are likely to be pivotal in some firms’ efforts to become more competitive and improve their efforts at mitigating risks.”

“But for some banks that have deployed such technologies — the now-defunct Bear Stearns, Lehman Brothers, Citigroup and Merrill Lynch — the question is: How did these companies fail to mitigate the risks that have slammed their businesses if their development teams were developing and deploying sophisticated systems?

“There is definitely an awareness that perhaps the systems that existed in place to assess the value of portfolios or judge risk [are being scrutinized],” said Stevan Vidich, an industry architect in Microsoft’s financial services group. “

He added that there is strong interest in CEP and other risk management methodologies. A growing number of shops have started deploying such solutions based on the .NET Framework, Vidich said, and he believes such investments will continue.

“Clearly, there’s a lot of need to deal with the immense influx of data and being able to analyze data in a timely manner,” Vidich said. “It also drives need for systems like business intelligence, or BI, applied to a near-real-time scenario, which is a very attractive proposition.”

What are these guys on Wall Street smoking? 

This is precisely the “over-hyping” problem I have warned about repeatedly.  Folks selling rule engines that perform basic calculations over a time window of streaming data have been marketing their wares as “superbrains” that can solve very complicated problems and, at the same time, save Wall Street and The Planet.

Let me be perfectly clear here, Wall Street.  Listen very carefully.

There is nothing in any of the so-called CEP products in the marketplace that is going to stop losses related to the subprime meltdown affecting the “now-defunct Bear Stearns, Lehman Brothers, Citigroup and Merrill Lynch,” as Jeffery Schwartz implies.

To imply that the risk management (and corporate governance) needed to head off the current crisis on Wall Street could have been foreseen, solved, or even mitigated by a rules engine (or any software) is complete and absolute fantasy.

I think the fever created by the subprime flu is putting folks on Wall Street, or at least the vendors and the analysts pandering to them, in a Capital Market CEP Fantasy Land.

Algo Trading on Your BlackBerry?

June 5, 2008

In E*Trade (ETFC) seizes control of the BlackBerry, Douglas A. McIntyre asks “How long will it be before the Blackberry and other devices will have enough intelligence to trade stocks without their owner’s permission?”

Check out E*Trade’s Mobile*Pro, where customers can view all their accounts and watch lists; trade stocks and options; view streaming quotes, orders, and breaking news; monitor their portfolios and view account alerts; and stay on top of the markets as well as transfer money between accounts – all from their BlackBerry!

I can’t wait to run a next-generation algo trading engine on my BlackBerry, or perhaps configure my personal algo engine embedded in my E*Trade account!

Check out the Mobile*Pro demo.


On the Maturity of CEP

May 31, 2008

Deciphering the Myths Around Complex Event Processing by Ivy Schmerken stimulated a recent flurry of blog posts about the maturity of CEP, including Mark Palmer’s CEP Myths: Mature or Not? and Opher Etzion’s On Maturity.

I agree with Ivy.  CEP is not yet a mature technology by any stretch of the imagination.  In fact, I agree with all three of Ivy’s main points about CEP.

In 1998, David C. Luckham and Brian Frasca published a paper, Complex Event Processing in Distributed Systems, on a new technology called complex event processing, or CEP.  In that seminal paper, the authors said, precisely:

“Complex event processing is a new technology for extracting information from message-based systems.”

Ten years later, there are niche players, mostly self-proclaimed CEP vendors, who do very little in the way of extracting critical, undiscovered information from message-based, or event-based, systems.

A handful of these niche players have informally redefined CEP as “performing low latency calculations across streaming market data.”  The calculations they perform are still relatively straightforward, and their focus is on promoting white-box algo trading with commercial-off-the-shelf (COTS) software.  In this domain, we might be better off not using the term CEP at all, as this appears to be simply a new-fangled type of COTS algo trading engine.

The real domain of CEP, we thought, was in detecting complex events, sometimes referred to as situations, from your digital event-driven infrastructure – the “event soup,” for lack of a better term.  In this domain, CEP, as COTS software, is still relatively immature, and the current self-styled COTS CEP software on the market today is not yet tooled to perform complex situational analysis.

This perspective naturally leads to more energy flowing in and around the blogosphere, as folks “dumb down” CEP, redefining it to suit their marketing strategies.  That causes more confusion for customers who want CEP capabilities that have nothing to do with low-latency, high-throughput algo trading over streaming market data.  Maybe we should call that “Capital Market Event Stream Processing,” or CESP – but wait, we don’t really need more acronyms!

Hold on just a minute!  Wasn’t it just a couple of short years ago that folks were arguing that, in capital markets, it was really ESP, not CEP?  Remember?  Now folks are saying that it is really CEP, and that CEP is mature?

CEP is mature?  CEP is really not ESP?  CEP is really event-driven SOA?  CEP is really real-time BI?  CEP is really low-latency, high-throughput, white-box COTS algo trading?  CEP is really not a type of BPM?  CEP is not really for detecting complex events?  Complex does not really mean complex?

Come on guys, give us a break! 

(Anyway, no one is going to give us a break… so stay tuned!)

Clouding and Confusing the CEP Community

April 20, 2008

Ironically, our favorite software vendors have decided, in a nutshell, to redefine Dr. David Luckham’s definition of “event cloud” to match the lack of capabilities in their products.

This is really funny, if you think about it. 

The definition of “event cloud” was coordinated over a long period (more than two years) with the leading vendors in the event processing community and is based on the same concepts in David’s book, The Power of Events.

But, since the stream-processing-oriented vendors do not yet have the analytical capability to discover unknown causal relationships in contextually complex data sets, they have chosen to reduce and redefine the term “event cloud” to match their products’ lack of capability.  Why not simply admit that they can only process a subdomain of the CEP space as defined by both Dr. Luckham and the CEP community at large?

What’s the big deal?   Stream processing is a perfectly respectable profession!

David, along with the “event processing community,” defined the term “event cloud” as follows:

Event cloud: a partially ordered set of events (poset), either bounded or unbounded, where the partial orderings are imposed by the causal, timing and other relationships between the events.

Notes: Typically an event cloud is created by the events produced by one or more distributed systems. An event cloud may contain many event types, event streams and event channels. The difference between a cloud and a stream is that there is no event relationship that totally orders the events in a cloud. A stream is a cloud, but the converse is not necessarily true.

Note: CEP usually refers to event processing that assumes an event cloud as input, and thereby can make no assumptions about the arrival order of events.
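
To make the cloud-versus-stream distinction concrete, here is a minimal Python sketch (the events and causal links are hypothetical, invented purely for illustration) that checks whether a set of events with known precedence relationships is totally ordered.  If every pair of events is comparable, the set can be treated as a stream; if not, it is a cloud.

```python
from itertools import combinations

# Each event is a dict; "causes" lists the ids of events known to precede it.
# These events and causal links are hypothetical, purely for illustration.
events = {
    "e1": {"causes": []},
    "e2": {"causes": ["e1"]},   # e1 happened-before e2
    "e3": {"causes": ["e1"]},   # e1 happened-before e3; e2 and e3 are unordered
}

def precedes(a, b, events):
    """True if event a is known (transitively) to precede event b."""
    frontier = list(events[b]["causes"])
    seen = set()
    while frontier:
        current = frontier.pop()
        if current == a:
            return True
        if current not in seen:
            seen.add(current)
            frontier.extend(events[current]["causes"])
    return False

def is_stream(events):
    """A stream is totally ordered: every pair of events is comparable."""
    return all(precedes(a, b, events) or precedes(b, a, events)
               for a, b in combinations(events, 2))

print(is_stream(events))  # False: e2 and e3 are incomparable, so this set is a cloud
```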

Oddly enough, quite a few event processing vendors seem to have succeeded at confusing their customers, as is evident in this post, Abstracting the CEP Engine, where a customer has seemingly been convinced by the (disinformational) marketing pitches – “there are no clouds of events, only ordered streams.”

I think the problem is that folks are not comfortable with uncertainty and hidden causal relationships, so they give the standard “let’s run a calculation over a stream” example and state “that is all there is…” confusing the customers who know there is more to solving complex event processing problems.

So, let’s make this simple (we hope), referencing the invited keynote at DEBS 2007, Mythbusters: Event Stream Processing Versus Complex Event Processing.

In a nutshell…. (these examples are in the PDF above, BTW)

The set of market data from Citigroup (C) is an example of multiple “event streams.”

The set of all events that influence the NASDAQ is an “event cloud”.

Why?

Because a stream of market data is a linearly ordered set of data, related by the timestamp of each transaction and linked (relatively speaking) in context because it is Citigroup market data.  So, event processing software can process a stream of market data, perform a VWAP if it chooses, and estimate a good time to enter or exit the market.  This is “good”.
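
For readers who want to see what that kind of “good” stream computation looks like, here is a minimal sketch of a VWAP (volume-weighted average price) over a trailing time window, assuming a hypothetical list of timestamped trade ticks.  It is illustrative only, not a description of how any particular vendor’s engine works.

```python
from collections import deque

def sliding_vwap(ticks, window_seconds=60):
    """Yield (timestamp, vwap) for each tick over a trailing time window.

    Each tick is a (timestamp, price, volume) tuple, assumed to arrive in
    timestamp order, i.e. a totally ordered event stream.
    """
    window = deque()   # ticks currently inside the trailing window
    pv_sum = 0.0       # running sum of price * volume
    vol_sum = 0.0      # running sum of volume

    for ts, price, volume in ticks:
        window.append((ts, price, volume))
        pv_sum += price * volume
        vol_sum += volume

        # Evict ticks that are older than the trailing window.
        while window and ts - window[0][0] > window_seconds:
            _, old_price, old_volume = window.popleft()
            pv_sum -= old_price * old_volume
            vol_sum -= old_volume

        yield ts, (pv_sum / vol_sum if vol_sum else None)

# Hypothetical ticks: (seconds, price, shares) for a single symbol, e.g. C.
ticks = [(0, 20.10, 500), (30, 20.15, 200), (90, 20.05, 800)]
for ts, vwap in sliding_vwap(ticks, window_seconds=60):
    print(ts, round(vwap, 4))
```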

However, the same software, at this point in time, cannot process the many market data feeds in NASDAQ and provide a reasonable estimate of why the market moved in a certain direction, based on a statistical analysis of a large set of event data where the cause-and-effect features (in this case, relationships) are difficult to extract.  (BTW, this is generally called “feature extraction” in the scientific community.)

Why?

Because the current state of the art in stream-processing-oriented event processing software does not perform the required backwards chaining to infer causality from large sets of data where causality is unknown, undiscovered and uncertain.

Forward chaining, continuous query, time series analytics across sliding time windows of streaming data can only perform a subset of the overall CEP domain as defined by Dr. Luckham et al.
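
To illustrate that subset, here is a toy sketch of a forward-chaining, continuous-query style rule over sliding windows of streaming prices: the rule fires whenever a short moving average drifts more than a threshold away from a long one.  The window sizes, the threshold and the tick data are all made-up, illustrative values, not any vendor’s implementation.

```python
from collections import deque

def crossover_alerts(ticks, short_n=5, long_n=20, threshold=0.01):
    """Forward-chaining style rule over a stream of (timestamp, price) ticks.

    The rule "fires" whenever the short moving average deviates from the
    long moving average by more than `threshold` (as a fraction of the long
    average).  All parameters are arbitrary, illustrative values.
    """
    short_win = deque(maxlen=short_n)
    long_win = deque(maxlen=long_n)
    for ts, price in ticks:
        short_win.append(price)
        long_win.append(price)
        if len(long_win) == long_n:              # wait until both windows are full
            short_ma = sum(short_win) / len(short_win)
            long_ma = sum(long_win) / len(long_win)
            if abs(short_ma - long_ma) / long_ma > threshold:
                yield ts, short_ma, long_ma      # the rule fires: emit an alert

# Hypothetical tick data: a flat price with a small jump partway through.
ticks = [(i, 20.0 + (0.3 if i > 15 else 0.0)) for i in range(30)]
for alert in crossover_alerts(ticks):
    print("alert:", alert)
```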

It is really that simple.   Why cloud and confuse the community?

We like forward chaining using continuous queries and time series analysis across sliding time windows of streaming data. 

  • There is nothing dishonorable about forward chaining using continuous queries and time series analysis across sliding time windows of streaming data.   
  • There is nothing wrong with forward chaining using continuous queries and time series analysis across sliding time windows of streaming data. 
  • There is nothing embarrassing about forward chaining using continuous queries and time series analysis across sliding time windows of streaming data. 

Forward chaining using continuous queries and time series analysis across sliding time windows of streaming data is a subset of the CEP space, just like the definition above, repeated below:

The difference between a cloud and a stream is that there is no event relationship that totally orders the events in a cloud. A stream is a cloud, but the converse is not necessarily true.

It is really simple.   Why cloud a concept so simple and so accurate?


On Time-Series Analysis with Strict Determinism

March 29, 2008

Like the predictable ebb and flow of ocean tides, we see the rise, fall and passing away of lively debates about event processing languages (EPLs).  For example, you might recall that Louis Lovas of Progress Apama did an excellent job in his post, Bending the Nail, where he described why SQL, or extended SQL, is not the optimal EPL for event processing.

A few of us in the “SQL is not necessarily the best EPL” choir started singing with Louis, which motivated a counter voice to the choir in the post Fair and unfair criticism of an SQL EP approach, only to have the same author counter that post with One down side to an SQL EP approach.

Many technologists, including some of my team members at Techrotech, enjoy focusing on linear event processing problems with strict determinism – for example, processing a stream of market data and looking for opportunities to enter or exit the market (algo trading).  These same technologists tend to champion event processing problems that are basic transformations of simple streams of time-series data.

Many of the so-called CEP cybertrading examples (I would argue that these are simple event processing examples, not complex event processing) are not rooted in event processing scenarios that require looking for causal linkages between seemingly unrelated events – for example, debugging complex distributed systems, or detecting fraud over long periods of time, where sliding time windows on continuous streaming data are only a part of the solution in the uncertain world of cloudy event-causality relationships.

Time-series analysis with strict determinism is interesting, but I would not go so far as to call this processing “complex event processing” relative to the myriad challenging complex problems in the real world.


Please Welcome Dr. Rainer von Ammon to The CEP Blog

February 12, 2008

Today is an especially joyful occasion on The CEP Blog.    I am pleased to announce that one of the world’s top experts on CEP, Dr. Rainer von Ammon, has joined the blog.

Dr. Rainer von Ammon is managing director of the Centrum für Informations-Technology Transfer (CITT) in Regensburg. Until October 2005 he was Professor for Software Engineering, specializing in E-Business infrastructures and distributed systems, at the University of Applied Sciences Upper Austria, where he still teaches, as he does at the University of Applied Sciences of Regensburg. From 1998 to 2002, he worked as Principal Consultant and Manager for R+D Cooperations at BEA Systems (Central and Eastern Europe). Prior to that, he was Professor for Software Engineering in Dresden, focusing on the development of applications with event-driven, object-oriented user interfaces and on component-based application development. Before that, he managed the Basic Systems group at Mummert + Partner Unternehmensberatung in Hamburg. After finishing his studies in Information Sciences at the University of Regensburg, he was project leader for Computer Based Office Systems (COBIS) from 1978 to 1983 and afterward founded a start-up company with some of his colleagues.

Some of you may recall my recent musings, A Bitter Pill To Swallow: First Generation CEP Software Needs To Evolve.  When you read Rainer’s excellent reply, you will quickly see why we are very pleased to have his thought leadership here at The CEP Blog.  Dr. von Ammon and his team are leading experts in CEP and related business integration domains.  Not only does he provide thought leadership; his team also researches, develops, implements and tests CEP solutions.

In another example of his thought leadership, some of you might recall the post Brandl and Guschakowski Deliver Excellent CEP/BAM Report, where Hans-Martin Brandl and David Guschakowski of the University of Applied Sciences Regensburg, Faculty of Information Technology/Mathematics, advised by Dr. von Ammon, completed an excellent CEP thesis, Complex Event Processing in the context of Business Activity Monitoring.

Please join me in extending a warm welcome for Dr. Rainer von Ammon to The CEP Blog.