Marc Adler: Analytics are an Integral Part of the CEP Stack

June 29, 2008

In Recent Buyouts, Marc Adler of Citigroup blogs “Despite what the various pundits of the CEP world say, I still think that analytics are an integral part of the CEP stack.”

Marc also says something else I agree with, “… [TIBCO] Business Events [ … is …] a more workflow-oriented product, something that you would NOT use to pump Level2 quotes through and create algo apps.”

Kudos to Marc!  Very insightful. Keep on blogging!


The Infant, the Elephant and the Intelligent Event

June 27, 2008

Fellow blogger Opher Etzion replies to On Elephants and Analytics with On Unicorn, Professor and Infant.  Opher kindly gives us another metaphor to consider, the Infant and the Professor, since we are both big fans of big gentle elephants, babies and our universities.

Opher and I agree that Infants are not Professors, and we also agree that CEP is in its Infancy and there is overhype by folks often implying CEP is a Professor.     So it seems we all have a huge elephant in the room with an Infant Professor hanging on the end of a wildly swinging Elephant’s trunk!

To keep the blogopoints interesting, I should point out that with all this agreement and Kumbaya campfire singing, there are a couple of things I do disagree with in Opher’s amusing counterpoint. 

First of all, Opher uses the well-known debate technique of attributing an easily refuted position to his opponent and then offering a slam-dunk counterpoint.  He does this in this clever, but completely inaccurate, quote:

 “I [Opher] respectfully disagree with Tim … in his claim that what has been done until today is just hype and hence totally worthless…”

Folks reading my blog know that I have never said “what has been done until today is … totally worthless.”  This is an unfortunate misquote.  Shame on you, Opher!

What I said, easily read in the blog, was that CEP is overhyped and that most of the self-described CEP software on the market today does not live up to the inflated claims we read and hear from CEP software vendors and the analysts and reporters they influence.

The second counterpoint that I find interesting is Opher’s consistent attempt to redress the dramatic lack of capability and analytics in current-generation self-described CEP software by repositioning CEP as “intelligent event processing” (IEP), as he continues in On Intelligent Event Processing.

Perhaps Opher will be successful in repositioning the vast majority of the original CEP problem space as IEP.  This is an interesting slippery slope, in my opinion.  The new positioning Opher is offering is that when “event processing” has advanced analytics, it is not CEP anymore; it becomes IEP, because CEP is really “Simple Event Processing” (SEP) – event processing with little to no analytical capability.

I don’t know about most of our readers, but all this positioning and repositioning to match the capabilities, or lack of capabilities, of the current portfolio of self-described CEP software is fascinating.

Here is the next logical question:

What is the difference between a “Complex Event” and an “Intelligent Event” ?

This could get quite interesting, so stay tuned!


On Elephants and Analytics

June 26, 2008

In On EP and Analytics, good friend and respected colleague Opher Etzion applies the well-known metaphor of the big elephant to describe how, if you observe only certain specific domains of a subject, like fraud detection, your view of the whole elephant is biased by your limited perspective.

I am pleased that dear Opher continues to use this metaphor in counterpoint, because the same metaphor can be used to describe the carefully selected group of vendors that have banded together to call themselves CEP vendors.  This group, many of them founding members of the EPTS, have formed a merry band of well-intended event processing “specialists,” and the same lovely elephant causes this group of bonded colleagues to make elephant-blinded statements, as Opher does in his quoted post:

“Currently most CEP applications do not require analytics.” 

The reason, I believe, that Opher makes the statement above is that the group of software vendors calling themselves “CEP vendors” represents a very small part of the overall event processing elephant; and hence, since these self-described CEP applications appear to require little or no analytics, then, by the same logic, CEP requires no analytics.

(I should outline the boolean logic in a future post!)
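
Until that future post arrives, here is a rough sketch of the inference as I read it; the predicates are purely illustrative, and the hidden premise is exactly the elephant-sized sampling bias:

```latex
% Predicates (illustrative only):
%   S(x): x is a self-described COTS "CEP" application (the small sample)
%   E(x): x is an event processing application (the whole elephant)
%   A(x): x requires analytics
\begin{align*}
\text{Stated premise:} \quad & \forall x \,\big(S(x) \rightarrow \neg A(x)\big) \\
\text{Hidden premise:} \quad & \forall x \,\big(E(x) \rightarrow S(x)\big) \\
\text{Conclusion:}     \quad & \forall x \,\big(E(x) \rightarrow \neg A(x)\big)
\end{align*}
% The conclusion follows only with the hidden premise, and the hidden
% premise is false: most of the elephant is not a self-described CEP product.
```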

For example, one friend and colleague is the CTO of True Internet, a leading telecommunications, voice, video and Internet service provider in Thailand.  True processes myriad events on its network using a dynamic, self-learning neural networking technology.  The US company providing this very clever and highly recommended event processing application does not call itself a “CEP vendor”; however, it processes complex events better, and in more interesting ways, than the merry band of self-described “CEP players.”

Again,  visualize the gentle giant elephant metaphor that Opher likes to use as a basis for his comments in CEP counterpoint.

When folks define the term “complex event processing” to match a technology marketing campaign that is primarily driven by software running rules against time-series data streaming in sliding time windows, and then apply those same software capabilities to problems suitable for that domain, they match Opher’s elegant description of “a small view of the overall elephant.”
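
To make concrete what “running rules against time-series data in sliding time windows” typically amounts to, here is a minimal, purely illustrative Python sketch; the window size, threshold and rule are hypothetical, not any vendor’s actual product logic:

```python
# A hypothetical "rule over a sliding time window" of streaming quotes:
# the style of processing today's self-described CEP engines are built around.
from collections import deque
import time

WINDOW_SECONDS = 60     # illustrative sliding window
MOVE_THRESHOLD = 0.02   # illustrative rule: fire on a 2% move vs. the window average

window = deque()        # (timestamp, price) pairs currently inside the window

def on_quote(price, now=None):
    """Apply the rule to each incoming quote."""
    now = time.time() if now is None else now
    window.append((now, price))
    # Evict quotes that have slid out of the time window.
    while window and window[0][0] < now - WINDOW_SECONDS:
        window.popleft()
    avg = sum(p for _, p in window) / len(window)
    if avg > 0 and abs(price - avg) / avg > MOVE_THRESHOLD:
        print(f"rule fired: {price} deviates more than 2% from window average {avg:.2f}")

on_quote(100.0, now=0.0)
on_quote(105.0, now=1.0)   # about 2.4% above the window average; the rule fires
```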

The fact of the matter is that the overall market for event processing is at least two orders of magnitude larger (maybe more) than the combined annual revenue of the self-described companies marketing what they call “CEP engines.”  The very large “rest of the big elephant” is doing what is also “complex event processing” in everyday operations that are somehow overlooked in “other” analysis and counterpoint.

Therefore, I kindly remain unmoved in my view that the self-described CEP community, as currently organized, is not immune to counterpoint using the same gentle giant elephant metaphor.  I like this metaphor and hope well-respected colleagues will continue to use it, because we can easily apply this elegant manner of discussion to explain why the current group of self-described CEP vendors are, in a manner of speaking, selling Capital Market Snake Oil: they are making outrageous claims about the capabilities of their products, as if they can solve the entire “elephant” of event processing problems.  Recently, in this article, CEP was even positioned as a technology to mitigate corporate megadisasters like the subprime meltdown.

Advice:  Tone down the hype.

Furthermore, the noise in the counterarguments marginalizes most of the real event processing challenges faced by customers.

In consistent and well-respected rebuttal, Opher likes to use the “glass half-full, half-empty” metaphor.  Opher’s point is a valid attempt to paint my operational realism as “half-empty” negativism, while at the same time positioning the promotion of the (narrow) event processing capabilities of the self-described CEP rules community as “half-full” thinking.

For the record, I do not see my worldview as “half full” or “half empty,” but as an unbiased, pragmatic view based on day-to-day interaction with customers facing what they would call “complex event processing” problems.

These same customers would fall over laughing if we tried to bolt one of these rule-based, time-series streaming data processing engines onto their network and told them it could detect anything other than trivial business events, opportunities and threats, in near real-time.

Is it “half-empty” thinking to caution people that a “glass” of software being touted as the answer to a wide range of complex, tangible business problems (a recent news article even went so far as to imply CEP would have magically stopped the subprime crisis!) is not really all that it is hyped to be?

If so, then I plead guilty to honesty and realism, with the added offense of a sense of fiscal responsibility to customers and end users.


TIBCO Leaps Ahead in CEP with Insightful Acquisition

June 24, 2008

TIBCO Software shows, yet again, why the team in Palo Alto far outpaces the rest of the field with their announced acquisition of Insightful.  

Everyone who follows The CEP Blog and my vision for the business use of CEP understands how much energy and passion I have put into explaining why the crude time-series analysis of streaming data cannot possibly solve the vast majority of complex business problems CEP must address. 

TIBCO’s acquisition of Insightful shows just how serious TIBCO is about working to make the vision of “Predictive Business” a reality.    TIBCO means business, and a large part of what that means is helping customers solve their most challenging business integration problems, which can be summarized in CEP-speak as detecting opportunities and threats, in near real-time, as a core corporate competency. 

If you spend a few moments on the Insightful web site, you will find a treasure trove of documentation on advanced statistical analytics that can be used in a number of mission-critical applications.

This is the class of analytics that forms the backbone of complex event processing.  In fact, as I have often pointed out (to the dismay of some of my CEP colleagues), any software company that discusses CEP and does not support or advocate advanced analytics is selling snake oil.  TIBCO obviously understands the difference between snake oil, smoke-and-mirrors marketing, and the technology it takes to solve real operational problems.
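
To give a feel for the class of analytics I mean, here is a minimal sketch of an online statistical anomaly detector over an event stream; this is illustrative Python only (not Insightful’s actual analytics), and the three-sigma rule is an assumption made for the example:

```python
# Minimal online anomaly detection via Welford's algorithm: flag events whose
# value falls more than three standard deviations from the running mean.
# Illustrative only; real statistical analytics go far beyond this.
import math

class OnlineAnomalyDetector:
    def __init__(self, sigmas=3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0        # running sum of squared deviations from the mean
        self.sigmas = sigmas

    def observe(self, value):
        """Return True if the value is anomalous relative to history."""
        anomalous = False
        if self.n > 1:
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(value - self.mean) > self.sigmas * std:
                anomalous = True
        # Welford's incremental update of mean and variance.
        self.n += 1
        delta = value - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (value - self.mean)
        return anomalous

detector = OnlineAnomalyDetector()
for v in [10.0, 10.2, 9.9, 10.1, 10.0, 14.0]:
    print(v, detector.observe(v))   # only the final 14.0 prints True
```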

My hat is off, and warm congratulations to the team in Palo Alto, for demonstrating, yet again, that TIBCO is committed to solving real customer problems with realistic solutions.

Maybe TIBCO will evolve to mean “The Insightful Business Company” versus the tired and stale “The Information Bus Company” of yesteryear?

Disclaimer:  I have not been an employee of TIBCO for over a year. 


Capital Market CEP Fantasy Land

June 23, 2008

In Tech Spending Hit by Subprime Mess, Jeffery Schwartz says,

“According to Tabb, spending on development is being refocused on projects that can help firms improve their margins and, not surprisingly, do a better job at risk management. As such, investments in capabilities such as algorithmic trading and complex event processing (CEP) are likely to be pivotal in some firms’ efforts to become more competitive and improve their efforts at mitigating risks.”

“But for some banks that have deployed such technologies — the now-defunct Bear Stearns, Lehman Brothers, Citigroup and Merrill Lynch — the question is: How did these companies fail to mitigate the risks that have slammed their businesses if their development teams were developing and deploying sophisticated systems?

“There is definitely an awareness that perhaps the systems that existed in place to assess the value of portfolios or judge risk [are being scrutinized],” said Stevan Vidich, an industry architect in Microsoft’s financial services group.

He added that there is strong interest in CEP and other risk management methodologies. A growing number of shops have started deploying such solutions based on the .NET Framework, Vidich said, and he believes such investments will continue.

“Clearly, there’s a lot of need to deal with the immense influx of data and being able to analyze data in a timely manner,” Vidich said. “It also drives need for systems like business intelligence, or BI, applied to a near-real-time scenario, which is a very attractive proposition.”

What are these guys on Wall Street smoking? 

This is the precise “overhyping” problem I have warned about repeatedly.  Folks selling rule engines that perform basic calculations over a time window of streaming data have been marketing their wares as “superbrains” that can solve very complicated problems and, at the same time, save Wall Street and The Planet.

Let me be perfectly clear here, Wall Street.  Listen very carefully.

There is nothing in any of the so-called CEP products in the marketplace that is going to stop losses related to the subprime meltdown affecting the “now-defunct Bear Stearns, Lehman Brothers, Citigroup and Merrill Lynch,” as Jeffery Schwartz implies.

To imply that the risk management (and corporate governance) failures behind the current crisis on Wall Street could have been foreseen, solved, or even mitigated by a rules engine (or any software) is complete and absolute fantasy.

I think the fever created by the subprime flu is putting folks on Wall Street, or at least the vendors and the analysts pandering to them, in a Capital Market CEP Fantasy Land.

 


On the Maturity of CEP

May 31, 2008

Deciphering the Myths Around Complex Event Processing by Ivy Schmerken stimulated a recent flurry of blog posts about the maturity of CEP, including Mark Palmer’s CEP Myths: Mature or Not? and Opher Etzion’s On Maturity.

I agree with Ivy.  CEP is not yet a mature technology by any stretch of the imagination.  In fact, I agree with all three of Ivy’s main points about CEP.

In 1998, David C. Luckham and Brian Frasca published a paper, Complex Event Processing in Distributed Systems, on a new technology called complex event processing, or CEP.  In that seminal paper on CEP, the authors said, precisely:

“Complex event processing is a new technology for extracting information from message-based systems.”

Ten years later, there are niche players, mostly self-proclaimed CEP vendors, who do very little in the way of extracting critical, undiscovered information from message-based, or event-based, systems.

A handful of these niche players have informally redefined CEP as “performing low latency calculations across streaming market data.”  The calculations they perform are still relatively straightforward, and they focus on how to promote white-box algo trading with commercial-off-the-shelf (COTS) software.  In this domain, we might be better off not using the term CEP at all, as this appears to be simply a type of new-fangled COTS algo trading engine.
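
For a concrete sense of what “low latency calculations across streaming market data” usually amounts to in practice, consider a running volume-weighted average price (VWAP); the sketch below is hypothetical and deliberately simple:

```python
# A running VWAP: typical of the relatively straightforward calculations
# these engines perform over market data streams.  Illustrative only.
class RunningVWAP:
    def __init__(self):
        self.pv = 0.0      # cumulative price * volume
        self.volume = 0.0  # cumulative volume

    def on_trade(self, price, size):
        """Update with one trade and return the current VWAP."""
        self.pv += price * size
        self.volume += size
        return self.pv / self.volume

vwap = RunningVWAP()
print(vwap.on_trade(100.0, 200))   # 100.0
print(vwap.on_trade(101.0, 100))   # about 100.33
```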

The real domain of CEP, we thought, was in detecting complex events, sometimes referred to as situations, from your digital event-driven infrastructure – the “event soup,” for lack of a better term.  In this domain, CEP, as COTS software, is still relatively immature, and the current self-styled COTS CEP software on the market today is not yet tooled to perform complex situational analysis.
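
Detecting a situation, by contrast, means correlating heterogeneous events from across the infrastructure.  The pattern below is entirely hypothetical (invented event types, fields and thresholds), but it shows the shape of the problem:

```python
# Hypothetical situation detection over the "event soup": flag an account
# when a burst of failed logins is followed by a large transfer within
# ten minutes.  Event types, fields and thresholds are invented for illustration.
from collections import defaultdict

WINDOW = 600  # seconds

failed_logins = defaultdict(list)  # account -> timestamps of failed logins

def on_event(event):
    """Return True when the composite situation is detected."""
    account, ts = event["account"], event["ts"]
    if event["type"] == "login_failed":
        failed_logins[account].append(ts)
    elif event["type"] == "transfer" and event["amount"] > 10_000:
        recent = [t for t in failed_logins[account] if ts - t <= WINDOW]
        if len(recent) >= 3:
            return True   # situation: likely account takeover
    return False

for t in (100, 130, 160):
    on_event({"type": "login_failed", "account": "A1", "ts": t})
print(on_event({"type": "transfer", "account": "A1", "ts": 400, "amount": 50_000}))  # True
```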

This perspective naturally leads to more energy flowing in and around the blogosphere, as folks “dumb down” CEP, redefining it to suit their marketing strategies.  This causes more confusion for customers who want CEP capabilities that have zero to do with low latency, high throughput algo trading and streaming market data processing – which maybe we should call “Capital Market Event Stream Processing,” or CESP – but wait, we don’t really need more acronyms!

Hold on just a minute!  Wasn’t it just a short couple of years ago that folks were arguing that, in capital markets, it was really ESP, not CEP, remember?  Now folks are saying that it is really CEP and that CEP is mature?   

CEP is mature?  CEP is really not ESP?  CEP is really event-driven SOA?  CEP is really real-time BI?  CEP is really low latency, high throughput, white-box COTS algo trading?  CEP is really not a type of BPM?  CEP is not really for detecting complex events?  Complex does not really mean complex?

Come on guys, give us a break! 

(Anyway, no one is going to give us a break….  so stay tuned!)

  


A Vocabulary of Confusion

April 16, 2008

The blog post On Event Processing Agents reminds me of a presentation back in March 2006, where TIBCO's ex-CEP evangelist Tim Bass (now busy working for a conservative business advisory company in Asia and off the blogosphere, as we all know) presented his keynote, Processing Patterns for Predictive Business, at the first event processing symposium.

In that presentation, Tim introduced a functional event processing reference architecture based on the long-established art-and-science of multisensor data fusion (MSDF).  He also highlighted the importance of mapping business requirements for event processing to established processing analytics and engineering patterns.

In addition, Tim introduced a new slide (shown below), “A Vocabulary of Confusion,” adapted from a figure in the Handbook of Multisensor Data Fusion, overlaying the engineering components of MSDF with CEP and ESP to illustrate their notional overlap (and confusion):

One idea behind the slide above, dubbed the “snowman” by Tim, was that there is a wealth of mature and applicable knowledge from pre-existing, highly functional event processing applications spanning many years and multiple disciplines in the art-and-science of MSDF.  A few emerging event processing communities, vendors and analysts do not seem to be leveraging this art-and-science of multiple core engineering disciplines, including well-established vocabularies and event processing architectures.

On Event Processing Agents implies a “new” event processing reference architecture with terms like: (1) simple event processing agents, for filtering and routing; (2) mediated event processing agents, for event enrichment, transformation and validation; (3) complex event processing agents, for pattern detection; and (4) intelligent event processing agents, for prediction and decisions.
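
As a rough illustration of the division of labor these four terms imply, here is a minimal pipeline sketch; all event fields, rules and actions are hypothetical stand-ins:

```python
# Hypothetical pipeline of the four agent roles named above; each stage is a
# deliberately trivial stand-in for filtering, enrichment, pattern detection,
# and prediction/decision, respectively.
def simple_agent(events):
    """Simple EPA: filter and route, dropping events below a severity floor."""
    return (e for e in events if e.get("severity", 0) >= 3)

def mediated_agent(events):
    """Mediated EPA: enrich, transform, validate each event."""
    for e in events:
        e["source_class"] = "internal" if e["host"].endswith(".corp") else "external"
        yield e

def complex_agent(events):
    """Complex EPA: detect a pattern across events (two externals in a row)."""
    previous = None
    for e in events:
        if previous and previous["source_class"] == e["source_class"] == "external":
            yield {"pattern": "repeated_external", "events": [previous, e]}
        previous = e

def intelligent_agent(detections):
    """Intelligent EPA: predict and decide on each detected pattern."""
    for d in detections:
        d["action"] = "escalate"   # a real agent would score and predict here
        yield d

events = [{"severity": 4, "host": "a.example.com"},
          {"severity": 4, "host": "b.example.com"}]
for decision in intelligent_agent(complex_agent(mediated_agent(simple_agent(events)))):
    print(decision)
```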

Frankly, while I generally agree with the concepts, I think the terms in On Event Processing Agents tend to add to the confusion, because they follow, almost exactly, the same reference architecture (and terms) as MSDF, illustrated again below to aid the reader.

Unfortunately, On Event Processing Agents does not reference the prior art:

Event Processing Reference Architecture

My question is: instead of creating and advocating a seemingly “new vocabulary” and “new event processing theory,” why not leverage the excellent prior art of the past 30 years?

Why not leverage the deep (very complex) event processing knowledge of some of the top minds in the world, knowledge that is well documented and already solving some of the challenging CEP/EP problems we face today?

Why not build upon the knowledge of a mature pre-existing CEP community (a community that does not call itself CEP) that has been building successful operational event processing applications for decades?

Why not move from a seemingly “not invented here” approach to a “let’s embrace the wealth of knowledge and experience already out there” worldview?

Since March 2006, this question has remained unanswered and, in my opinion, the Vocabulary of Confusion, introduced in March 2006 at the first unofficial EPTS party, is even more relevant today.  Competition is good; new ideas are good; new perspectives are good; however, ignoring 30 years of critical prior art, and failing to leverage it, is not very good, is it?

Frankly speaking, there is more than enough CEP theory in the art-and-science of MSDF.  If we map the prior art of operational MSDF systems against existing “CEP platforms,” we will gain critical insight into just how far behind the emerging CEP/EP software vendors are in their understanding of where event processing has been and where the art-and-science is headed.

Well, enough blogging for now.  Time to get back to mundane SOA “herding cats” tasks at Techrotech, so I’ll be back Off The Grid for a while.