On Elephants and Analytics

June 26, 2008

In On EP and Analytics, good friend and respected colleague Opher Etzion applies the well-known metaphor of the big elephant to describe how, if you observe only certain specific domains of a subject, like fraud detection, your view of the whole is biased by your lack of perspective on the entire elephant.

I am pleased that dear Opher continues to use this metaphor in counterpoint, because the same metaphor can be used to describe the carefully selected group of vendors that have banded together and call themselves CEP vendors.  This group, many of them founding members of the EPTS, has formed a merry band of well-intended event processing “specialists,” and the same lovely elephant causes this group of bonded colleagues to make elephant-blinded statements, such as the one Opher made in his quoted post:

“Currently most CEP applications do not require analytics.” 

The reason, I believe, that Opher makes the statement above is that the group of software vendors calling themselves “CEP vendors” represents a very small part of the overall event processing elephant; and hence, since these self-described CEP applications appear to require little or no analytics, then, by the same logic, CEP requires no analytics.

(I should outline the boolean logic in a future post!)

For example, one friend and colleague in Thailand is the CTO of True Internet, a leading telecommunications, voice, video and Internet service provider in Thailand.  True processes myriad events on its network using a dynamic, self-learning neural networking technology.  The US company providing this very clever and highly recommended event processing application does not call itself a “CEP vendor”; however, it processes complex events better, and in more interesting ways, than the band of merry self-described “CEP players.”

Again,  visualize the gentle giant elephant metaphor that Opher likes to use as a basis for his comments in CEP counterpoint.

When folks define the term “complex event processing” to match a technology marketing campaign that is primarily driven by software running rules against time-series data streaming in sliding time windows, and then take those same software capabilities and apply them to problems suitable for that domain, they match Opher’s elegant description of “a small view of the overall elephant.”
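To make concrete what that narrow capability looks like, here is a minimal sketch of a rule evaluated over a sliding time window of streaming events. It is not modeled on any vendor's engine or API; the names (SlidingWindowRule, window_seconds, threshold) are purely illustrative assumptions.

```python
from collections import deque

# Hypothetical sketch of the narrow capability described above: one rule
# evaluated against time-series events inside a sliding time window.
# Not drawn from any vendor's product or API.
class SlidingWindowRule:
    def __init__(self, window_seconds, threshold):
        self.window_seconds = window_seconds  # width of the sliding time window
        self.threshold = threshold            # rule fires when the windowed sum exceeds this
        self.events = deque()                 # (timestamp, value) pairs currently in the window

    def on_event(self, timestamp, value):
        # Evict events that have slid out of the time window.
        while self.events and timestamp - self.events[0][0] > self.window_seconds:
            self.events.popleft()
        self.events.append((timestamp, value))
        # The "rule": a basic aggregate over the window compared to a threshold.
        windowed_sum = sum(v for _, v in self.events)
        return "ALERT" if windowed_sum > self.threshold else None

# Example: three events within one minute whose combined size trips the rule.
rule = SlidingWindowRule(window_seconds=60, threshold=1000)
for ts, size in [(0, 400), (30, 500), (45, 200)]:
    print(ts, rule.on_event(ts, size))
```

Useful, certainly, but this kind of windowed aggregation and thresholding is only one small slice of the overall event processing elephant.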

The fact of the matter is that the overall business of event processing is at least two orders of magnitude larger (maybe more) than the combined annual revenue of the self-described companies marketing what they call “CEP engines.”  The very large “rest of the big elephant” is doing what is also “complex event processing” in everyday operations that are somehow overlooked in “other” analysis and counterpoint.

Therefore, I kindly remain unmoved in my view that the self-described CEP community, as currently organized, is not immune to counterpoint using the same gentle giant elephant metaphor.  I like this metaphor and hope well-respected colleagues will continue to use it, because we can easily apply this elegant manner of discussion to explain why the current group of self-described CEP vendors is, in a manner of speaking, selling Capital Market Snake Oil: they are making outrageous claims about the capabilities of their products, as if they can solve the entire “elephant” of event processing problems.  Recently, in this article, CEP was positioned as a technology to mitigate corporate megadisasters like the subprime meltdown.

Advice:  Tone down the hype.

Furthermore, the noise in the counterarguments marginalizes most of the real event processing challenges faced by customers.

In consistent and well-respected rebuttal, Opher likes to use the “glass half-full, half-empty” metaphor.  Opher’s point is a valid attempt to paint my operational realism as “half empty” negativism, while at the same time positioning the promotion of the (narrow) event processing capabilities of the self-described CEP rules community as “half-full” thinking.

For the record, I do not see my worldview as “half full” or “half empty”; it is an unbiased, pragmatic view based on day-to-day interaction with customers with what they would call “complex event processing” problems.

These same customers would fall over laughing if we tried to bolt one of these rule-based, time-series streaming data processing engines onto their network and told them it could detect anything other than trivial business events, business opportunities and threats, in near real-time.

Is it “half empty” thinking to caution people that a “glass” of software being touted as the answer to a wide range of complex, tangible business problems (even going so far, in a recent news article, as to imply CEP would have magically stopped the subprime crisis!) is not really all that it is hyped up to be?

If so, then I plead guilty to honesty and realism, with the added offense of a sense of fiscal responsibility to customers and end users.


Capital Market CEP Fantasy Land

June 23, 2008

In Tech Spending Hit by Subprime Mess, Jeffery Schwartz says,

“According to Tabb, spending on development is being refocused on projects that can help firms improve their margins and, not surprisingly, do a better job at risk management. As such, investments in capabilities such as algorithmic trading and complex event processing (CEP) are likely to be pivotal in some firms’ efforts to become more competitive and improve their efforts at mitigating risks.”

“But for some banks that have deployed such technologies — the now-defunct Bear Stearns, Lehman Brothers, Citigroup and Merrill Lynch — the question is: How did these companies fail to mitigate the risks that have slammed their businesses if their development teams were developing and deploying sophisticated systems?

“There is definitely an awareness that perhaps the systems that existed in place to assess the value of portfolios or judge risk [are being scrutinized],” said Stevan Vidich, an industry architect in Microsoft’s financial services group.

He added that there is strong interest in CEP and other risk management methodologies. A growing number of shops have started deploying such solutions based on the .NET Framework, Vidich said, and he believes such investments will continue.

“Clearly, there’s a lot of need to deal with the immense influx of data and being able to analyze data in a timely manner,” Vidich said. “It also drives need for systems like business intelligence, or BI, applied to a near-real-time scenario, which is a very attractive proposition.”

What are these guys on Wall Street smoking? 

This is the precise “overhyping” problem I have warned about repeatedly.  Folks selling rule engines that perform basic calculations over a time window of streaming data have been marketing their wares as “superbrains” that can solve very complicated problems and, at the same time, save Wall Street and The Planet.

Let me be perfectly clear here, Wall Street.  Listen very carefully.

There is nothing in any of the so-called CEP products in the marketplace that is going to stop losses related to the subprime meltdown affecting the “now-defunct Bear Stearns, Lehman Brothers, Citigroup and Merrill Lynch,” as Jeffery Schwartz implies.

To imply that the risk management (and corporate governance) failures behind the current crisis on Wall Street could have been foreseen, solved, or even mitigated by a rules engine (or any software) is complete and absolute fantasy.

I think the fever created by the subprime flu is putting folks on Wall Street, or at least the vendors and the analysts pandering to them, in a Capital Market CEP Fantasy Land.

 


The Predictive Battlespace

June 11, 2008

Friend and colleague Don Adams, CTO, World Wide Public Sector, TIBCO Software, explains how CEP can be used to sense, adapt and respond to complex situations in The “Predictive” Battlespace: Leveraging the Power of Event-Driven Architecture in Defense.


A Page from Greg’s Diary: Nerwana Software

March 25, 2008

I started my career in IT many years ago and have worked in enterprise IT for years and years since.  Almost all of my odd career story revolves around working with end users, often advising, architecting and managing the complexity of large systems integration projects, from hands-on implementation to strategic vision development.  My deep background is with Techrotech in network systems engineering.

A few years ago, years after I started my career at Techrotech, I grew a bit dismayed at enterprise software companies.  They would, for the most part, always come to us, the end users, and try to sell us large software packages.  Their sales and technical teams had very little domain knowledge of the problems they claimed they could solve – and yet they had little doubt that if we purchased their wares, our problems would be solved.

These software companies were keen on buzzwords and technology jargon but somewhat clueless about operational solutions or the challenges of implementation across a large federated organization with many powerful business units and “in name only” CIOs.  We often referred to these software sales guys, and their favorite systems integrators, as doing “drive-by (or fly-by) implementations,” where they dump the software (and hardware) at your door and run like crazy!

So, I joined a very cool Silicon Valley company,  Nerwana Software, hoping to change all of that, or so I thought 🙂

Naturally, when I first came on board Nerwana, the entire organization, from executives to recent new hires out of school, heaped praise-upon-praise on my years of operational experience at Techrotech and elsewhere.  They cheered me on as I wrote papers and created slides on operational use cases and event processing solutions that the sales and solutions teams could take to market.  They sang my praises as I spoke to large audiences and evangelized their most innovative software and solutions.  They were pleased with the great reviews from customers.

As one would expect, I was destined to see the problems I had experienced as an end-user “outsider,” now from an “insider’s” perspective.

One of the interesting challenges that surfaced at Nerwana was the “let’s export our culture and business model to the world” mantra, maybe better described as “if it sells in New York, then we must sell it the same way in Tokyo or Beijing!”

Also, I really was surprised to find out how dependent Nerwana was on the opinions of analysts.  When I worked for the customers and end users, we rarely paid any special attention to analysts’ opinions.  Sure, analysts provide a good data point, but that is all it was (or is), simply another data point.

I soon found that software companies are often held hostage by “analyst chasing,” which really was an eye-opener for me, because we end users, the people who actually buy the software, view analysts as mere mortals reading from the same foggy crystal ball as everyone else.

Another one of the fascinating challenges I experienced at Nerwana was what some would call “The Hero Culture.”

I’ll elaborate on some of these, hopefully interesting, observations and experiences in a future Page from Greg’s Diary.


Please Welcome Dr. Rainer von Ammon to The CEP Blog

February 12, 2008

Today is an especially joyful occasion on The CEP Blog.    I am pleased to announce that one of the world’s top experts on CEP, Dr. Rainer von Ammon, has joined the blog.

Dr. Rainer von Ammon is managing director of the Centrum für Informations-Technology Transfer (CITT) in Regensburg. Until October 2005 he was Professor for Software Engineering, specializing in E-Business infrastructures and distributed systems, at the University of Applied Sciences Upper Austria. Rainer is still teaching there and at the University of Applied Sciences of Regensburg. From 1998 to 2002, he worked as Principal Consultant and Manager for R+D Cooperations at BEA Systems (Central and Eastern Europe). Prior to this, he was Professor for Software Engineering in Dresden, with a focus on the development of applications with event-driven, object-oriented user interfaces and component-based application development. Before that, Rainer was manager of the Basic Systems field at Mummert + Partner Unternehmensberatung, Hamburg. After finishing his studies of Information Sciences at the University of Regensburg, he worked as project leader for Computer Based Office Systems (COBIS) from 1978 to 1983 and afterward founded a start-up company with some of his colleagues.

Some of you may recall my recent musings, A Bitter Pill To Swallow: First Generation CEP Software Needs To Evolve.  When you read Rainer’s excellent reply, you will quickly see why we are very pleased to have his thought leadership here at The CEP Blog.  Dr. von Ammon and his team are leading experts in CEP and related business integration domains.  Not only does he provide thought leadership, but his team also researches, develops, implements and tests CEP solutions.

In another example of his thought leadership, some of you might recall the post Brandl and Guschakowski Deliver Excellent CEP/BAM Report, where Hans-Martin Brandl and David Guschakowski of the University of Applied Sciences Regensburg, Faculty of Information Technology/Mathematics, advised by Dr. von Ammon, completed an excellent CEP thesis, Complex Event Processing in the context of Business Activity Monitoring.

Please join me in extending a warm welcome for Dr. Rainer von Ammon to The CEP Blog.


Webinar: BAM: The Killer App for CEP

January 31, 2008

I just recently found out that the folks at SL have confirmed an eBiz webinar where I’ll be speaking with colleague Ted Wilson about BAM: The Killer App for CEP.


Key Indicators (KIs) Versus Key Performance Indicators (KPIs)

January 31, 2008

SL‘s new web page, Solutions for CEP Engine Users, discusses how CEP is a “technology that is used to help companies detect both opportunities and threats in real-time with minimal coding and reusable key performance indicators (KPIs) and business models.”

I agree with SL, but would like to suggest my friends at SL expand the notion of KPIs in CEP to include the idea of KIs.  In my opinion, the SL phrase should read,  “technology that is used to help companies detect both opportunities and threats in real-time with minimal coding and reusable key indicators (KIs) and business models.”  

The reason for my suggestion is that KPIs are a subset of KIs.   KIs designate, in my mind, more than just performance.  

CEP is used to detect both opportunities and threats in real-time, which may or may not be performance related.  For example, when a CEP engine detects evidence of fraudulent behavior, this is a KI.  The knowledge, or pattern, used to estimate this situation is a KI, not a KPI, per se.  Also, when a CEP application is processing market data and indicates that it is the right time to purchase an equity and enter the market, the knowledge used in this decision support application is a KI, not a KPI.
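To make the distinction concrete, here is a minimal sketch with purely illustrative names, fields and thresholds (nothing here is drawn from SL’s product or any particular CEP engine): one function computes a classic performance KPI, while the other evaluates a fraud-style key indicator that has nothing to do with performance.

```python
# Illustrative sketch only; the names, fields and thresholds are assumptions,
# not taken from SL's product or any particular CEP engine.

def average_order_latency_kpi(latencies_ms):
    """A classic KPI: a performance measure (average order latency in ms)."""
    return sum(latencies_ms) / len(latencies_ms)

def fraud_pattern_ki(transactions, amount=10_000, count=3, span_seconds=300):
    """A KI that is not a KPI: flags accounts showing a fraud-like pattern
    (several high-value transactions from the same account in a short span)."""
    times_by_account = {}
    for txn in transactions:
        if txn["amount"] >= amount:
            times_by_account.setdefault(txn["account"], []).append(txn["time"])
    return [acct for acct, times in times_by_account.items()
            if len(times) >= count and max(times) - min(times) <= span_seconds]

# Both are "key indicators" of opportunity or threat; only the first measures performance.
print(average_order_latency_kpi([120, 95, 210]))
print(fraud_pattern_ki([
    {"account": "A1", "amount": 12_000, "time": 10},
    {"account": "A1", "amount": 15_000, "time": 60},
    {"account": "A1", "amount": 11_000, "time": 120},
]))
```

The same shape of logic would apply to a market-entry signal: a key indicator of opportunity, not of performance.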

Therefore, I recommend that when folks think about the notion of “key performance indicators” (KPIs) in CEP and BAM, they also think in terms of “key indicators” (KIs).  Detecting opportunities and threats in real-time is much broader than the traditional notion of KPIs.