A Bitter Pill To Swallow: First Generation CEP Software Needs To Evolve

February 8, 2008

Frankly speaking, the CEP market is now saturated with hype about all the great things CEP can do, detecting opportunities and threats in real time and supporting the decision cycle.  However, in my opinion, it is time for the software vendors and analysts to move beyond the marketing hype and demonstrate real operational value with strong end user success, something seriously lacking today.

I have advocated this evolution for two years, including the notion of expanding CEP capabilities with proven event processing techniques that worked well long before the current “Not yet CEP but called CEP” software hit the marketplace and the airwaves.

For example, in my first CEP/EP presentation in New York in 1Q 2006, I presented Processing Patterns for Predictive Business and talked about how the US military has implemented high performance detection-oriented systems for many years (in the art-and-science of multisensor data fusion, MSDF), and how every day, when we sit at home (or at work, or in transit), we are comforted to know we are safe from missile attacks because of what I would also call “complex event processing.”  There is a very rich history of “CEP but not called CEP” behind the scenes keeping people safe and warm.  (The same can be said of many similar examples of complex event processing in use today that are not called “CEP” by CEP software vendors.)

This is one reason why, when I read the “CEP history lessons,” I am amused at how, at times, the lessons appear self-serving rather than end-user serving.  There is so much rich event processing history, and there are so many proven architectures, in “CEP but not called CEP” (CEP that actually works, in practice, every day, and did so long before it was called CEP).  It continues to puzzle me that a few people in the CEP/EP community continue to take the “we invented EP” view.  Quite frankly, the history we read is missing most, if not all, of the history and practice of MSDF.

When we take the current CEP COTS software offerings and apply them to these working “CEP but not called CEP” applications, the folks with real operational “CEP but not called CEP” detection-oriented experience quickly cut through the hype because, based on their state-of-the-practice, they are now seeking self-learning, self-healing, “real CEP type” systems.  They are not so excited about first generation technologies full of promises from software vendors with only a few years of experience in solving detection-oriented problems and very few real success stories.

The same is true for advanced fraud detection and other state-of-the-art detection-oriented processing of “complex events” and situations.  The state-of-the-art of complex event processing, in practice, is far beyond the first generation CEP engines on the market today. 

This is one of the reasons I have agreed with the IBM folks who call these first generation “CEP orchestration engines” BEP engines, because that view is closer to fact than fiction.  Frankly speaking again, process orchestration is much easier than complex detection with high situation-detection confidence and low false-alarm rates.

Customers who are detection-savvy also know this, and I have blogged about a few of these meetings and customer concerns.  For example, please read my blog entry about a banker who was very sceptical at a recent wealth management conference in Bangkok.  I see this reaction all the time, in practice.

Complex problems are not new, and they still cry out for solutions.  Furthermore, many current-generation event processing solutions are already more advanced than the first generation CEP engines on the “call it CEP” market today.  This is a very real inhibitor, in my opinion, to growth in the “call it CEP” software space today – and credibility may ultimately be “at risk.”  Caution is advised.

Candidly speaking again, there are too many red-herring CEP-related discussions and not enough solid results, given the time software vendors have been promoting CEP/EP (again, this is simply my opinion).  The market is in danger of eventually losing credibility, at least in the circles I travel and the complex problems I enjoy solving, because the capabilities of the (so-called) CEP technologies from software vendors in the (so-called) CEP space have been oversold; and, frankly speaking, I have yet to see tangible proof of “real CEP capabilities” in the road maps and plans of the current CEP software vendors.  This is disappointing.

This pill is bitter and difficult to swallow, but most of my life’s work has been advising, formulating and architecting real-time solutions for the end user (the C-level executives and the operational experts with the complex problems to solve).   CEP software must evolve and there needs to be more tangible results, not more marketing hype.


Cyberattack! Manipulation and Subversion of Financial Markets!

January 8, 2008

In The Top Ten Cybersecurity Threats for 2008 I mentioned that one of the most critical cybersecurity threats of 2008 would be the manipulation and subversion of financial markets.

Well, folks in New York have barely finished cleaning the colorful New Year’s confetti off Wall Street, and already a vivid example of market manipulation appears in the news.  We have yet to see the second week of January, and the cyberattacks are upon us.

Notice how Citigroup, a direct competitor to E*Trade Bank, used the power of cyberspace, rumors and (mis)information to manipulate the market price of E*Trade (ETFC).  This might not have been such an eyebrow-raising event if the rumor (cyberattack) had come from a disinterested third party.  Instead, the attack came from a direct competitor with its own subprime balance-sheet problems!

On or about November 12, 2007, Citigroup Investment Research analyst Prashant Bhatia, a longtime critic of E*Trade (ETFC), sent E*Trade stock into a free fall with his cyberattack, “[E*Trade] Bankruptcy risk cannot be ruled out.”  This rumor, spread by a competing financial institution, basically wiped out countless millions of dollars in investor equity in a few short minutes.  Amazing!

Of course, E*Trade is no more likely to go bankrupt than Citigroup is, but the information warfare by Citigroup’s Prashant Bhatia continues.  Here is the latest:

On January 7, 2008, Bhatia changed his cyberattack slightly, reiterating his “prediction” that E*Trade will not be profitable for at least three years.  This statement was quite different from his earlier attack, in which he “predicted” bankruptcy for E*Trade, a direct competitor.

Is Prashant Bhatia or Citigroup accountable for these “predictions”?   When investors are wiped out in seconds based on a cyberattack by a competing financial institution, who is responsible for these actions and investor losses?  Or, are these attacks simply “capitalism” and “the free market” where cyberattacks are commonplace and investors are no more than pawns in the market battlesphere?

Some on the net have said that Prashant Bhatia’s actions are not much different from those of the owner of a competing movie theater who stands up and yells, FIRE!! FIRE!! FIRE!!  Then, after he clears the theater, he yells, RATS!! COCKROACHES!! POISON POPCORN!! to ensure no one comes back to the movie house.

To me, this is an amazingly obvious cyberattack with the direct purpose of manipulating the market and damaging a competing financial services institution.

How can this attack be allowed to happen?

I think anyone can easily see that this is a very serious cybersecurity threat in 2008 and beyond.  Unfortunately, these types of attacks will certainly get worse before they get better.  Welcome to “the real world”….  In this vividly real example, an analyst from a competing bank yells FIRE! and basically subverts the market, causing countless investors to lose millions upon millions of dollars in the blink of an eye.  Wow.

RISK is the intersection of THREAT, VULNERABILITY and CRITICALITY. 

E*Trade was vulnerable (as were other banks) due to subprime issues.  E*Trade Financial is a critical competitor (and threat) to Citibank.    This is a textbook cyberattack in today’s world.   
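As a toy illustration only, here is one simple way to operationalize that intersection in code: multiply normalized scores, so that risk collapses to zero whenever any factor is absent.  The class name, the product rule, and every score below are my own hypothetical choices for this sketch, not a standard model:

    // Toy sketch: RISK as the product of normalized THREAT, VULNERABILITY
    // and CRITICALITY scores in [0, 1]. All numbers are hypothetical.
    public final class RiskSketch {
        static double risk(double threat, double vulnerability, double criticality) {
            // If any factor is zero, the intersection is empty and risk is zero.
            return threat * vulnerability * criticality;
        }

        public static void main(String[] args) {
            double threat = 0.9;        // a motivated attacker with market reach
            double vulnerability = 0.8; // subprime exposure leaves the target fragile
            double criticality = 0.9;   // a critical competitor in the same market
            System.out.printf("risk = %.2f%n", risk(threat, vulnerability, criticality));
        }
    }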

Dr. David Luckham, discussing financial news and complex event processing (CEP), recently asked, How well does Elementizing work? and reminded us that “News Moves Markets.”  He is certainly correct, obviously.

The cybersecurity issue that I am highlighting here is how easy it is for news to be manipulated and, in turn, for markets to be subverted, as is easily observable in Citigroup’s Prashant Bhatia and his commercial cyberattacks against E*Trade Financial and, most importantly, its investors.


Coral8: Event Stream Processing and Intrusion Detection

January 3, 2008

Though not quite ready for prime time, our home-grown UNIX domain socket adapter, built with the Coral8 Java APIs, is now under test.  We are using this adapter to evaluate and demonstrate event stream processing with intrusion detection systems (IDS): reducing false alarms, detecting derived situations from the raw intrusion event data, and feeding a security management visualization dashboard.
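For the curious, here is a minimal sketch of the shape of such an adapter: read newline-delimited IDS events from a UNIX domain socket and hand each one to the engine.  The socket path, the event format, and the publish() hook are placeholders of my own invention, the vendor-specific Coral8 publishing calls are deliberately omitted, and the sketch uses the modern JDK (16+) UNIX domain socket API purely for illustration:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.UnixDomainSocketAddress;
    import java.nio.channels.Channels;
    import java.nio.channels.SocketChannel;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Path;

    public final class IdsSocketAdapter {
        public static void main(String[] args) throws Exception {
            // Hypothetical socket where the IDS writes one event per line.
            var address = UnixDomainSocketAddress.of(Path.of("/var/run/ids/events.sock"));
            try (SocketChannel channel = SocketChannel.open(address);
                 var in = new BufferedReader(new InputStreamReader(
                         Channels.newInputStream(channel), StandardCharsets.UTF_8))) {
                String line;
                while ((line = in.readLine()) != null) {
                    publish(line); // forward each raw IDS event to the engine
                }
            }
        }

        // Placeholder: a real adapter would map the event's fields onto the
        // engine's input stream via the vendor's Java API.
        static void publish(String rawEvent) {
            System.out.println(rawEvent);
        }
    }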

You can click on the teaser image below to see more of our first IDS screenshots from Coral8’s Studio stream visualization tool.

Coral8 IDS Example

If you click on the image above, you will see four additional event stream properties.  For this part of the demo, there are 14 IDS properties in the event stream in total, but we show only 5 of them in this cropped screen capture.

I am quite sure that we could do similar integration with other event stream processing engines, but fortunately Coral8 makes it easy to download the software and start developing and testing.


OpenCourseWare: Get Smart for Complex Event Processing!

December 30, 2007

Ready to move beyond the basics of event processing?  Perhaps you would like to beef up your Java skills?  The Basics of Signal Processing?  Or maybe you are interested in Advanced Complexity Theory?  Artificial Intelligence?  Computer Language Engineering?  Queueing Theory?

Well then, put your feet up, relax and click on over to the Department of Electrical Engineering and Computer Science at MIT OpenCourseWare  (OCW) and enjoy their courses, freely available to anyone, anywhere. 

MIT’s OCW program freely shares lecture notes, exams, and other resources from more than 1800 courses spanning MIT’s entire curriculum, including many fields related to event processing.  There are even RSS feeds announcing new courses as they hit the wire, so you don’t have to miss a thing!  Also, check out the OCW Consortium.

Complex event processing is a multi-discipline approach for detecting both opportunities and threats in real-time cyberspace.  Make it your New Year’s resolution to review a few OCW lectures and help advance the state-of-the-art of CEP!


Complex Event Processing with Esphion Neural Agents

December 19, 2007

Detection-oriented technologies generally fall into two broad areas, signature-based detection and anomaly-based detection.    Complex event processing (CEP) is also a detection-oriented technology, so we can readily understand that CEP applications must also fall within the same two general areas.

Signature-based detection is sometimes referred to as static detection because the technology relies on pre-defined rules, filters, and signatures to match known patterns.  At the most fundamental level, a virus-checking program is an example of a signature-based system.
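As a minimal sketch of the idea (the two signatures below are invented for illustration, not taken from any real rule set), a signature-based detector simply flags any event that matches a pre-defined pattern:

    import java.util.List;
    import java.util.regex.Pattern;

    // Toy signature-based detector: only known-bad patterns trigger an alert.
    public final class SignatureSketch {
        static final List<Pattern> SIGNATURES = List.of(
                Pattern.compile("cmd\\.exe"),            // suspicious shell invocation
                Pattern.compile("(?i)union\\s+select")); // crude SQL-injection marker

        static boolean isSuspicious(String event) {
            return SIGNATURES.stream().anyMatch(sig -> sig.matcher(event).find());
        }

        public static void main(String[] args) {
            System.out.println(isSuspicious("GET /q?id=1 UNION SELECT passwd")); // true
            System.out.println(isSuspicious("GET /index.html"));                // false
        }
    }

Anything that matches no signature passes through silently, which is exactly why such systems miss novel attacks.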

Anomaly-based detection systems, on the other hand, strive to maintain a baseline of what is considered normal and then match patterns that fall outside normal operating parameters, often using adaptive or artificial intelligence techniques.

Experts know that both anomaly- and signature-based detection methods are important, and each has its own challenges and engineering tradeoffs.  For example, signature-based systems tend to generate false negatives because it is not possible to write rules and filters to match every pattern, especially in dynamic real-time environments.  Anomaly-based detection, on the other hand, tends to generate false positives because it is quite difficult to create a perfect profile of normal behavior.

The challenge in most, if not all, detection-oriented systems is finding the right balance between false positives and false negatives.  In some situations, a system should err toward false positives.  In other applications, the system should err toward false negatives.
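To make that tradeoff concrete, here is a toy sketch of my own (not any vendor’s method) of an anomaly detector that learns a running baseline and flags values more than k standard deviations away from it.  Lowering k errs toward false positives; raising k errs toward false negatives:

    // Toy anomaly detector using Welford's online mean/variance as the
    // learned "normal" baseline. The threshold k tunes the balance between
    // false positives and false negatives discussed above.
    public final class BaselineSketch {
        private long n;
        private double mean, m2; // running mean and sum of squared deviations
        private final double k;  // alert threshold, in standard deviations

        BaselineSketch(double k) { this.k = k; }

        boolean observe(double x) {
            // Test against history first, then fold x into the baseline.
            boolean anomalous =
                    n > 30 && Math.abs(x - mean) > k * Math.sqrt(m2 / (n - 1));
            n++;
            double delta = x - mean;
            mean += delta / n;
            m2 += delta * (x - mean);
            return anomalous;
        }

        public static void main(String[] args) {
            BaselineSketch detector = new BaselineSketch(3.0);
            for (int i = 0; i < 100; i++) detector.observe(10 + (i % 5)); // normal load
            System.out.println(detector.observe(500)); // spike: prints true
        }
    }

Note that nothing here is a pre-specified rule: the baseline is learned from the stream itself, which is the essence of the anomaly-based approach.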

CEP is, by definition, a technology for detecting both opportunities and threats in distributed networks, in real time, so it goes without saying that CEP is subject to the same engineering tradeoffs that affect other detection-oriented systems.

A few weeks ago, I was discussing CEP with the CTO of one of Thailand’s largest telecommunications companies, and he was very bullish on neural-based anomaly detection from Esphion.

First generation detection systems rely on determinism, which is generally rule-based and known to be insufficient for more complex real-time problems.  Esphion uses neural agents to gather information on network activity and then creates a unifying situational infrastructure to protect against previously unknown threats.  For example, a fast-spreading threat, such as the SQL/Slammer worm, will have reached all possible targets faster than any signature can be published or rule can be written, as mentioned in Worm detection – You need to do it yourself.

Since CEP is designed and marketed as a technology that brings real-time advantages to the detection of both opportunities and threats, we must ask ourselves: why do all the current CEP software vendors fail to provide non-deterministic methods that are proven to adapt to a rapidly changing world?

In Anomaly Detection 101, Esphion does a great job of describing how they do not rely on any pre-specified rules, baselines, models, signatures, or any other a priori knowledge.  They claim, and my highly respected telecommunications CTO colleague confirms, that no prior knowledge is required and that their customers are no longer adversely affected by zero-day anomalies or changing network conditions.

What the technology behind Esphion does is what I would call complex event processing.


End Users Should Define the CEP Market

December 17, 2007

My friend Opher mistakenly thought I was thinking of him when I related the story of the fish, as he replied, CEP and the Story of the Captured Traveller.

I must not have related the fish story very well, because to understand the story of the fish is to know that we are all like the fish in certain aspects of life, and there is nothing negative to be gleaned from the story.

However, on Opher’s point about CEP, I disagree.  Just because the marketing people (not the market) have misdefined CEP, and the vendors are drifting from the technology described in Dr. Luckham’s original CEP work, including his CEP book, we should not change the meaning of CEP.  Therefore, I do not agree that we should redefine the CEP that David envisioned as Intelligent Event Processing (IEP) simply because CEP, as today’s software vendors sell it, is really SEP (or whatever!).  Please recall that David’s background at Stanford was AI, and he did not define CEP as the software vendors have defined it either!

The fact of the matter is that the software marketing folks have decided they are going to use Dr. Luckham’s book to sell software that does not perform as Dr. Luckham described or envisioned!   I make no apologies for being on the side of end users who actually need to solve complex problems, not sell software that underperforms.

As I mentioned, this positioning and repositioning does not help solve complex problems.   At the end of the day, we have problems to solve and the software community is not very helpful when they place form over substance, consistently. 

Furthermore, most customers are saying, time and time again, “So what?” … “These COTS event processing platforms, with their simple joins, selects and rules, do not solve my complex event processing problems.”  “We already have similar approaches, on which we have spent millions of dollars, and they do not work well.”

In other words, the market is crying out for true COTS CEP solutions, but the software community is not yet delivering.  OBTW, this is nothing new.  In my first briefing to the EP community in January 2006, I mentioned that CEP requires stating the business problem, or domain problem, and then selecting the method or methods that best solve it.

To date, the CEP community has not done this because it has no COTS tool set other than SEP engines (marketed as either ESP engines or CEP engines – and at least ESP was closer to being technically accurate).

Experienced end users are very intelligent. 

These end users know the complex event processing problems they need to solve, and they know the limitations of the current COTS approaches marketed by the CEP community.  Even in Thailand, a country many of you might mistakenly think is not very advanced technologically, there are telecommunications experts (who run large networks) working on very difficult fraud detection applications; they use neural networks and say the results are very good.  However, there is not one CEP vendor, that I know of, who offers true CEP capability in the form of neural nets.

Almost every major bank, telco, etc. has the same opinion, and the same problem: they need much more capability than streaming joins, selects and rules to solve the complex event processing problems that Dr. Luckham outlined in his book.  The software vendors are attempting to define the CEP market to match their capabilities; unfortunately, those capabilities do not meet the requirements of the vast majority of end users who have CEP problems to solve.

If the current CEP platforms were truly solving complex event processing problems, annual sales would be orders of magnitude higher.  Hence, the users have already voted.  The problem is that the CEP community is not listening.


CEP and the Story of the Fish

December 17, 2007

Every month or two someone in the CEP community makes a statement like “Hey, there is more to complex event processing than processing simple streams!” or “SQL and rules are not the final chapter in the saga of event processing!”

Each time the issue surfaces, a few voices in the CEP community argue that there is really nothing more than stream processing, and they give the same simple examples to prove their case.  It is obvious, at least to me, that they have never worked in large-scale network management or cybersecurity; yet they claim to have most, if not all, of the answers: a very simple construct that applies to all complex problems.  They seemingly argue and debate every point or detail that they do not understand.

This reminds me of the age old story of the fish, famous in Asian studies.

This fish lives in a vast ocean, wide and deep, full of life.   In fact, the entire world of the fish is the deep blue sea.   The fish does not know of the land and understands very little, if anything, about the sky.    The fish is quite content in its beautiful world of the sea.  Why would it know of the land?

One day a turtle tells this very happy fish about incredible wonders beyond the sea and air-breathing creatures that live on the land.

The fish strongly denies any such possibility and discounts the entire story told by his friend the turtle.  In fact, the fish says there is only water, only sea creatures, only vast oceans and deep blue seas.  The fish goes on to say that it has never seen this dry place called land, nor met any of these funny creatures that move freely on the land with legs and without gills.  “What a ridiculous story,” says the fish!

Regardless of the turtle’s good intentions to inform our good friend the fish about the wonders of the land, the fish simply cannot accept the turtle’s story, because the entire world of the fish is this beautiful, deep and vast ocean.  There is no dry land!

The same is true in the CEP community.  The world is full of complex event processing applications that cannot be solved by rules, time-ordered stream processing and strict determinism.  In fact, most of the more interesting problems are simply too complex to fit neatly into some rigid set of deterministic rules.

There is complexity and complexity theory, probability and statistics.   There are tradeoffs between detection sensitivity and false alarms, very difficult problems without “nice and neat” solutions.

The fish cries out, “Don’t waste my time with folly about this nonexistent place called land!”

Likewise, there is a small group of folks in the CEP community who steadfastly deny the idea that there is anything more than simple event processing; and we call that something more,

Complex Event Processing (CEP)