OpenCourseWare: Get Smart for Complex Event Processing!

December 30, 2007

Ready to move beyond the basics of event processing? Perhaps you would like to beef up your Java skills? The Basics of Signal Processing? Or maybe you are interested in Advanced Complexity Theory? Artificial Intelligence? Computer Language Engineering? Queueing Theory?

Well then, put your feet up, relax, and click on over to the Department of Electrical Engineering and Computer Science at MIT OpenCourseWare (OCW) and enjoy their courses, freely available to anyone, anywhere.

MIT’s OCW program freely shares its lecture notes, exams, and other resources from more than 1800 courses spanning MIT’s entire curriculum, including many fields related to event processing. There are even RSS feeds announcing new courses as they hit the wire, so you don’t have to miss a thing! Also, check out the OCW Consortium.

Complex event processing is a multidisciplinary approach for detecting both opportunities and threats in cyberspace, in real time. Make it your New Year’s resolution to review a few OCW lectures and help advance the state of the art of CEP!


Complex Event Processing with Esphion Neural Agents

December 19, 2007

Detection-oriented technologies generally fall into two broad areas: signature-based detection and anomaly-based detection. Complex event processing (CEP) is also a detection-oriented technology, so we can readily understand that CEP applications must also fall within the same two general areas.

Signature-based detection is sometimes referred to as static detection because the technology relies on pre-defined rules, filters, and signatures to match known patterns. At the most fundamental level, a virus-checking program is an example of a signature-based system.

Anomaly-based detection systems, on the other hand, strive to maintain a baseline of what is considered normal and then flag patterns that fall outside normal operating parameters, often using adaptive or artificial intelligence techniques.
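
To make the two styles concrete, here is a minimal Python sketch of each, purely for illustration; the patterns, baseline figures, and threshold below are invented and are not taken from any particular product.

    # Illustrative only: a tiny signature-based matcher and a tiny anomaly-based check.
    # All patterns and numbers are hypothetical.

    KNOWN_BAD_PATTERNS = [b"\x90\x90\x90\x90", b"' OR 1=1 --"]   # pretend signatures

    def signature_detect(payload: bytes) -> bool:
        """Signature-based (static) detection: match pre-defined patterns."""
        return any(pattern in payload for pattern in KNOWN_BAD_PATTERNS)

    # Anomaly-based detection: compare an observation to a baseline of "normal."
    BASELINE_MEAN = 1200.0   # e.g., average packets per second during normal operation
    BASELINE_STD = 150.0

    def anomaly_detect(packets_per_sec: float, threshold: float = 3.0) -> bool:
        """Flag anything more than `threshold` standard deviations from the baseline."""
        return abs(packets_per_sec - BASELINE_MEAN) / BASELINE_STD > threshold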

Experts know that both anomaly-based and signature-based detection methods are important, and each has its own challenges and engineering tradeoffs. For example, signature-based systems tend to generate false negatives because it is not possible to write rules and filters to match every pattern, especially in dynamic real-time environments. Anomaly-based detection, on the other hand, tends to generate false positives because it is quite difficult to create a perfect profile of normal behavior.

The challenge in most, if not all, detection-oriented systems is finding the right balance between false positives and false negatives. In some situations, a system should err toward false positives. In other applications, the system should err toward false negatives.
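
To see where that balance lives in practice, consider a toy example: the detection threshold itself is the knob. The numbers below are invented, but the effect is general.

    # Illustrative only: moving a detection threshold trades false positives for false negatives.
    BASELINE_MEAN, BASELINE_STD = 1200.0, 150.0        # made-up "normal" packets per second

    def anomaly_score(packets_per_sec: float) -> float:
        return abs(packets_per_sec - BASELINE_MEAN) / BASELINE_STD

    observations = [1250, 1400, 1900, 3200, 1100]      # assume only 3200 is a real attack

    for threshold in (1.0, 3.0, 6.0):
        alerts = [x for x in observations if anomaly_score(x) > threshold]
        print(f"threshold={threshold}: alerts={alerts}")

    # A low threshold errs toward false positives (normal spikes such as 1400 get flagged);
    # a high threshold errs toward false negatives (a real attack risks going unflagged).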

CEP is, by definition, a technology for detecting both opportunities and threats in distributed networks, in real time, so it goes without saying that CEP is challenged by the same engineering tradeoffs that affect other detection-oriented systems.

A few weeks ago, I was discussing CEP with the CTO of one of Thailand’s largest telecommunications companies, and he was very bullish on neural-based anomaly detection from Esphion.

First-generation detection systems rely on determinism, which is generally rule-based and known to be insufficient for more complex real-time problems. Esphion uses neural agents to gather information on network activity and then creates a unifying situational infrastructure to protect against previously unknown threats. For example, a fast-spreading threat such as the SQL/Slammer worm will have reached all possible targets faster than any signature can be published or rule can be written, as mentioned in Worm detection – You need to do it yourself.

Since CEP is designed and marketed as a technology that brings real-time advantages to the detection of both opportunities and threats, we must ask ourselves why all of the current CEP software vendors fail to provide non-deterministic methods that are proven to adapt to a rapidly changing world.

In Anomaly Detection 101, Esphion does a great job of describing how they do not rely on any pre-specified rules, baselines, models, signatures, or any other a priori knowledge. They claim, and my highly respected telecommunications CTO colleague confirms, that no prior knowledge is required and that their customers are no longer adversely affected by zero-day anomalies or changing network conditions.
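
Esphion does not publish the internals of its neural agents, so the sketch below is only a toy illustration of the adaptive idea: a detector that learns its own sense of “normal” from the stream itself, with no rules or signatures written in advance. All parameters and traffic figures are invented.

    # Toy adaptive detector: learns a baseline online, so nothing is pre-specified.
    # This illustrates the adaptive idea only; it is not Esphion's technology.
    class AdaptiveDetector:
        def __init__(self, alpha: float = 0.05, sensitivity: float = 5.0, warmup: int = 3):
            self.alpha = alpha                # how quickly the baseline adapts to change
            self.sensitivity = sensitivity    # how far from baseline counts as anomalous
            self.warmup = warmup              # observations to see before raising alerts
            self.seen = 0
            self.mean = 0.0
            self.mad = 0.0                    # mean absolute deviation, also learned online

        def observe(self, value: float) -> bool:
            """Return True if `value` looks anomalous relative to the learned baseline."""
            self.seen += 1
            if self.seen == 1:
                self.mean = value
                return False
            deviation = value - self.mean
            anomalous = (self.seen > self.warmup
                         and abs(deviation) > self.sensitivity * max(self.mad, 1e-9))
            # Update the baseline so the detector tracks changing network conditions.
            self.mean += self.alpha * deviation
            self.mad = (1 - self.alpha) * self.mad + self.alpha * abs(deviation)
            return anomalous

    detector = AdaptiveDetector()
    for rate in [1000, 1030, 990, 1010, 9500, 1020]:   # the sudden 9500 resembles a worm outbreak
        if detector.observe(rate):
            print(f"anomaly: {rate} packets per second")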

The technology behind what Esphion does is what I would call complex event processing.


End Users Should Define the CEP Market

December 17, 2007

My friend Opher mistakenly thought I was thinking of him when I related the story of the fish, as he replied in CEP and the Story of the Captured Traveller.

I must not have related the fish story very well, because to understand the story of the fish is to know that we are all like the fish in certain aspects of life, and there is nothing negative to be gleaned from the story.

However, on Opher’s point about CEP, I disagree. Just because the marketing people (not the market) have misdefined CEP, and the vendors are therefore drifting from the technology described in Dr. Luckham’s original CEP work, including his CEP book, does not mean we should change the meaning of CEP. Therefore, I don’t agree that we should redefine CEP, as David envisioned it, to be Intelligent Event Processing (IEP) just because CEP, as today’s software vendors sell it, is really SEP (or whatever!). Please recall that David’s background at Stanford was AI, and he did not define CEP the way the software vendors have defined it either!

The fact of the matter is that the software marketing folks have decided they are going to use Dr. Luckham’s book to sell software that does not perform as Dr. Luckham described or envisioned!   I make no apologies for being on the side of end users who actually need to solve complex problems, not sell software that underperforms.

As I mentioned, this positioning and repositioning does not help solve complex problems. At the end of the day, we have problems to solve, and the software community is not very helpful when it consistently places form over substance.

Furthermore, as most customers are saying, time and time again, “so what?” … “these COTS event processing platforms with simple joins, selects and rules do not solve my complex event processing problems.”  “We already have similar approaches, where we have spent millions of dollars, and they do not work well.”

In other words, the market is crying out for true COTS CEP solutions, but the software community is not yet delivering.  OBTW, this is nothing new.  In my first briefing to the EP community in January of 2006, I mentioned that CEP required stating the business problem, or domain problem, and then selecting the method or methods that best solve the problem or problems.

To date, the CEP community has not done this because it has no COTS tool set other than SEP engines (marketed as either ESP engines or CEP engines – and at least ESP was closer to being technically accurate).

Experienced end users are very intelligent. 

These end users know the complex event processing problems they need to solve, and they know the limitations of the current COTS approaches marketed by the CEP community. Even in Thailand, a country many of you might mistakenly think is not very advanced technologically, there are experts in telecommunications (who run large networks) working on very difficult fraud detection applications; they use neural networks and say the results are very good. However, there is not one CEP vendor, that I know of, that offers true CEP capability in the form of neural nets.

Almost every major bank, telco, etc. has the same opinion, and the same problem: they need much more capability than streaming joins, selects, and rules to solve the complex event processing problems that Dr. Luckham outlined in his book. The software vendors are attempting to define the CEP market to match their capabilities; unfortunately, those capabilities do not meet the requirements of the vast majority of end users who have CEP problems to solve.
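
For readers who have not worked with these engines, “streaming joins, selects and rules” are roughly the kind of operations sketched below. This is a hypothetical Python illustration with made-up event fields; real engines express the same thing in SQL-like stream dialects.

    # Hypothetical sketch of a streaming "select" and a windowed "join"; event fields,
    # amounts, and the window are invented for illustration.

    def select_large_trades(trades, min_amount=1000000):
        """A streaming 'select': keep only events matching a simple predicate."""
        for trade in trades:
            if trade["amount"] >= min_amount:
                yield trade

    def join_trades_with_logins(trades, logins, window_secs=300):
        """A windowed 'join': pair each trade with logins on the same account
        that occurred within the preceding five minutes."""
        for trade in trades:
            for login in logins:
                if (login["account"] == trade["account"]
                        and 0 <= trade["ts"] - login["ts"] <= window_secs):
                    yield trade, login

Useful building blocks, to be sure, but a long way from the adaptive, probabilistic detection that the harder CEP problems described above require.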

If the current CEP platforms were truly solving complex event processing problems, annual sales would be orders of magnitude higher. Hence, the users have already voted. The problem is that the CEP community is not listening.


CEP and the Story of the Fish

December 17, 2007

Every month or two someone in the CEP community makes a statement like “Hey, there is more to complex event processing than processing simple streams!” or “SQL and rules are not the final chapter in the saga of event processing!”

Each time the issue surfaces, there are a few voices in the CEP community who argue that there is really nothing more than stream processing, and they give the same simple examples to prove their case. It is obvious, at least to me, that they have never worked in large-scale network management or cybersecurity; yet they claim to have most, if not all, of the answers: a very simple construct that applies to all complex problems. They seemingly argue and debate every point or detail that they do not understand.

This reminds me of the age-old story of the fish, famous in Asian studies.

This fish lives in a vast ocean, wide and deep, full of life.   In fact, the entire world of the fish is the deep blue sea.   The fish does not know of the land and understands very little, if anything, about the sky.    The fish is quite content in its beautiful world of the sea.  Why would it know of the land?

One day a turtle tells this very happy fish about incredible wonders beyond the sea and air breathing creatures that live on the land.    

The fish strongly denies any possibility and discounts the entire story told by his friend the turtle. In fact, the fish says there is only water, and only sea creatures, and only vast oceans and deep blue seas. The fish goes on to say it has never seen this dry place called land, or met any of these funny creatures that move freely on the land with legs and without gills. “What a ridiculous story,” says the fish!

Regardless of the turtle’s good intentions to inform our good friend the fish about the wonders of the land, the fish simply cannot accept the turtle’s story about the land, because the entire world of the fish is this beautiful deep and vast ocean. There is no dry land!

The same is true in the CEP community. The world is full of complex event processing applications that cannot be solved by rules, time-ordered stream processing, and strict determinism. In fact, most of the more interesting problems are simply too complex to fit neatly into some rigid set of deterministic rules.

There is complexity and complexity theory, probability and statistics.   There are tradeoffs between detection sensitivity and false alarms, very difficult problems without “nice and neat” solutions.

The fish cries out, “Don’t waste my time with folly about this nonexistent place called land!”

Likewise, there is a small group of folks in the CEP community who are steadfast in their debate against the idea that there is more than simple event processing, and we call this “more,”

Complex Event Processing (CEP)


Simple Event Processing != Complex Event Processing

December 16, 2007

One of the brilliant minds in the CEP community, Claudio Paniagua Macia, recently posted Event Stream Processing != Complex Event Processing. In his post, Claudio draws two bold conclusions:

(1) SQL-based approaches to ESP might have a hard time doing CEP.

(2) No real CEP engine exists today in the marketplace, perhaps not even “off” the marketplace.

Friend, colleague, and co-chair Opher Etzion replied in On Event Stream Processing:

 “CEP engines do exist today, none is perfect, but probably sufficient for big majority of the existing applications today.”

Respectfully, I find it necessary to agree with Claudio and disagree with Opher. Most of the so-called CEP engines today are solving quite simple event processing problems. If the CEP engines on the market were truly solving a “majority of the existing applications today,” then sales would be orders of magnitude larger.

The fact of the matter is that the “simple rules-based approaches” that dominate today’s marketplace are used to solve problems where rules-based approaches are useful. Unfortunately, this is just a small fraction of the true potential of the CEP market.

For example (just one example of many), the vast majority of intrusion or fraud detection systems available today use rule-based approaches, and their detection capability, and the confidence in the detection, is quite elementary (poor quality).   If these systems worked well, cyberspace would be a very different and much safer place.  

Yes, it is useful to add another layer of rules, but rules alone will not solve the vast majority of CEP-domain classes of problems. In addition, the CEP applications that have made the press recently are quite simple, certainly nothing scientifically earth-shattering.

So the sad truth of the matter, from an architectural, scientific, and solutions perspective, is exactly as Claudio boldly offered: no real CEP engine exists today. Furthermore, the vast majority, if not all, of the CEP applications sold today are used in very simple event processing (SEP) applications. This is not very “advanced,” but it is a good start.

What is holding the CEP market back is quite straightforward: the current “engines” are quite elementary, relatively speaking (we should call them SEP engines), and SEP engines do not have the capability to solve difficult detection-oriented CEP problems in cyberspace. These difficult problems make up the vast majority of the applications where “true complex event processing” is required.


CEP Center of Excellence for Cybersecurity at Software Park Thailand

December 16, 2007

In July 2007, at InformationSecurityAsia2007,  I unveiled an idea to create a cybersecurity CEP Center of Excellence (COE) in Thailand.  Under the collaborative guidance of Dr. Rom Hiranpruk, Deputy Director, Technology Management Center, National Science and Technology Development Agency (NSTDA), Dr. Prinya Hom-anek, President and Founder, ACIS Professional Center, and Dr. Komain Pipulyarojana, Chief National Security Section, National Electronics and Computer Technology Center (NECTEC), this idea continues to move forward.

Today, in a meeting with Mrs. Suwipa Wanasathop, Director, Software Park Thailand, and her executive team, we reached a tentative agreement to host the CEP COE at Software Park.   

The mission of Software Park Thailand is to be the region’s premier agency supporting entrepreneurs to help create a strong world-class software industry that will enhance the strength and competitiveness of the Thai economy.

Since 2001, Thailand’s software industry has experienced approximately 20% year-over-year (YOY) growth. Presently, Software Park Thailand supports a business-technology ecosystem with over 300 active participants employing over 40,000 qualified software engineers across a wide range of technology domains.

I am very pleased that Software Park Thailand is excited about the potential benefits of CEP in the area of cybersecurity and detection-oriented approaches to cyberdefense. The COE will be working with best-of-breed CEP vendors to build, test, and refine rule-based (RBS), neural network (NN)-based, and Bayesian network (BN)-based approaches (as well as other detection methods) for cybersecurity.
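
As a purely hypothetical illustration of what testing these families side by side might look like (this is not the COE’s code, and the features, weights, and probabilities are invented), each approach can be wrapped behind a common scoring interface and run against the same labeled event stream:

    # Hypothetical sketch: three detector families behind one scoring interface.
    import math

    def rule_based_score(event) -> float:
        """RBS: hand-written rules produce a hard 0/1 score."""
        return 1.0 if event["failed_logins"] > 5 and event["new_geo"] else 0.0

    def neural_net_score(event) -> float:
        """NN stand-in: a single sigmoid unit; a real network would be trained on data."""
        w_failed, w_geo, bias = 0.9, 1.4, -4.0                  # pretend these were learned
        z = w_failed * event["failed_logins"] + w_geo * event["new_geo"] + bias
        return 1.0 / (1.0 + math.exp(-z))

    def bayesian_score(event) -> float:
        """BN stand-in: a one-feature naive Bayes posterior of 'attack'."""
        p_attack = 0.01
        p_geo_given_attack = 0.8 if event["new_geo"] else 0.2
        p_geo_given_normal = 0.05 if event["new_geo"] else 0.95
        evidence_for_attack = p_geo_given_attack * p_attack
        return evidence_for_attack / (evidence_for_attack + p_geo_given_normal * (1 - p_attack))

    event = {"failed_logins": 7, "new_geo": True}
    for name, scorer in [("RBS", rule_based_score), ("NN", neural_net_score), ("BN", bayesian_score)]:
        print(name, round(scorer(event), 3))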

I will be announcing more details in the future, so stay tuned.  Please feel free to contact me if you have any questions.


Bankers Voice Scepticism Over New Event Processing Technologies

November 28, 2007

This week I completed a presentation on complex event processing at Wealth Management Asia 2007 where I had a chance to field some tough questions from risk management experts working for some of the top banks in the region.

In particular, one of the meeting attendees voiced strong scepticism over emerging event processing technologies. The basis for his scepticism was, in his words, that the other “65 systems” the bank had deployed to detect fraud and money laundering (AML) simply did not work. He specifically referenced Mantas as one of the expensive systems that did not meet the bank’s requirements.

My reply was that one of the advantages of emerging event processing platforms is the “white box” ability to add new rules, or other analytics, “on the fly” without the need to go back to the vendor for another expensive upgrade. 

Our friend the banker also mentioned the huge problem of “garbage-in, garbage-out” where the data for real-time analytics is not “clean enough” to provide confidence in the processing results. 

I replied that this is always the problem with stand-alone detection-oriented systems that do not integrate with each other, for example his “65 systems problem.”    Event processing solutions must be based on standards-based distributed communications, for example a high speed messaging backbone or distributed object caching architecture, so enterprises may correlate the output of different detection platforms to increase confidence.   Increasing confidence, in this case, means lowering false alarms while, at the same time, increasing detection sensitivity.
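
A minimal sketch of that correlation idea follows; the message format, window, and threshold are hypothetical, and in practice the alerts would arrive over the messaging backbone rather than from a Python list. The point is simply that agreement among independent detectors becomes the confidence signal.

    # Hypothetical sketch: escalate only when independent detection systems flag the
    # same account within a short window, so single-system false alarms are suppressed.
    from collections import defaultdict

    WINDOW_SECS = 600     # correlation window (illustrative)
    MIN_SOURCES = 2       # escalate only when at least two independent systems agree

    def correlate(alerts):
        """alerts: dicts like {'source': 'aml-engine', 'account': 'A-17', 'ts': 1197892800}."""
        recent_by_account = defaultdict(list)
        for alert in sorted(alerts, key=lambda a: a["ts"]):
            recent = [a for a in recent_by_account[alert["account"]]
                      if alert["ts"] - a["ts"] <= WINDOW_SECS]
            recent.append(alert)
            recent_by_account[alert["account"]] = recent
            sources = {a["source"] for a in recent}
            if len(sources) >= MIN_SOURCES:
                yield {"account": alert["account"], "sources": sorted(sources), "ts": alert["ts"]}

Because any single detector’s alerts no longer go straight to an analyst, each detector can be tuned more sensitively, which is how this kind of architecture lowers false alarms and raises detection sensitivity at the same time.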

As I have learned over a long 20-year career in IT consulting, the enemy of the right approach to solving a critical IT problem is the trail of previous failed solutions. In this case, a long history of expensive systems that do not work as promised is creating scepticism over the benefits of CEP.