Analytical Patterns for Complex Event Processing

October 31, 2007

Back in March 2006, during my enjoyable time at TIBCO Software, I presented a keynote at the first event processing symposium, Processing Patterns for Predictive Business. In that presentation, I introduced a functional event processing reference architecture and highlighted the importance of mapping business requirements for event processing to appropriate processing analytics and patterns. The figure below is a screenshot of slide 26 of that presentation:

Slide 26

The idea behind the illustration above was that it is essential for organizations to look at their business problems and determine the best processing pattern, or processing analytics, in the context of the problem they are trying to solve. I also graphically illustrated a few examples of event processing analytics relevant to CEP, including:

  • Rule-Based Inference;
  • Bayesian Belief Networks (Bayes Nets);
  • Dempster-Shafer’s Method;
  • Adaptive Neural Networks;
  • Cluster Analysis; and
  • State-Vector Estimation.
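To make just one of these concrete, here is a minimal sketch of a single Bayesian update for scoring an event, the building block underneath a Bayes net. All of the probabilities below are hypothetical, chosen purely for illustration; a real Bayesian belief network models many interdependent variables, not one.

```python
# Illustrative sketch: a single-node Bayesian update for classifying an
# observed event as "threat" vs. "normal". Numbers are hypothetical.

def bayes_update(prior: float, likelihood: float, evidence_rate: float) -> float:
    """Posterior P(H|E) = P(E|H) * P(H) / P(E)."""
    return likelihood * prior / evidence_rate

# P(threat) before the event, P(alert|threat), and P(alert) overall:
posterior = bayes_update(prior=0.01, likelihood=0.9, evidence_rate=0.05)
print(round(posterior, 3))  # 0.18
```

Even this toy example shows why such analytics complement rules: the output is a graded degree of belief, not a binary rule firing.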

The key takeaway for that part of my presentation was that many analytics for CEP exist in the art & science of mature multi-sensor data fusion processing and these analytics can be mapped to recurring business patterns in event processing. I illustrated this point in slide 28 with the figure below (for illustrative purposes only):

Slide 28

In future posts on this topic I will elaborate by discussing analytics at each level of the functional CEP reference architecture, highlighting where different analytical methods and patterns can be efficiently applied to solve real-world event processing business problems.


SL’s Architecture for CEP Visualization

October 26, 2007

In an earlier post, BAM: The Cherry on Top of the CEP Pie?, we touched on the importance of visualization at every level of the functional reference architecture for event processing.

I thought this might be a good time to take a closer look at the SL RTView visualization platform used by many CEP software vendors and their partners, including Apama, BEA, TIBCO, Coral8 and StreamBase.

SL's RTV Architecture

SL’s Enterprise RTView is used by many end users to implement and visualize CEP and business optimization KPIs, BAM dashboards, JMX-enabled infrastructure and application monitoring, eCommerce application monitoring, SLAs, capacity planning applications and cyber trading applications.

For example, in an earlier post, CEP Use Case: Stream Processing in Multiplayer Online Gaming, we included a few SL screenshots from a Simutronics-StreamBase on-line gaming monitoring application.  

Stay tuned for other use cases and screenshots that will help you better understand the application and value proposition of CEP to a wide range of business problems.  We will also dive into SL RTView, and other CEP visualization platform architectures, as time permits.


XASAX Launches Into CEP with Virtual Cyber Trading Hosting

October 25, 2007

XASAX has put together a virtual machine CEP hosting solution, located at the source of the majority of US stock exchanges, using VMware and clustered hardware. They have implemented a five-POP financial backbone of market data with proximity hosting / co-location at each exchange throughout the country which, they claim, encompasses 95% of all publicly available market liquidity.

XASAX is currently using a StreamBase CEP engine with Studio seats. However, the XASAX network is touted as agnostic, so users can use any CEP or cyber trading software platform.

Each ticker plant on the XASAX network uses a proprietary message bus for machine to machine connectivity. The CEP engines are connected via Infiniband to a ticker plant. The ticker plants receive raw multicast market data direct from the exchanges.
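As a rough illustration of the ticker plant's role in that pipeline, the sketch below normalizes a raw exchange message into an event a CEP engine can consume. The pipe-delimited wire format and field names here are hypothetical, purely for illustration; real exchange feeds are binary multicast protocols handled by the feed handlers described below.

```python
# Sketch of the ticker-plant job: raw exchange message -> normalized quote
# event. The wire format "SYMBOL|BID|ASK|TIMESTAMP" is hypothetical.

def parse_quote(raw: str) -> dict:
    symbol, bid, ask, ts = raw.split("|")
    return {
        "symbol": symbol,
        "bid": float(bid),
        "ask": float(ask),
        "timestamp": int(ts),
    }

event = parse_quote("IBM|114.25|114.27|1193340000")
print(event["symbol"], event["bid"], event["ask"])
```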

There are 3 hosted ticker plants on the XASAX network.

1. Exegy (http://www.exegy.com). Hardware-accelerated feed handlers providing consolidated market data feeds. They parse OPRA (~200 Mbps of streaming market data) in ~80 microseconds on one machine using a combination of FPGAs and CPUs.

2. InfoDyne (http://www.infodyne.com). Software-based parsing. This solution parses OPRA in ~500 microseconds across 4 machines. It is very similar to Wombat.

3. XASAX custom parsers (http://www.opentick.com). XASAX told us that they have spent the past four years developing ticker plant technology, entitlement systems, distribution methods, and feed handlers for exchanges. They are using lightweight versions of this software to drive CEP engines.

At the same time, StreamBase independently incorporated opentick APIs this month, so StreamBase works with the XASAX software in the initial launch period.

Under its teaming strategy with CEP vendors, XASAX hopes to dramatically bring down the overall cost of deploying a CEP algo trading engine.

XASAX plans to have ongoing beta tests for new feeds, engines & products. In addition, they plan to maintain beta test trial periods for all hosted applications and virtual servers. In the latter part of November, XASAX plans to go live with Nasdaq feeds & StreamBase in a production cyber trading environment.


Muttering About Rules and CEP

October 23, 2007

I read an excellent post by Paul Vincent, CEP vs. “Business Rules” where Paul does a super job summarizing the role of rules and rule-based systems for CEP. Paul mentioned his posts were motivated, in part, by Opher Etzion’s musings on rules.

In Bending CEP for Rules we discussed the same topic and there were some good comments and interaction, similar to the current discussions at Opher’s and Paul’s blogs.

The heart of the matter is that the role of rules has been overstated in the CEP market; for CEP to be successful over the long haul, software vendors must offer other analytics to end users, such as Bayesian and neural networks.

Rule-based analytics are one important class of algorithms for event processing. However, for CEP to survive the test of time and become a viable stand-alone technology class, software vendors must enhance their offerings so users can easily plug in the event processing analytics they need.

In other words, software vendors should not unintentionally dictate the analytics and methods that users require for their event processing applications. Vendors should provide plug-and-play analytical capabilities that permit users to solve a wide range of sophisticated event processing problems.
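To sketch what "plug-and-play" could mean in practice, here is a minimal registry that dispatches events to whatever analytics the user registers, rather than hard-wiring a single rule engine. The class, method names, and the two toy analytics are hypothetical, for illustration only, not any vendor's actual API.

```python
# A minimal plug-and-play analytics sketch: the engine evaluates events
# against user-registered analytics instead of one built-in rule engine.

from typing import Callable, Dict, List

Analytic = Callable[[dict], bool]  # returns True if the event is "interesting"

class AnalyticsRegistry:
    def __init__(self) -> None:
        self._analytics: Dict[str, Analytic] = {}

    def register(self, name: str, analytic: Analytic) -> None:
        self._analytics[name] = analytic

    def evaluate(self, event: dict) -> List[str]:
        """Return the names of every registered analytic that fires."""
        return [name for name, fn in self._analytics.items() if fn(event)]

registry = AnalyticsRegistry()
registry.register("rule", lambda e: e.get("price", 0) > 100)       # rule-based
registry.register("stat", lambda e: abs(e.get("zscore", 0)) > 3)   # statistical

print(registry.evaluate({"price": 150, "zscore": 1.2}))  # ['rule']
```

The point of the sketch is the shape, not the code: a Bayes net, a neural net, or a cluster analyzer could each sit behind the same interface.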

Today, unfortunately, almost all of the CEP vendors are promoting rule-based engines as their core analytical competency. Because of this situation, isn’t it only natural that there are mutterings about rules and CEP?

Opher replied to Paul’s note that, “CEP [and] Business rules […] are conceptually orthogonal”.

When “orthogonal” is used as an adjective, it means “not pertinent to the matter under consideration.” I think that rules are very pertinent to event processing, so I’m not quite sure I would agree with Opher’s statement about rules and CEP being orthogonal.


SOA Security and SAML – Maturity Defined by Usage Not Time

October 22, 2007

Gerald Beuchelt ridicules my post on SOA security in his reply, Where is the problem? In particular, Gerald takes aim at my statement that SAML, and other SOA security standards, are immature, stating that SAML has been around since 2001.

I agree with Gerald that, if you measure maturity by time (as he does in his reply), then SAML could be considered “mature”.

On the other hand, I am measuring "maturity" by actual usage, and the proof of security solutions is in actual adoption, not simply years of standards activity and vendor marketing.

For example, here is a WS-Security related quote from Michael Meehan, SOA standards searched for maturity in 2005:

“You can find WS-Security in all SOA products, but almost no one’s using it,” said Burton Group Inc. vice president and research director Anne Thomas Manes. “It’s amazing how few people are using it.”

The same is true for SAML and other security standards for SOA. Yes, there has been a lot of activity for a number of years, and vendors include the products in their sales portfolios, but very few people actually use it to build secure applications.

I measure IT maturity by actual usage. For example, HTTP, SSL, SNMP and IPsec are "mature" in my opinion because they are used worldwide. SAML, and most of the other SOA-related security standards, are not.


BAM: The Cherry on Top of the CEP Pie?

October 22, 2007

If you read the posts on the net on CEP and BAM you might start to think that the main purpose of visualization in event processing is a BAM dashboard. This is quite a narrow view of both CEP and visualization; so kindly permit me to “debunk the marketing myths” that BAM is simply the cherry on top of the CEP pie.

Turning our attention to the functional reference architecture for event processing, notice that visualization (User Interface) is depicted both (1) outside of the event processing dotted line and (2) attached to a communications (messaging) infrastructure.

Event Processing Reference Architecture

Visualization and user interaction is required at all levels of the event processing inference model.

Folks use graphical tools to model event processing data and information flows when they are building and debugging sophisticated event processing applications. There are graphical tools for IT developers and tools for business users.

For example, visualization is required at the event transformation layer. Some companies use graphical XPath tools, for example, to help map, transform or normalize data from one format to another.
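Under the hood, what those graphical tools generate amounts to a mapping of target fields to source paths. Here is a minimal sketch of that transformation step using Python's standard library; the XML element names and target fields are hypothetical examples.

```python
# Sketch of the event transformation layer: normalize an XML order event
# into a flat record via XPath-style paths. Field and element names are
# hypothetical.

import xml.etree.ElementTree as ET

RAW = """
<order>
  <header><id>A-42</id><ts>2007-10-22T09:30:00</ts></header>
  <detail><symbol>IBM</symbol><qty>500</qty></detail>
</order>
"""

MAPPING = {                      # target field -> source path
    "order_id": "./header/id",
    "symbol": "./detail/symbol",
    "quantity": "./detail/qty",
}

def normalize(xml_text: str) -> dict:
    root = ET.fromstring(xml_text)
    return {field: root.findtext(path) for field, path in MAPPING.items()}

print(normalize(RAW))  # {'order_id': 'A-42', 'symbol': 'IBM', 'quantity': '500'}
```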

Moving up the event processing inference model to event (object) tracking and tracing, it is necessary to visualize the behavior of single event objects before trying to build more complex composite models. For example, if you are interested in extracting user behavior features from web clicks, it is very useful to be able to visualize patterns of normal and abnormal user behavior.
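As a minimal sketch of that feature-extraction step, assuming a click is just a (timestamp, page) pair, one could summarize a single user's session like this before plotting anything; the features chosen are hypothetical examples.

```python
# Illustrative feature extraction from a user's web clicks, so that
# "normal vs. abnormal" session behavior can later be visualized.

def session_features(clicks: list) -> dict:
    """clicks: ordered (timestamp_seconds, page) pairs for one session."""
    times = [t for t, _ in clicks]
    gaps = [b - a for a, b in zip(times, times[1:])]
    return {
        "click_count": len(clicks),
        "unique_pages": len({page for _, page in clicks}),
        "mean_gap_s": sum(gaps) / len(gaps) if gaps else 0.0,
    }

clicks = [(0, "/home"), (5, "/search"), (9, "/item/7"), (30, "/cart")]
print(session_features(clicks))
# {'click_count': 4, 'unique_pages': 4, 'mean_gap_s': 10.0}
```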

Let’s say you are confident with tracking, tracing and visualizing event objects. What’s next?

You need to be able to combine extracted features from individual event objects and create composite, or derived, events; and you need to be able to visualize these composite events. For example, if you are tracking two or more stock transactions and looking to optimize your cybertrading strategy, you need to be able to visualize the basic behavioral patterns between low-level objects in your model(s) to increase your confidence and optimize performance in the model(s).
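A minimal sketch of deriving one such composite event: watch two tracked price streams and emit a derived "spread" event when their relative divergence crosses a threshold. The symbols, prices, and threshold are all hypothetical.

```python
# Sketch of a composite (derived) event: when the relative spread between
# two tracked price streams exceeds a threshold, emit an event.

def spread_events(quotes_a, quotes_b, threshold=0.02):
    """quotes_*: aligned price lists; yields (index, spread) events."""
    for i, (a, b) in enumerate(zip(quotes_a, quotes_b)):
        spread = (a - b) / b
        if abs(spread) > threshold:
            yield i, round(spread, 4)

stock = [100.0, 101.0, 105.0]   # hypothetical tracked symbol
proxy = [100.0, 100.5, 101.0]   # hypothetical hedge/benchmark
print(list(spread_events(stock, proxy)))  # [(2, 0.0396)]
```

Visualizing the stream of such derived events, rather than the raw ticks, is exactly the "composite" layer of the inference model described above.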

When you have built composite event processing models and are beginning to detect real-time opportunities and threats in your business environment, you need to be able to optimize the impact of running these models. Visualization plays an important role here too.

In larger event processing applications, you need to be able to visualize the operational status of your event sources, event processing resources, agents, and other business processes. Visualization is critical to your resource optimization strategy.

When all of the above (and more implied in the picture) is optimized, you might actually have confidence in BAM dashboards showing high level business KPIs and metrics.

The important point to keep in mind is that visualization is important at all inference layers in the functional model of event processing. Business activity monitoring, or BAM, often depicted as a cherry on top of CEP applications, is only a part of the CEP visualization pie.


Crossing the Ocean to “Discover” BAM, BI, BPM, BRE, CEP, EDA, ESP, and SOA

October 20, 2007

There have been a number of posts recently about Complex Event Processing (CEP), Business Intelligence (BI) and Business Activity Monitoring (BAM). 

For example, James Taylor, in Complex Event Processing is Not about BI, responds to John Trigg's The Opportunity for Business Intelligence: Is it Evolution or Revolution?, which was in turn motivated by Larry Goldman's Customer Intelligence: Event-Processing Alphabet Soup, Curt Monash's The Era of Memory-Centric BI May Have Finally Started, and Phillip Howard's Netezza: a Black Swan.

James Taylor makes the point that decision support must be a part of CEP. John Trigg opines that folks should look at CEP and BI as parts of a larger collaborative solution, not as competing technologies in a buzzword turf war. Larry Goldman makes the point that, with all the buzz about CEP, BI, BPM, EDA, BRE, BAM and ESP, the technology is not revolutionary, but simply evolutionary.

As for me, I agree with all of these views.

CEP, BI and BAM are simply today's buzzwords for IT processes that can be implemented in numerous ways to accomplish very similar "things." What is "that"?

Well, "that" is simply to take raw "sensory data" and turn it into knowledge that supports actions beneficial to organizations; and "that" is precisely the reason I introduced the concept of a fully functional reference architecture into the CEP market in early 2006 (see also Intrusion Detection Systems & Multisensor Data Fusion, Communications of the ACM, April 2000, Volume 43, Issue 4, pp. 99-105).

In particular, note this post, What is Complex Event Processing? (Part 1), and the figure below.

Event Processing Reference Architecture

All of these market-driven buzzwords are square pegs that fit almost perfectly into the square holes of traditional event processing. There is nothing evolutionary or revolutionary about the technologies.

Remember when Europeans sailed large ships across the great Atlantic Ocean and "discovered" America? What about the native Americans who lived in harmony with the vast natural resources and "discovered" America long before the foreign ships sailed into their lands?

The same holds true, metaphorically, in information technology. It is encouraging to see folks developing an understanding about solving complex distributed computing problems as they develop, acquire and market software for their customers.

However, this is not new, evolutionary, nor revolutionary to IT professionals who have been doing this for many years.