What is Complex Event Processing? (Part 5)

May 14, 2007

(Originally Published by Tim Bass, TIBCO Software Inc., May 9, 2007)

In What is Complex Event Processing? (Part 4), we discussed event refinement, also referred to as object refinement or track and trace. Today, in part 5 of What is Complex Event Processing, we discuss the heart of CEP, situation refinement.

Event Processing Reference Architecture

Situation refinement is the functional component of event processing that describes refining multiple event objects in order to identify business situations and scenarios in real-time. Situation refinement analyzes multiple event objects and aggregated groups of events against existing detection templates, patterns, algorithms and historical data to provide an estimate of the current situation and to suggest, identify and predict future opportunities and threats.

Examples of situation refinement are:

– debugging a distributed computing application to determine cause-and-effect relationships between heterogeneous event sources;

– calculating a VWAP on a tracked equity or basket of equities and correlating these event objects with high-confidence news reports in real-time;

– correlating tracked user sessions in an on-line e-commerce application with credit card activities and geolocation information of the same user in real-time;

– associating containers and packages (with RFID, for example) with weather and traffic information to predict delays in shipments in real-time;

– correlating multiple log files from selected network devices or applications and searching for specific patterns or anomalous behavior in real-time;

– analyzing the projected tracks of multiple aircraft, vessels or trains in motion, looking for potential collisions before they happen;

– correlating patient information in multiple hospitals, looking for trends in viral epidemics and predicting future outbreak areas;

– correlating locations, crews, schedules, cargo, stations and other constraints in a transportation network to optimize network resources; or,

– correlating the customer information of multiple retail banking channels with real-time customer interaction personnel to enhance the user experience and maximize marketing effectiveness.
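As a sketch of the second example above (a tracked equity's VWAP correlated with high-confidence news), the following Python is purely illustrative; the event shapes, thresholds and the `refine_situation` helper are hypothetical assumptions, not part of any product discussed here:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PriceEvent:
    symbol: str
    vwap: float   # volume-weighted average price so far
    last: float   # last trade price

@dataclass
class NewsEvent:
    symbol: str
    confidence: float  # confidence score attached to the news report

def refine_situation(price: PriceEvent, news: NewsEvent,
                     discount: float = 0.02, min_conf: float = 0.9) -> Optional[str]:
    """Correlate a price event with a news event on the same symbol and
    flag a 'situation' when a high-confidence report coincides with a
    price trading well below VWAP."""
    if price.symbol != news.symbol:
        return None
    if news.confidence >= min_conf and price.last < price.vwap * (1 - discount):
        return "OPPORTUNITY:" + price.symbol
    return None
```

The point is not the thresholds but the shape of the computation: multiple event objects from different sources are combined into a single higher-level judgment about the current situation.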

It is interesting to note that situations are often referred to as complex events. The terminology (glossary) working group of the Event Processing Technical Society (EPTS) uses the following definitions:

Complex event: an event that is an abstraction or aggregation of other events called its members.

Composite event: Composite event types are aggregated event types that are created by combining other primitive or composite event types using a specific set of event constructors such as disjunction, conjunction, sequence, etc. Note: This definition is from the Active Database terminology.

Derived event (also synthesized event): an event that is generated as a result of applying an algorithmic function or process to one or more other events.

Relationships between events: Events are related by time, causality, aggregation, abstraction and other relationships. Time and causality impose partial orderings upon events.

This leads us to the current working EPTS definition of complex event processing:

Complex-event processing (CEP): Computing that performs operations on complex events, including reading, creating, transforming or abstracting them.
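The glossary definitions above can be made concrete with a small sketch. The `Event` and `ComplexEvent` classes below are illustrative assumptions, not EPTS artifacts; they show a complex event as an immutable abstraction over its member events:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)  # events are immutable objects, per the note above
class Event:
    kind: str

@dataclass(frozen=True)
class ComplexEvent(Event):
    members: Tuple[Event, ...] = ()  # the member events this abstraction aggregates

# Three primitive events abstracted into one complex event:
failures = tuple(Event("failed_login") for _ in range(3))
suspicious = ComplexEvent("possible_intrusion", members=failures)
```

A `ComplexEvent` is itself an `Event`, so abstractions can be aggregated further, which is exactly the layering the CEP definition above describes.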

In my next post, What Is Complex Event Processing, Part 6, we will discuss another important area in CEP, impact assessment – where detected business situations are compared, correlated, and/or analyzed in “what if” scenarios to determine and predict business consequences.

What is Complex Event Processing? (Part 4)

May 14, 2007

(Originally Published by Tim Bass, TIBCO Software Inc., April 30, 2007)

In What is Complex Event Processing? (Part 3), we discussed event preprocessing in event processing applications. Now, in Part 4, we discuss event refinement, also referred to as object refinement or track and trace.

Event Processing Reference Architecture

Event refinement is the functional aspect of event processing that describes refining a single event object, an iterative process of operating on event data to determine attributes of individual event objects and also to build and track their behavioural characteristics. Here, the term event object refers to a distinct object. A track is often constructed based on detections of an individual identified event object, but it can also be indirectly based on detecting actionable parameters of event objects. In addition, event refinement also includes the functionality of state estimation and prediction for individual event objects – the trace aspect.

Examples of event refinement (track and trace) are:

– tracking market transactions in equities and calculating a VWAP on each tracked equity;

– tracking user sessions in an on-line e-commerce application and ranking sessions for likelihood of fraudulent behavior;

– tracking an individual container or package (with RFID, for example) as it travels around the globe and looking for delays or other exceptional conditions;

– tracking a log file in a network device or application and searching for a specific pattern or anomalous behavior;

– tracking the path of a single aircraft, vessel or train in motion; or,

– tracking a patient in a hospital as they move through various stations and treatments.

Kindly note that in the examples above the event objects are a stream of single stock transactions, an on-line user, a package or container, a log file, an aircraft or a patient. We can all think of many different examples of objects in our businesses that are, or should be, tracked and traced in order to run the business efficiently and search for threats and opportunities to the business.

Event refinement, or track and trace, when applied to digital event information, is very similar in functionality to data stream processing, also called event stream processing (ESP). Event stream processing is a very important component of both event processing and complex event processing applications. Streams of events generally consist of event objects related and comparable by time; for example, market transactions of an individual equity are related by the time of execution. Entries in a single log file, the tracking data of an individual aircraft, and other sensor readings are also generally recorded with a time stamp.
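The first example above (tracking trades and computing a VWAP per equity) can be sketched as an incremental computation over such a time-ordered stream. This `VwapTracker` is a hypothetical illustration of VWAP = Σ(price × volume) / Σ(volume):

```python
from collections import defaultdict

class VwapTracker:
    """Track a running VWAP per equity as trade events arrive."""
    def __init__(self):
        self._pv = defaultdict(float)   # cumulative price * volume per symbol
        self._vol = defaultdict(float)  # cumulative volume per symbol

    def on_trade(self, symbol: str, price: float, volume: float) -> float:
        """Fold one trade event into the track and return the updated VWAP."""
        self._pv[symbol] += price * volume
        self._vol[symbol] += volume
        return self._pv[symbol] / self._vol[symbol]
```

Each equity is a separately tracked event object, and the running VWAP is the "trace" – a state estimate maintained per object as its events stream in.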

Additionally, events may not have known time stamps but are related by causality, ontology or taxonomy. Often the causality is hidden or unknown. Finding hidden causality when debugging distributed systems was the genesis of the work in complex event processing by Dr. David Luckham. Here, the relationships are more complex than tracking and tracing objects in a stream of comparable objects within a known time series.

In my next post, What Is Complex Event Processing, Part 5, we will get into the heart of CEP: analytics where various (multiple) objects are compared, aggregated, correlated and/or analyzed to detect various business situations and scenarios.

What is Complex Event Processing? (Part 3)

May 14, 2007

(Originally Published by Tim Bass, TIBCO Software Inc., April 27, 2007)

In an earlier blog entry, What is Complex Event Processing? (Part 1), we introduced a functional reference architecture for event processing. Now, we discuss another important component of distributed CEP architectures, event preprocessing.

Event preprocessing is a general functionality that describes normalizing data in preparation for upstream, “higher-level,” event processing. Event preprocessing, referred to as Level 0 Preprocessing in the JDL model (see figure below), encompasses data normalization, validation, prefiltering and basic feature extraction. When required, event preprocessing is often the first step in a distributed, heterogeneous complex event processing solution.

Event Processing Reference Architecture

As an illustrative example, visualize a high-performance network service that passively captures all inbound and outbound network traffic from a web server farm of 300 e-commerce web servers. We must first normalize the network capture data so that it can be further processed. How do you extract HTTP session information from an encrypted click stream in real time? What information do you forward as an event? Do you just send HTTP header information or other key attributes of the payload? Do you strip out the HTML image files, or do you replace them with the image metadata? These are examples of important questions that must be considered in a web-based event processing application.

In another example, we are building a network management related CEP application and will be correlating events using a stateful, high-speed rules engine. The event sources, for example, are SNMP traps and log file data from two network applications. How do we normalize (transform) the data for event processing? How much filtering is performed at the data source versus at the upstream event processing agent?
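As a rough sketch of the normalization step described above, the following maps two hypothetical event sources onto one common event shape. The field names (`source`, `severity`, `message`) and input formats are assumptions for illustration, not a standard:

```python
def normalize_snmp_trap(trap: dict) -> dict:
    """Map an already-decoded SNMP trap dict onto the common event shape."""
    severity = {1: "warning", 2: "critical"}.get(trap.get("specific_type"), "info")
    return {"source": trap["agent_addr"],
            "severity": severity,
            "message": trap.get("oid", "")}

def normalize_log_line(host: str, line: str) -> dict:
    """Map a 'LEVEL: message' log line onto the same shape."""
    level, _, msg = line.partition(": ")
    return {"source": host, "severity": level.lower(), "message": msg}
```

Once both sources emit the same shape, the upstream correlation engine can treat traps and log entries uniformly, which is the whole point of Level 0 preprocessing.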

Heterogeneous, distributed event processing applications normally require some type of event preprocessing for data normalization, validation and transformation. Some simple applications, for example self-contained processing of well-formatted, homogeneous streaming market data, require very little preprocessing. However, most classes of complex event processing problems require the correlation and analysis of events from different event sources. BTW, this is a major difference between true CEP classes of problems and event stream processing (ESP) classes of problems. I will discuss this in more detail in a later blog entry.

Often our customers at TIBCO use our BusinessWorks® product to prefilter, normalize and transform raw data into JMS or TIBCO Rendezvous® messages. What is important to remember is that raw data must be transformed (normalized), securely transmitted as an electronic message across the network and formatted in a manner that optimizes event processing throughput.

What is Complex Event Processing? (Part 2)

May 14, 2007

(Originally Published by Tim Bass, TIBCO Software Inc., April 23, 2007)

In a previous blog entry, What is Complex Event Processing? (Part 1), we introduced a few basic event processing concepts and a functional reference architecture for CEP based on the JDL model for multisensor data fusion. One of the most important concepts in our reference architecture is the notion of events, which is the topic of this blog entry.

What is an Event?

Similar to many topics in science and engineering, the term event has different meanings based on who is observing the event and the context of the observation. Let’s review a few of the different definitions from the points of view of various observers, keeping in mind that in CEP we are primarily interested in processing events related to business. First, we take a wider survey.

If you are a mathematician, you might view an event via the lens of probability theory, which states that an event is a set of outcomes (a subset of the sample space) to which a probability is assigned. So, for example, if we were processing many banking application log files, in real-time, looking for fraud, there exists some conditional probability at any moment that a fraud is being orchestrated against the bank. The event is the fraud (detected or undetected outcome); and based on a number of factors, the probability of a fraudulent event against the bank changes over time.
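The conditional-probability view above can be illustrated with a direct application of Bayes’ rule; the numbers below are invented purely for illustration:

```python
def posterior_fraud(prior: float, p_obs_given_fraud: float,
                    p_obs_given_legit: float) -> float:
    """Bayes' rule: P(fraud | obs) = P(obs | fraud) * P(fraud) / P(obs),
    where P(obs) is computed by total probability over both hypotheses."""
    p_obs = p_obs_given_fraud * prior + p_obs_given_legit * (1 - prior)
    return p_obs_given_fraud * prior / p_obs

# A pattern 50x more likely under fraud lifts a 0.1% prior to roughly 5%.
p = posterior_fraud(prior=0.001, p_obs_given_fraud=0.5, p_obs_given_legit=0.01)
```

Each new observation in the log stream updates the posterior, which is exactly the sense in which the probability of a fraudulent event "changes over time."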

On the other hand, if you are a particle physicist, an event is a single collision of two particles or a decay of a single particle! A collision, in particle physics, is any process which results in a deflection in the path of the original particles, or their annihilation. This view seems to imply that atomic and subatomic exceptions and state transitions are the foundation for events, which may be significant if you are a particle physicist. Assuming most of the readers of the blog are not particle physicists, you may be interested in the draft definition of an event from the Event Processing Technical Society (EPTS) CEP glossary working group, summarized below:

Event: Something notable that happens.


– a financial trade

– an airplane lands

– a sensor outputs a reading

– a change of state in a database or a finite state machine

– a key stroke

– a natural or historical occurrence such as an earthquake

– a social or historical happening, e.g., the abolition of slavery, the battle of Waterloo, the Russian revolution, and the Irish potato famine.

Event (also event object, event message, event tuple): An object that represents, encodes or records an event, generally for the purpose of computer processing. Notes: Events are processed by computer systems by processing their representations as event objects. Events are immutable objects. However, more than one event may record the same activity.


– a purchase order (records a purchase activity)

– an email confirmation of an airline reservation

– stock tick message that reports a stock trade

– a message that reports an RFID sensor reading

– a medical insurance claim document

Overloading: Event objects can contain data. The word “event” is overloaded so that it can be used as a synonym for event object. In discussing event processing, the word “event” is used to denote both the everyday meaning (anything significant that happens) and the computer science meaning (an event object or message). The context of each use indicates which meaning is intended.

As one can see, none of these definitions are completely satisfying! For example, if we look at financial market data, some might observe that it appears a bit pedestrian to say that each trade is an event. Why? Because the market data is the entire sample space and each trade is an element of the set of trades of that particular equity (for example) on a particular day. To call each trade an “event” may be unsatisfactory for some people.

On the other hand, when a business is processing market data using a VWAP algorithm, for some the event occurs, for example, when the price of a buy trade is lower than the VWAP. Conversely, if the price is higher than the VWAP the event would be an indication to sell.

This example tends to more closely align with the mathematician’s view of events: an outcome from the sampled set with an assigned probability. The fact that the event is “significant” is due to the context of the theory and application of the VWAP strategy – without VWAP there might be no event, in this context. Similar analogies can be illustrated for fraud detection, supply-chain management, scheduling and a host of other CEP-related business problems.
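A minimal sketch of the VWAP signal described above, reduced to its thresholding logic (real trading strategies are far more involved):

```python
def vwap_signal(trade_price: float, vwap: float) -> str:
    """Classify a trade price against the running VWAP, per the example above."""
    if trade_price < vwap:
        return "buy"   # price below VWAP: the 'buy' event in the text
    if trade_price > vwap:
        return "sell"  # price above VWAP: the 'sell' indication
    return "hold"
```

The raw trade is just data; the comparison against VWAP is the context that turns it into a significant event.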

Events are Context Dependent

For example, if you have thousands of packages with RFID tags traveling the globe, is the event when the RFID reader registers the RFID tag? Or is the event when an exception occurs, for example, a lost package? One view is that the RFID reader is simply recording data and the associated RFID data is the sampled set (not necessarily the event). The outcomes of interest, with an assignable probability based on the business context, are exceptions, which, in turn, become business events. On the other hand, another view might be that each RFID recording is an event, and CEP is detecting “situations” – in this use case, the situation we refer to as “lost package”.
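The “lost package” exception view can be sketched as a timeout over the last reading per tag. The 48-hour threshold and the field names are illustrative assumptions:

```python
from datetime import datetime, timedelta

def lost_package_events(last_seen: dict, now: datetime,
                        timeout: timedelta = timedelta(hours=48)) -> list:
    """Derive 'lost package' business events for every RFID tag whose most
    recent reading is older than the timeout."""
    return sorted(tag for tag, seen in last_seen.items() if now - seen > timeout)
```

Under this view the individual reader registrations are just the sampled set; the derived exception is the business event.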

If you are interested in other terms related to CEP, please visit the Draft Event Processing Glossary. Your comments on the glossary are both welcome and much appreciated!

In What is Complex Event Processing (Part 3), I will begin to discuss the functional components of event processing based on the functional reference architecture introduced in What is Complex Event Processing? (Part 1).

What is Complex Event Processing? (Part 1)

May 14, 2007

(Originally Published by Tim Bass, TIBCO Software Inc., April 23, 2007)

Complex event processing (CEP) is an emerging network technology that creates actionable, situational knowledge from distributed message-based systems, databases and applications in real time or near real time. CEP can provide an organization with the capability to define, manage and predict events, situations, exceptional conditions, opportunities and threats in complex, heterogeneous networks. Many have said that advancements in CEP will help advance the state-of-the-art in end-to-end visibility for operational situational awareness in many business scenarios. These scenarios range from network management to business optimization, resulting in enhanced situational knowledge, increased business agility, and the ability to more accurately (and rapidly) sense, detect and respond to business events and situations.

Possibly, one of the easiest ways to understand CEP is to examine the way we, and in particular our minds, interoperate within our world. To facilitate a common understanding, we represent the analogs between the mind and CEP in a table:

| Human Body | Complex Event Processing | Function |
| --- | --- | --- |
| Senses | Transactions, log files, edge processing, edge detection algorithms, sensors | Direct interaction with environment; provides information about environment |
| Nervous System | Enterprise service bus (ESB), information bus, digital nervous system | Transmits information between sensors and processors |
| Brain | Rules engines, neural networks, Bayesian networks, analytics, data and semantic rules | Processes sensory information, “makes sense” of environment, formulates situational context, relates current situation to historical information and past experiences, formulates responses and actions |

Table 1: Human Cognitive Functions and CEP Functionality


In a manner of speaking, CEP is a technology for extracting higher-level knowledge from situational information abstracted from the processing of business-sensory information. Business-sensory information is represented in CEP as event data, or event attributes, transmitted as messages over a digital nervous system, such as an electronic messaging infrastructure.

In order to effectively create a processing environment that can sustain CEP operations, the electronic messaging infrastructure should be capable of one-to-one, one-to-many, many-to-one, and many-to-many communications. In some CEP applications, a queuing system architecture may be desirable. In other CEP applications, a topic-based publish and subscribe architecture may be required. The architect’s choice of messaging infrastructure design patterns depends on a number of factors that will be discussed later in the blog.
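A minimal in-process sketch of the topic-based publish/subscribe pattern mentioned above; a real messaging infrastructure adds persistence, security and network transport, none of which is modeled here:

```python
from collections import defaultdict

class TopicBus:
    """A toy topic-based publish/subscribe bus (one-to-many delivery)."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler) -> None:
        # Register a callback to receive every event published on the topic.
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event) -> None:
        # Deliver the event to all current subscribers of the topic.
        for handler in self._subscribers[topic]:
            handler(event)

bus = TopicBus()
received = []
bus.subscribe("orders.new", received.append)
bus.publish("orders.new", {"order_id": 42})
```

Because publishers never address subscribers directly, new event consumers can be added without touching the event sources – the decoupling that makes pub/sub attractive for CEP.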

Deploying a messaging infrastructure is the heart of building an event-driven architecture (EDA). It follows that an EDA is a core requirement for most CEP applications. It is also safe to say that organizations that have funded and deployed a robust high-speed messaging infrastructure, such as TIBCO Rendezvous® or TIBCO Enterprise Messaging System® (EMS), will find building CEP applications easier than organizations that have not yet deployed an ESB.

When an organization has established an EDA and event-enabled its business-sensory information, it can consider deploying CEP functionality in the form of high-speed rules engines, neural networks, Bayesian networks, and other analytical models. With modern rules engines, organizations can take advantage of powerful declarative programming models to solve business problems, detect opportunities or threats, improve operational efficiency and more. If the business solution requires statistical measures such as likelihood, confidence and probability, events are processed with mathematical models such as Bayesian networks, neural networks or Dempster-Shafer methods, to name a few.
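To illustrate the declarative style of a rules engine, here is a toy forward-evaluating sketch; production engines (e.g., Rete-based) are far more sophisticated, and the rule below is a made-up example:

```python
class RuleEngine:
    """A toy rule engine: rules are declarative (condition, action) pairs
    evaluated against a dict of facts."""
    def __init__(self):
        self._rules = []

    def rule(self, condition, action) -> None:
        self._rules.append((condition, action))

    def run(self, facts: dict) -> dict:
        # Fire the action of every rule whose condition matches the facts.
        for condition, action in self._rules:
            if condition(facts):
                action(facts)
        return facts

engine = RuleEngine()
# Declare the "what", not the "how": flag high latency as an alert.
engine.rule(lambda f: f.get("latency_ms", 0) > 500,
            lambda f: f.update(alert=True))
```

The caller declares conditions and consequences; the engine decides when and whether each fires – the essence of the declarative programming model mentioned above.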

Event Processing Reference Architecture

Figure 1. The JDL Model for Multisensor Data Fusion Applied to CEP

Solving complex, distributed problems requires a functional reference architecture to help organizations understand and organize the system requirements. Figure 1 represents such an architecture. The Joint Directors of Laboratories (JDL) multisensor data fusion architecture was derived from working applications of blackboard architectures in AI. It has proven reliably applicable to detection theory, where patterns and signatures are discovered by abductive and inductive reasoning, and it has been the dominant functional data fusion model for decades.[1] Vivek Ranadivé, TIBCO’s founder and CEO, indirectly refers to this model when he discusses how real-time operational visibility, in the context of knowledge from historical data, is the foundation for Predictive Business®.[2]

In part 2 of What is Complex Event Processing, I will begin to describe each block of our functional reference architecture for complex event processing. [3] In addition, I will apply this reference architecture to a wide range of CEP classes of business problems and use cases. I will also address the confusion in terminology and the differences between CEP and ESP (event stream processing), here on the CEP blog, in the coming weeks and months.

  1. Hall, D. and Llinas, J. editors, Handbook of Multisensor Data Fusion, CRC Press, Boca Raton, Florida, 2001.
  2. Ranadivé, V., The Power to Predict, McGraw-Hill, NY, NY, 2006.
  3. Bass, T., Fraud Detection and Event Processing for Predictive Business, TIBCO Software Inc., Palo Alto, CA, 2006.