Models and Reductionism – Reducing Clouds Into Streams

Reducing complex problem sets to simple problem sets is an interesting, and sometimes valid, approach to complex event processing. Transformations can be useful, especially when well defined.

For example, CEP was envisioned as a new technology to debug relatively large distributed systems and to discover hidden causal relationships in a seemingly disconnected event space. This kind of “discovery” requires, for example, backwards chaining with uncertainty. Most of the current so-called “CEP software” on the market today (including Marc Adler’s SQL-based examples) does not perform backwards chaining with uncertainty. The same is true of other so-called CEP products, such as most forward chaining RETE engines – for example, see this post.
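To make the idea concrete, here is a minimal sketch of what backwards chaining with uncertainty might look like, using MYCIN-style certainty factors. The rules, event names, and certainty values are entirely hypothetical illustrations, not drawn from any product mentioned in this post.

```python
# Minimal sketch of backwards chaining with uncertainty, using
# MYCIN-style certainty factors. All rules, events, and CF values
# below are hypothetical illustrations.

# Each rule: (consequent, list of antecedents, rule certainty factor)
RULES = [
    ("network_attack", ["port_scan", "failed_logins"], 0.8),
    ("failed_logins", ["auth_log_spike"], 0.9),
]

# Observed events with their own certainty (0.0 - 1.0)
FACTS = {"port_scan": 0.7, "auth_log_spike": 0.95}

def prove(goal):
    """Backwards-chain from a goal, combining certainty factors."""
    if goal in FACTS:                       # base case: observed event
        return FACTS[goal]
    best = 0.0
    for consequent, antecedents, cf in RULES:
        if consequent == goal:
            # CF of a conjunction is the minimum of its antecedents
            support = min(prove(a) for a in antecedents)
            best = max(best, support * cf)  # attenuate by rule CF
    return best

print(round(prove("network_attack"), 3))
```

The point of the sketch is that the engine starts from a hypothesis (“network_attack”) and works backwards to the evidence, carrying uncertainty along the way – the opposite direction from a forward chaining RETE engine matching facts against rules.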

Marc Adler says he is “hunting for advice from people who might have implemented event clouds in Coral8, Streambase, and Aleri, all three of which are based on SQL.”

Current streaming SQL engines cannot model true event clouds without reducing the cloud to causal-ordered sets of linear streaming data. These software tools are stream processing engines that process events in a time window of continuous streaming data. These products are not, in reality, CEP engines – calling them “CEP engines” is marketing-speak, not technology-speak!
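For readers unfamiliar with the time-window model, here is a minimal sketch of what these stream processing engines actually do: aggregate events inside a sliding time window over a single, totally ordered stream. The event fields and window size are illustrative assumptions, not any vendor’s API.

```python
# Sketch of sliding time-window stream processing: the core operation
# of a streaming engine. Window size and event payloads are made up.
from collections import deque

class TimeWindow:
    """Count events seen in the last `span` seconds of stream time."""
    def __init__(self, span):
        self.span = span
        self.events = deque()   # (timestamp, payload), in arrival order

    def push(self, ts, payload):
        self.events.append((ts, payload))
        # Evict events that have fallen out of the window
        while self.events and self.events[0][0] <= ts - self.span:
            self.events.popleft()
        return len(self.events)  # current window count

w = TimeWindow(span=10)
for ts in [1, 3, 5, 12, 14]:
    count = w.push(ts, "tick")
print(count)
```

Note what is assumed here: events arrive already totally ordered by timestamp. That assumption is exactly the reduction this post is about – a window over one linear stream has no notion of partial causal order.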

Reducing complex models to simple ones is a valid technique for problem solving.  Likewise, eliminating uncertainty and assuming causality is a way to reduce complexity. 

CEP was envisioned to discover causal relationships in complex, uncertain, “cloudy” data, and the current state-of-the-art software from the streaming SQL vendors does not have this capability, unless you reduce all event models to ordered sets of streaming data (reduce POSETs to TOSETs).
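The POSET-to-TOSET reduction can be sketched directly: flattening a partially ordered set of events into one totally ordered stream is a linear extension, i.e. a topological sort. The event names and causal constraints below are invented for illustration.

```python
# Sketch of reducing a POSET to a TOSET: a linear extension of the
# partial causal order, via Kahn-style topological sorting.
# Event names and constraints are hypothetical.

# Causal constraints: (a, b) means event a must precede event b.
causes = [("disk_error", "service_crash"),
          ("service_crash", "alert"),
          ("config_push", "service_crash")]

events = {"disk_error", "service_crash", "alert", "config_push"}

def linearize(events, causes):
    """Produce one of possibly many valid total orders."""
    preds = {e: set() for e in events}
    for a, b in causes:
        preds[b].add(a)
    order = []
    remaining = set(events)
    while remaining:
        # Pick (alphabetically, for determinism) any event whose
        # predecessors are all placed -- the POSET does not say which.
        ready = sorted(e for e in remaining if not (preds[e] & remaining))
        e = ready[0]
        order.append(e)
        remaining.remove(e)
    return order

print(linearize(events, causes))
```

The arbitrary choice among “ready” events is the crux: many total orders are consistent with one partial order, so the stream you feed a streaming SQL engine has already discarded information the cloud contained.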

Reductionism can be a valid technique, of course. We can eliminate uncertainty, force strict determinism, demand a priori system rules, and perform all sorts of tricks to permit us to reduce complex problems to simple ones.

However, this also results in reducing CEP (complex event processing) to SEP (simple event processing).



5 Responses to Models and Reductionism – Reducing Clouds Into Streams

  1. Peter Lin says:

    As far as I know, here are the engines that provide backward chaining functionality.

    ART – created by Inference Corp. Paul Haley worked on it
    OPSJ – engine written by Dr. Charles Forgy
    JESS – written by Ernest Friedman-Hill

    In terms of uncertainty, there are a lot of different ways to handle it in an expert system. One way is with fuzzy logic, which is independent of forward/backward chaining. I know Harold Boley of RuleML has done a lot of work with deductive and inductive logic, and there are lots of different ways of implementing it.

    Paul’s post makes a great point about using backward chaining for deductive reasoning. I think more basic research is needed to figure out practical strategies for handling uncertainty. Many of the techniques for handling uncertainty with machine learning have a very steep learning curve. Most developers aren’t going to be able to be productive without spending several years studying the field.


  2. Greg Reemler says:

    Hi Peter,

    Thanks for your comments!

    This is one of “the risks” of the emerging CEP market: calling forward chaining (without uncertainty, and with strict determinism) linear stream processing “complex event processing.” In fact, what is being marketed and promoted as CEP is not actually CEP (per David Luckham’s original paper and book on CEP), and this constant over-hyping of the CEP market does not serve the end users, who have very challenging CEP classes of detection-oriented problems to solve.

    Best Regards,


  3. peter lin says:

    I forgot to include Haley Authority, which also supports backward chaining. Paul Haley also wrote that engine.

  4. Greg Reemler says:

    Hi Peter,

    As you know far better than I, FYI for folks here:

    Haley was acquired by RuleBurst, who then rebranded as Haley.

    FWIW, RuleBurst also acquired FraudSight, a company with neural network analytics.


    FraudSight has been responsible for detecting around $100 million worth of fraud over the last two years by applying data analytics and neural networking to financial crimes including online banking fraud and money laundering. FraudSight’s customers include three of Australia’s largest banks and a significant number of financial institutions.

  5. peter lin says:

    yeah, Ruleburst is based in Australia. They bought Haley last year and Paul left after the acquisition. Neural net stuff is fascinating, but using it properly and effectively takes quite a bit of work. It’s not something a developer can pick up in a few days or even a few weeks. Takes a lot of time to get a solid grip on neural nets.

