Bankers Voice Scepticism Over New Event Processing Technologies

This week I gave a presentation on complex event processing at Wealth Management Asia 2007, where I had a chance to field some tough questions from risk management experts working for some of the top banks in the region.

In particular, one of the meeting attendees voiced strong scepticism over emerging event processing technologies. The basis for his scepticism was, in his words, that the “65 systems” the bank had already deployed to detect fraud and money laundering (AML) simply did not work. He cited Mantas, in particular, as one of the expensive systems that did not meet the bank’s requirements.

My reply was that one advantage of emerging event processing platforms is the “white box” ability to add new rules, or other analytics, “on the fly”, without going back to the vendor for another expensive upgrade.

Our friend the banker also mentioned the huge problem of “garbage-in, garbage-out” where the data for real-time analytics is not “clean enough” to provide confidence in the processing results. 

I replied that this is a perennial problem with stand-alone, detection-oriented systems that do not integrate with each other, his “65 systems problem” being a case in point. Event processing solutions must be built on standards-based distributed communications, for example a high-speed messaging backbone or a distributed object caching architecture, so that enterprises may correlate the output of different detection platforms to increase confidence. Increasing confidence, in this case, means lowering false alarms while, at the same time, increasing detection sensitivity.
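
To make the correlation point concrete, here is a minimal sketch in Python of fusing alerts from multiple detection systems on the same account. The system names, weights and threshold are hypothetical, invented for illustration; the idea is simply that corroboration across independent detectors raises confidence, while a lone, low-weight alert is suppressed as a likely false alarm.

```python
from collections import defaultdict

# Hypothetical reliability weights for each detection system, e.g.
# derived from each system's historical true/false positive rates.
DETECTOR_WEIGHTS = {"fraud_sys_a": 0.9, "aml_sys_b": 0.7, "kyc_sys_c": 0.5}

ESCALATION_THRESHOLD = 1.2  # tuned against historical false-alarm data

def fuse_alerts(alerts):
    """Correlate (detector, account_id, score-in-[0,1]) alerts by account.

    A single weak alert stays below the threshold; corroborating alerts
    from independent detectors push the combined score above it.
    """
    combined = defaultdict(float)
    for detector, account_id, score in alerts:
        combined[account_id] += DETECTOR_WEIGHTS.get(detector, 0.3) * score
    return {acct: s for acct, s in combined.items() if s >= ESCALATION_THRESHOLD}

alerts = [
    ("fraud_sys_a", "ACCT-1001", 0.8),
    ("aml_sys_b",   "ACCT-1001", 0.9),
    ("kyc_sys_c",   "ACCT-2002", 0.6),  # lone, low-weight alert: suppressed
]
print(fuse_alerts(alerts))  # roughly {'ACCT-1001': 1.35}
```

The weights are where “confidence” lives: a detector with a poor false-alarm history contributes less, so sensitivity can be raised on the individual systems without flooding analysts with alerts.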

As I have learned over a 20-year career in IT consulting, the enemy of the right approach to solving a critical IT problem is the trail of previous failed solutions. In this case, a long history of expensive systems that do not work as promised is creating scepticism over the benefits of CEP.


11 Responses to Bankers Voice Scepticism Over New Event Processing Technologies

  1. peter lin says:

    From my own experience working on pre-trade and post-trade compliance systems, many existing products, systems and implementations simply don’t work. Many only work in batch processes and can’t handle real-time pre-trade compliance. I know many banks that only run restriction rule validation and don’t bother running diversification or concentration rules (sketched below). Many of these rules are governed by the 1940 Act, which regulates funds. In fact, many firms still use Excel to manually check regulatory compliance on a quarterly basis. I think ultimately the CTO and CIO have to know technology intimately, or have a team of developers they trust with deep knowledge and experience.

    In my short 12 years of programming at various jobs, I have found that most CTOs/CIOs are not qualified to make technical decisions. I have met a few excellent C-level executives who started out as hard-core developers, but they are rare.
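
To make Peter’s point concrete, here is a minimal Python sketch of the kind of real-time, pre-trade concentration check he describes. The 5% limit, portfolio layout and function name are illustrative assumptions, not the 1940 Act’s actual thresholds or any product’s API.

```python
# Hypothetical pre-trade concentration rule: reject any order that would
# push a single issuer above a fixed share of total portfolio value.
MAX_ISSUER_WEIGHT = 0.05  # illustrative limit, not legal guidance

def check_concentration(positions, issuer, order_value):
    """positions: {issuer: market value}. Returns (ok, resulting weight)."""
    total_after = sum(positions.values()) + order_value
    issuer_after = positions.get(issuer, 0.0) + order_value
    weight = issuer_after / total_after
    return weight <= MAX_ISSUER_WEIGHT, weight

positions = {"ACME": 40_000.0, "GLOBEX": 30_000.0, "INITECH": 930_000.0}
ok, weight = check_concentration(positions, "ACME", 25_000.0)
if not ok:
    # Pre-trade: block before execution, rather than detect after the fact.
    print(f"Order rejected: ACME would be {weight:.1%} of the portfolio")
```

The same check run nightly over executed trades is the batch, after-the-fact version Peter criticises; run synchronously in the order path, it becomes pre-trade compliance.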

  2. kurt says:

    From our recent consulting assignments for financial institutions, we find that almost every bank asks its consultants to find the holy grail of business rules. Providing some of those rules tends to end in a discussion about the rules themselves instead of actually getting the 80/20 of the job started.
    Could someone here share some effective, common rules (e.g. from Apama or Mantas) that have run effectively over trading data?

  3. Tim Bass says:

    Hi Kurt,

    I ran across this a few months ago when I was in the US.

    A US-based security company contacted me about CEP and wanted me to provide them some “generic” rules. I told them that the situation required more thought, based on the type of business, risk profile and behavioral profile.

    After a telecon, they seemed to want to extract information from me for free, just like you indicate in your reply.

    Thanks for visiting!

    Yours faithfully, Tim

  4. Paul Vincent says:

    Sounds like it was an interesting session, Tim. I wonder if the banker had done some root cause analysis of why the existing systems failed. Was there a latency issue in processing the relevant business events? Incorrect process? Organisational resistance to a cross-departmental solution? etc. As Peter says, many (even IT) people still think in terms of batch / procedural / stovepipe solutions. They are right to be sceptical, but wrong to have a closed mind.
    For regulatory compliance generic rules: groups like the OMG Regulatory Compliance group are looking to document existing compliance rules, which could use the new rule standards. Note that implementing compliance rules will simply detect whether you are “not compliant” – you really want to have strategies, policies and rules that *avoid* non-compliance. Detecting that you are not compliant after the fact is interesting, but not good business! Hence the lack of standard cross-organization compliance rulesets…

  5. Tim Bass says:

    Hi Paul,

    The banker mentioned that the main problem, from his perspective, was “dirty data”. He did not mention latency at all. Low-latency dirty data is not “better”.

    BTW, I don’t believe, for a minute, that OMG is going to create generic AML rules that are better than the experts’. The experts claim that rules do not work well (see my prior post on conversations at the cyberdefense initiative conference) and are looking for other (statistical) approaches.

    I wish it were as simple as writing (more or better) rules. Life would be much easier!

    Great to hear from you, Paul.

    Yours faithfully, Tim

  6. Anil datt says:

    Hope you had a nice session.
    I agree with your replies.
    I also believe semantic-based events can further reduce the garbage-in/garbage-out problem the banker mentioned.

  7. Maureen Fleming says:

    Hi Tim,
    Definitely sounds like an interesting session. Data incompatibility is almost always the crusher, whether CEP, SOA, B2B or plain old-fashioned ETL. The problem is so pervasive, it is surprising that progress is happening at all. This banker sounds like he is pretty clueless about how important it is to figure this out as a critical early part of a project, which probably partly explains why none of his fraud detection systems work.

    My best,
    Maureen

  8. Tim Bass says:

    Hi Maureen,

    As I recall, the banker’s concern was not about data incompatibility, but about data quality and reliability. He did not strike me as “clueless”; quite the opposite. He simply stated that the “65 fraud and anti-money laundering systems” in his bank “don’t work.” This is true from my operational experience as well. The issues are related to the accuracy, confidence and reliability of the events being processed, and to how meaningful context is extracted from noise, garbage and misleading data.

    Data incompatibility is a pretty easy problem to solve, relatively speaking. The main problem is related to turning data into knowledge that is accurate and actionable, with low false positives and high detection of critical situations.

    Yours faithfully, Tim

  9. Maureen Fleming says:

    I am sorry I misread your comments about the skeptical banker. My point was that, assuming you want a system that depends on data integration of some sort to perform, you have to spend time on the data issue first. And if you can’t figure it out, perhaps because it is impossible given all the issues you mention on top of data incompatibility, your system probably won’t work all that well.

    I also pointed out that data quality is often tied to data incompatibility. Even the market called “data cleansing” is a misnomer. Most of the data that pours through those systems has no problem with the actual data, just with the ability to relate it appropriately.

    There are other issues tied to setting up rules that take into account all of the different permutations of how you want one piece of data to relate to another (see the sketch below).

    To what I interpret as the banker’s point: is this any different when you are working on a CEP project?
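
A minimal sketch of the “relating data” problem Maureen describes: two systems hold perfectly clean records for the same customer, yet nothing in the data itself says they match. The record layouts, normalisation rules and similarity threshold here are hypothetical.

```python
from difflib import SequenceMatcher

# Two systems hold well-formed but hard-to-relate records: neither field
# is "dirty", yet nothing guarantees they describe the same customer.
fraud_record = {"name": "Somchai Jaidee",  "dob": "1968-03-14"}
aml_record   = {"name": "JAIDEE, SOMCHAI", "dob": "1968-03-14"}

def normalize(name):
    """Crude canonical form: lowercase, and reorder 'LAST, FIRST'."""
    name = name.strip().lower()
    if "," in name:
        last, first = (part.strip() for part in name.split(",", 1))
        name = f"{first} {last}"
    return name

def same_customer(a, b, threshold=0.85):
    """Match on exact date of birth plus fuzzy name similarity."""
    similarity = SequenceMatcher(
        None, normalize(a["name"]), normalize(b["name"])).ratio()
    return a["dob"] == b["dob"] and similarity >= threshold

print(same_customer(fraud_record, aml_record))  # True
```

Every normalisation rule and threshold in the sketch is one of the “permutations” Maureen mentions, which is why relating even clean data remains hard.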

  10. Tim Bass says:

    Hi Maureen,

    The issue is more related to trustworthiness than quality; at least that is how I understood his concerns, but perhaps the difference in view is semantics.

    The concern seemed to be: “Is the data actually the right data from the right source?”

    For example, consider processing situations based on news events (sketched below). What happens when the news event is wrong? Or, what happens when the event is not what it appears to be? Or, what happens when the source is less than accurate? Or, what happens when the source has a conflict of interest?

    Does this fall underneath your view of data quality?

    Yours sincerely, Tim
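
One way to model the trustworthiness concern in code is to discount each event’s reported confidence by a prior on its source, so that a report from a historically inaccurate or conflicted source carries less weight. A minimal sketch; the sources, trust priors and action threshold are invented for illustration.

```python
# Hypothetical source-trust priors, e.g. from periodic accuracy audits.
# "issuer_pr" is discounted for its conflict of interest.
SOURCE_TRUST = {"newswire_a": 0.95, "blog_b": 0.40, "issuer_pr": 0.60}

ACT_THRESHOLD = 0.7  # below this, hold the event for corroboration

def effective_confidence(event):
    """Discount an event's reported confidence by trust in its source;
    unknown sources get a conservative default prior."""
    trust = SOURCE_TRUST.get(event["source"], 0.25)
    return event["confidence"] * trust

events = [
    {"source": "newswire_a", "confidence": 0.9, "headline": "Issuer downgraded"},
    {"source": "blog_b",     "confidence": 0.9, "headline": "Merger rumour"},
]
for ev in events:
    conf = effective_confidence(ev)
    action = "act" if conf >= ACT_THRESHOLD else "hold for corroboration"
    print(f"{ev['headline']}: {conf:.2f} -> {action}")
```

The same event arriving from two differently trusted sources yields two different effective confidences, which is exactly the distinction between data quality and data trustworthiness drawn above.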

  11. […] about a few of these meetings and customer concerns.  For example, please read my blog entry about a banker who was very sceptical in a recent wealth management conference in Bangkok.  I see this reaction all the time, in […]
