Entries from November 2010
November 30th, 2010 — cep
A simple use-case:
1. Unhappy customer publishes Tweet containing negative sentiment.
2. Vendor correlates the Tweeter with a frequent flyer membership.
3. Eight hours later, Tweeter receives feedback offer from vendor.
Is this automated marketing reaction, or is it just a coincidence?
If the former, it’s a simple example of Event Processing – a single event detected, classified (sentiment), correlated with standing data and a response dispatched.
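As a sketch only – using a crude keyword classifier and made-up names (`classify_sentiment`, `frequent_flyers`, `send_offer` are all illustrative, not any vendor’s API) – the detect/classify/correlate/dispatch flow might look like:

```python
# Standing data: a stand-in for the frequent flyer membership database.
frequent_flyers = {"@unhappy_traveller": "FF-1234"}

NEGATIVE_WORDS = {"awful", "delayed", "lost", "worst"}
offers = []  # records dispatched responses

def classify_sentiment(text):
    """Crude keyword match standing in for real sentiment analysis."""
    words = set(text.lower().split())
    return "negative" if words & NEGATIVE_WORDS else "neutral"

def send_offer(member_id):
    # In reality: queue a feedback offer with the vendor's CRM.
    offers.append(member_id)

def on_tweet(handle, text):
    # 1. Detect and classify the single event.
    if classify_sentiment(text) != "negative":
        return
    # 2. Correlate the Tweeter with standing data.
    member_id = frequent_flyers.get(handle)
    if member_id is None:
        return
    # 3. Dispatch the response.
    send_offer(member_id)

on_tweet("@unhappy_traveller", "Worst flight ever, bags lost again")
```

Nothing here needs a CEP engine – which is rather the point of the question below.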
Did they need to operate in real time? Probably not – same day is good enough (but no longer than that!)
Did they need to use a CEP engine for this? Probably not (perhaps they did it in PHP).
Did they get the right result?
November 30th, 2010 — cep
An interesting post by Colin Clark laments the inability of CEP to live up to earlier growth expectations. The article is definitely worth reading in full, but to pull out a few cogent points, I believe Colin ascribes CEP’s lack of growth to:
- CEP is mostly a marketing phrase.
- Vendors have focussed tightly on High Frequency Trading and neglected other areas of opportunity.
- Vendors have diss’ed developers by forcing them to learn new and arcane language(s).
- Vendors have neglected business users by neglecting visualization requirements.
In broad terms I agree – although I’m not sure languages are an impediment, given the explosion of mainstream interest in new languages. I think the fundamental problems are twofold:
First, CEP products haven’t yet crossed the chasm from toolbox to platform. They are still very technical and incomplete. Most CEP products concentrate on the “engine” and neglect two really important areas – visualization (as pointed out by Colin) and context. Event processing relies on understanding events within the broader business context, which requires no barriers between the event stream and other systems of record or operational data stores. This is an extremely challenging technical problem – how to marry real-time data streams with large volumes of data “at rest”.
Second, the business value of CEP is not always obvious. Unless you’re involved in a high-stakes, low-latency arms race moving at ever more relativistic velocities, batch will usually work for you. Most organizations don’t yet operate in real time. Those outside of HFT that do, or plan to, operate in real time are doing some work with CEP (e.g. Telcos, Utilities and Logistics), but there the challenges are around my first point – integrating existing/legacy network optimization applications with the event stream. In such situations, it’s the optimization technology that drives the implementation, not the event processing technology.
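The first problem – marrying the event stream with data “at rest” – is essentially a join between live events and standing reference data. A minimal sketch, with all names and record structures assumed for illustration:

```python
# Data "at rest": a stand-in for an operational data store or system of record.
customers = {
    "c1": {"name": "Acme", "tier": "gold"},
    "c2": {"name": "Globex", "tier": "bronze"},
}

def enrich(event):
    """Join a streaming event with standing reference data."""
    ref = customers.get(event["customer_id"], {})
    return {**event, **ref}

# A toy "stream" of incoming events.
events = [
    {"customer_id": "c1", "amount": 900},
    {"customer_id": "c2", "amount": 50},
]

# A rule can now use context the raw stream lacked (customer tier):
alerts = [e for e in (enrich(ev) for ev in events)
          if e.get("tier") == "gold" and e["amount"] > 500]
```

At real stream rates and real data volumes this lookup-per-event is exactly the hard part: the reference data is too big to hold in memory and too slow to query synchronously.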
So whither CEP?
Ultimately CEP has three prerequisites: the business need to operate in real time, the IT infrastructure to support this, and the ability to analyse events within the context of all relevant data assets. The CEP “product” comes at the end of a long line of dependencies.
November 20th, 2010 — soa
In systems architecture there are rarely any right answers – mostly just trade-offs between one solution and another. In such cases it helps to bear in mind some fundamental principles as guidelines. One principle I often use is cost vs. benefit. Another is to minimize coupling between systems. Coupling is pervasive and leads to a kind of inertia in enterprise systems. Newton observed that inertia resists change, and if there is one thing that enterprises struggle most with, it’s change.
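A minimal sketch of what minimizing coupling can mean in practice – the order system below publishes an event rather than calling billing or shipping directly, so it carries no knowledge of its consumers. All names here are illustrative:

```python
# A toy in-process publish/subscribe bus.
subscribers = []

def subscribe(handler):
    subscribers.append(handler)

def publish(event):
    for handler in subscribers:
        handler(event)

# Billing and Shipping register interest; Orders never references them.
invoices, shipments = [], []
subscribe(lambda e: invoices.append(e["order_id"]))
subscribe(lambda e: shipments.append(e["order_id"]))

def place_order(order_id):
    # Orders publishes a fact; adding a third consumer later
    # requires no change to this function.
    publish({"type": "order_placed", "order_id": order_id})

place_order("o-42")
```

The direct-call alternative – `place_order` invoking billing and shipping by name – works just as well today, but each added consumer increases the inertia the paragraph above describes.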