Entries Tagged 'cep'
November 30th, 2010 — cep
A simple use-case:
1. An unhappy customer publishes a Tweet containing negative sentiment.
2. The vendor correlates the Tweeter with a frequent flyer membership.
3. Eight hours later, the Tweeter receives a feedback offer from the vendor.
Is this an automated marketing reaction, or is it just a coincidence?
If the former, it’s a simple example of Event Processing – a single event detected, classified (sentiment), correlated with standing data and a response dispatched (a toy sketch in code appears at the end of this post).
Did they need to operate in real time? Probably not… same day is good enough (but no longer than that!)
Did they need to use a CEP engine for this? Probably not (perhaps they did it in PHP).
Did they get the right result?
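For the curious, here is roughly what such a pipeline might look like, reduced to a few lines of Python. Everything here is invented for illustration: the keyword classifier, the FREQUENT_FLYERS lookup table and the send_offer stub stand in for a real sentiment service, a membership database and a campaign system.

    # Hypothetical single-event pipeline: detect, classify, correlate, respond.

    FREQUENT_FLYERS = {"@unhappy_flyer": "FF-1234567"}   # standing data

    def classify_sentiment(text):
        # Stand-in classifier; a real system would call an NLP model or service.
        return "negative" if "terrible" in text.lower() else "neutral"

    def send_offer(member_id):
        print(f"Feedback offer queued for member {member_id}")

    def handle_tweet(handle, text):
        if classify_sentiment(text) != "negative":       # 1. detect and classify
            return
        member_id = FREQUENT_FLYERS.get(handle)          # 2. correlate with standing data
        if member_id:
            send_offer(member_id)                        # 3. dispatch a response

    handle_tweet("@unhappy_flyer", "Terrible flight today.")

The point is how little machinery a single-event, single-correlation case actually needs, which is why “perhaps they did it in PHP” is entirely plausible.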
November 30th, 2010 — cep
An interesting post by Colin Clark lamenting the inability of CEP to live up to earlier growth expectations. The article is definitely worth reading in full, but if I can pull out a few cogent points, I believe Colin ascribes the lack of CEP growth to:
- CEP is mostly a marketing phrase.
- Vendors have focussed tightly on High Frequency Trading and neglected other areas of opportunity.
- Vendors have dissed developers by forcing them to learn new and arcane language(s).
- Vendors have let down business users by neglecting visualization requirements.
In broad terms I agree – although I’m not sure languages are an impediment, given the mainstream explosion of interest in new languages. I think the fundamental problems are two-fold:
CEP products haven’t yet crossed the chasm from toolbox to platform. They are still very technical and incomplete. Most CEP products concentrate on the “engine” and neglect two really important areas – visualization (as pointed out by Colin) and context. Event processing relies on understanding events within the broader business context, which requires no barriers between the event stream and other systems of record or operational data stores. This is an extremely challenging technical problem – how to marry real-time data streams with large volumes of data “at rest” (a toy sketch of this enrichment problem follows these two points).
The business value of CEP is not always obvious. Unless you’re involved in a high-stakes, low-latency arms race moving at ever more relativistic velocities, batch will usually work for you. Most organizations don’t yet operate in real time. Those outside of HFT that do, or plan to, operate in real time are doing some work with CEP (e.g. Telcos, Utilities and Logistics), but there the challenges come back to my first point – integrating existing/legacy network optimization applications with the event stream. In such situations, it’s the optimization technology that drives the implementation, not the event processing technology.
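To make the stream-versus-store point concrete, here is a deliberately naive Python sketch. The in-memory CUSTOMER_STORE, the field names and the churn rule are all invented for illustration; in a real deployment the lookup would hit an operational store holding orders of magnitude more data than the stream, and keeping that lookup fast enough is precisely the hard part.

    # Toy illustration: each in-flight event must be enriched from a
    # store of data "at rest" before any pattern can be evaluated.
    # All names below are invented for the example.

    CUSTOMER_STORE = {                  # stands in for an operational data store
        "c42": {"segment": "gold", "open_complaints": 2},
    }

    def enrich(event):
        # In practice this lookup is the hard part: it must keep pace
        # with the stream while the store is vastly larger.
        context = CUSTOMER_STORE.get(event["customer_id"], {})
        return {**event, **context}

    def process(stream):
        for event in stream:
            e = enrich(event)
            if e.get("segment") == "gold" and e.get("open_complaints", 0) > 1:
                print(f"Escalate: {e['customer_id']} at risk of churn")

    process([{"customer_id": "c42", "type": "complaint"}])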
So whither CEP?
Ultimately CEP has three prerequisites: the business need to operate in real time, the IT infrastructure to support this, and the ability to analyse events within the context of all relevant data assets. The CEP “product” comes at the end of a long line of dependencies.
July 17th, 2009 — cep
Imagine you are walking along a jungle trail and you see a distant orange-black furry shape (kind of cuddly) heading your way. You infer it’s coming closer because it gets larger and that growling noise gets louder. Those large yellow toothy things look sharp and…and…well, you’re gone!
On the other hand, imagine you’re walking down a jungle trail and you spot a tiger in the distance. Now you’ve got a chance to escape.
The difference between these two scenarios is the processing time taken to recognize a threat and then respond to it. I frequently use this analogy to illustrate the value of Complex Event Processing (CEP) and how CEP complements traditional Business Intelligence (BI).
The current state of the BI art – data warehouses, analytical tools and reporting – is akin to the first scenario. Here you identify a threat or opportunity by analyzing facts from the world around you. This is a useful activity and there is nothing inherently wrong with it. The problem arises when you rely on BI techniques to drive a quick response.
As the figure above illustrates, the problem with BI is the lead time required to derive first a result and then a reaction. Many BI systems use a combination of batch ETL and data-marts to migrate transactional data into a data warehouse. Moreover, most analytics tools are designed for periodic, post-hoc analysis by a small cadre of specialists. This means that the lead time for recognizing a threat or opportunity can be on the order of days, weeks or even months.
We can, however, short-circuit part of this BI cycle by taking advantage of CEP within the transactional data stream. We effectively decouple analysis from recognition and assign each function to the most efficient component within the solution.
The new CEP+BI lifecycle is shown in the next figure. BI techniques are still used to classify and understand the opportunities or threats, but we don’t rely on BI to drive a reaction. Instead, opportunities and threats are parameterized into a set of rules that CEP can apply to transactional data in real time (sketched in code below).
In other words, analytics tells you that tigers are dangerous; CEP allows you to spot the tiger before it eats you!
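As a minimal sketch of that division of labour, assume the analytics phase boils down to learning a threshold from historical data, and the CEP phase to applying that threshold to each event as it arrives. The names and the three-sigma rule below are illustrative assumptions, not any product’s API.

    # "BI" phase: learn offline what a tiger looks like;
    # "CEP" phase: apply the learned rule per event, in real time.

    from statistics import mean, stdev

    def derive_rule(history):
        mu, sigma = mean(history), stdev(history)
        return lambda value: value > mu + 3 * sigma   # parameterized rule

    def watch(stream, rule):
        for value in stream:
            if rule(value):
                print(f"Tiger! Anomalous value: {value}")

    history = [100, 102, 98, 101, 99, 103, 97]
    watch([101, 100, 250, 99], derive_rule(history))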
October 25th, 2008 — cep, woa
Gartner has nominated their top 10 strategic technologies for 2009. In priority order they are:
1. Virtualization
2. Business Intelligence
3. Cloud Computing
4. Green IT
5. Unified Communications
6. Social Software and Social Networking
7. Web Oriented Architecture
8. Enterprise Mashups
9. Specialized Systems
10. Servers – Beyond Blades
My primary interests in SOA and CEP are represented in items 7 and 2 respectively (with some applicability to 8).
Business Intelligence. Business Intelligence (BI), the top technology priority in Gartner’s 2008 CIO survey, can have a direct positive impact on a company’s business performance, dramatically improving its ability to accomplish its mission by making smarter decisions at every level of the business from corporate strategy to operational processes. BI is particularly strategic because it is directed toward business managers and knowledge workers who make up the pool of thinkers and decision makers that are tasked with running, growing and transforming the business. Tools that let these users make faster, better and more-informed decisions are particularly valuable in a difficult business environment.
(emphasis added). Complex Event Processing is a key enabling technology for real-time business intelligence. The ability to abstract “rules” out of data using analytics, and then have those rules executed in real time to match against “events” running around your enterprise bus, is what will let BI meet the business need for “faster” informed decisions. Traditional BI, where you feed data into a mammoth DWH and then analyse the results at some point later, suffers from high latencies and reduced information currency. CEP can reduce these latencies from weeks or months down to seconds.
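As a hedged sketch of that matching step, the snippet below uses an in-memory queue as a stand-in for the enterprise bus; the topic name and rule shape are my assumptions, not any particular CEP product’s API.

    # Rules abstracted from analytics are matched against bus events as they arrive.

    from queue import Queue

    bus = Queue()                       # stand-in for the enterprise bus
    RULES = [
        {"topic": "orders", "field": "amount", "above": 10000,
         "action": "flag for review"},
    ]

    def on_event(topic, event):
        for rule in RULES:
            if rule["topic"] == topic and event.get(rule["field"], 0) > rule["above"]:
                print(f"{rule['action']}: {event}")

    bus.put(("orders", {"id": 7, "amount": 12500}))
    while not bus.empty():
        on_event(*bus.get())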
Web-Oriented Architectures. The Internet is arguably the best example of an agile, interoperable and scalable service-oriented environment in existence. This level of flexibility is achieved because of key design principles inherent in the Internet/Web approach, as well as the emergence of Web-centric technologies and standards that promote these principles. The use of Web-centric models to build global-class solutions cannot address the full breadth of enterprise computing needs. However, Gartner expects that continued evolution of the Web-centric approach will enable its use in an ever-broadening set of enterprise solutions during the next five years.
Recognition that SOA is more than just “Web Services” and that at least one other architectural paradigm is available to SOA implementers.
Interesting that BPM has dropped from the list after featuring in the previous two years. And SOA itself has not been mentioned since the list for 2006. It looks like SOA has become more business as usual – part of the IT “furniture” – as I predicted some time ago.
October 20th, 2008 — cep
I will be manning the booth tomorrow afternoon at the TIBCO SOA Online Summit. The topic of discussion is Events and Realtime Business Intelligence, which is an active part of my portfolio these days. Two of the guys from my blogroll will be presenting – James Taylor from Smart (Enough) Systems and Paul Vincent from TIBCO. Please come along and say “hi”.