Entries from July 2009

Health Memes Again

One of my stints in the US was in the early nineties when I lived in Baltimore MD, at that time the shootingest city in the Union. My daily (bus) commute from South Baltimore across North Avenue to Johns Hopkins University was an eye-opening tour across the strata of American society. First Lady Hillary Clinton was pushing her pet project for universal healthcare in the US, which actually gained quite a bit of traction and certainly got the whole country talking. NPR ran endless reports comparing US health standards with European ones. But then some Republican stood up and said “you can’t have universal healthcare, that’s… Socialism!”. The public response was unequivocal:

“Oh yeah, we forgot. Sorry…”

Twenty years later, the USSA seems to be waking from its long healthcare slumber and taking some action – perhaps emboldened by having recently nationalised all the banks. This time Information Technology has a strong role to play, as the discussion encompasses electronic medical records (EMR) and the efficiencies that IT can bring to healthcare delivery.

Clearly this is a lofty and worthy goal for IT, but there are also some really interesting technical aspects to the application of IT to EMR. There are the hard-nosed “enterprise” requirements for scalability, reliability and security across a complex and distributed system-of-systems. Add to this our experience with “Social Software” and the role it can play in helping different parties – physicians and patients – collaborate on maintaining accurate medical records. Knowledge Management, the Semantic Web and Decision Systems also have a contribution to make, helping to automate decisions and discovery in a highly technical, specialised and ever-changing field. EMR has it all!

So it’s not surprising to see a lot of discussion on this topic in the blogosphere. A few interesting references recently:

First, welcome back Adam Bosworth, with a series of great posts on EHR to launch his latest venture. I have missed Adam’s writing since 2005, when he disappeared into the Googleplex. Adam bolts out of the gate with ten posts in one day (I suspect he just discovered his network cable had been unplugged for the last three weeks).

Jon Udell presents an interesting podcast with Peter O’Toole discussing Electronic Medical Records and (among other things) the application of expert systems to aid the entry of specialised data.

Finally (but not least and, I think, not last), Robert Cringely starts a series on the application of IT and complex systems theory to medical records.

Certainly an interesting area to watch, with ramifications not only in the US but also in Australia and across much of the western world, where healthcare reform and efficiency are firmly on the agenda.

CEP and Reflex

Imagine you are walking along a jungle trail and you see a distant orange-black furry shape (kind of cuddly) heading your way. You infer it’s coming closer because it gets larger and that growling noise gets louder. Those large yellow toothy things look sharp and…and…well you’re gone!

On the other hand, imagine you’re walking down a jungle trail and you spot a tiger in the distance. Now you’ve got a chance to escape.

The difference between these two scenarios is the processing time taken to recognize a threat and then respond to it. I frequently use this analogy to illustrate the value of Complex Event Processing (CEP) and how CEP complements traditional Business Intelligence (BI).

The current state of the art in BI – data warehouses, analytical tools and reporting – is akin to the first scenario. Here you identify a threat or opportunity by analyzing facts from the world around you. This is a useful activity and there is nothing inherently wrong with it. The problem arises when you rely on BI techniques to drive a quick response.

Stimulus-Response Cycle for Traditional Business Intelligence

As the figure above illustrates, the problem with BI is the lead time required to derive first a result and then a reaction. Many BI systems use a combination of batch ETL and data marts to migrate transactional data into a data warehouse. On top of that, most analytics tools are designed for periodic, post-hoc analysis by a small cadre of specialists. This means that the lead time for recognizing a threat or opportunity can be on the order of days, weeks or even months.

We can, however, short-circuit part of this BI cycle by taking advantage of CEP within the transactional data stream. We effectively decouple analysis from recognition and assign each function to the most efficient component within the solution.

Stimulus-Response Cycle for CEP and Business Intelligence

The new CEP+BI lifecycle is shown in the figure above. BI techniques are still used to classify and understand the opportunities or threats, but we don’t rely on BI to drive a reaction. Instead, opportunities and threats are parameterized into a set of rules that CEP can apply to transactional data in real time.

In other words, analytics tells you that tigers are dangerous; CEP allows you to spot the tiger before it eats you!
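
To make that concrete, here is a minimal sketch in plain Python – hypothetical names and a made-up fraud rule, not any particular CEP engine – of a BI-derived pattern parameterized into a rule and evaluated inline against the transaction stream:

    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Transaction:
        """A single event in the live transactional stream."""
        account: str
        amount: float
        country: str

    @dataclass
    class Rule:
        """A threat/opportunity pattern, parameterized offline by the BI layer."""
        name: str
        predicate: Callable[[Transaction], bool]

    class CepListener:
        """Applies the registered rules to each event as it flows past,
        so recognition happens in real time while the heavier analysis
        that produced the rules stays in the BI cycle."""

        def __init__(self, rules: List[Rule],
                     on_match: Callable[[Rule, Transaction], None]):
            self.rules = rules
            self.on_match = on_match

        def handle(self, txn: Transaction) -> None:
            for rule in self.rules:
                if rule.predicate(txn):
                    self.on_match(rule, txn)  # react immediately, no warehouse round-trip

    # Hypothetical rule derived from earlier analysis: large overseas withdrawals
    # on this class of account have historically signalled fraud ("tigers are dangerous").
    tiger_rule = Rule(
        name="large-overseas-withdrawal",
        predicate=lambda t: t.amount > 10_000 and t.country != "AU",
    )

    listener = CepListener([tiger_rule],
                           on_match=lambda r, t: print(f"ALERT {r.name}: {t}"))

    # Events arriving on the stream: the second one fires the alert straight away.
    for txn in [Transaction("42-007", 250.0, "AU"),
                Transaction("42-007", 25_000.0, "KY")]:
        listener.handle(txn)

The expensive analysis that discovered the pattern stays in the offline BI cycle; recognition is reduced to a cheap predicate evaluated on each event as it arrives, so the reaction is immediate.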