Progressive Data Constraints

As a follow-up to my last post, Richard Veryard described the concept of Post Before Processing, whereby rules are applied once you have safely captured the initial information. This is a good way of managing unstructured or semi-structured data, and it reminded me of other cases where data is progressively constrained and/or enriched during processing.
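A minimal sketch of the idea in Python (the record shape and the rules are my own, purely illustrative): the submission is accepted and stored unconditionally, and validation happens as a separate, later step rather than as a gate on capture.

    # Hypothetical "post before processing" sketch: accept the raw
    # submission as-is, then validate it in a later step.
    from dataclasses import dataclass, field

    @dataclass
    class Submission:
        raw: dict                      # whatever was captured, stored verbatim
        status: str = "captured"       # captured -> validated / needs_review
        errors: list = field(default_factory=list)

    def capture(raw: dict) -> Submission:
        """Step 1: persist first, ask questions later."""
        return Submission(raw=raw)

    def validate(sub: Submission) -> Submission:
        """Step 2: apply rules only after the data is safely stored."""
        if not sub.raw.get("customer_id"):
            sub.errors.append("missing customer_id")
        sub.status = "validated" if not sub.errors else "needs_review"
        return sub

    sub = capture({"customer_id": None, "note": "free text from the jungle"})
    sub = validate(sub)
    print(sub.status, sub.errors)   # needs_review ['missing customer_id']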

Content management systems use this technique at the unstructured end of the spectrum: source and draft content is pulled into the system from the “jungle” and then progressively edited, enriched, reviewed etc. until the content is published. Programmers will be familiar with this in the way that source code is progressively authored, compiled, unit tested and integrated via source code control (plus a bunch of QA rules and processes).

Many computer-aided design applications also provide this ability to impose different rules through the lifecycle of a process. Many years ago I worked on a CAD application for outside plant management for Telcos which had a very interesting and powerful long transaction facility. Normal mode, representing the current state of the network, enforced an array of data integrity and business rules, such as which cables could be connected to each other and via what type of openable joints.

In design mode, different rules applied so that model items could be edited and temporarily placed into invalid states. This was necessary because of the connected nature of the model (a Telco network) and the fact that the design environment reflected the future state of the network, which might not correctly “interface” to the current network state. The general mode of operation was multiple design projects created by different network designers, who managed the planning and evolution of the network from its current state to some future state; multiple future states could potentially exist within the system at any point in time. Design projects followed a lifecycle from draft through proposed and into “as built”, and this progression was accompanied by various rules governing data visibility, completeness and consistency.
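To make that concrete, here is a small sketch (not the actual Telco application, just an illustration with invented rule names) of how the rule set applied to a network item can depend on the lifecycle state of the design project it belongs to:

    # Illustrative only: which rules apply to a network edit depends on the
    # lifecycle state of the design project (draft -> proposed -> as built).
    def rule_connected(item):
        return "not connected" if not item.get("connected_to") else None

    def rule_joint_type(item):
        return "invalid joint type" if item.get("joint") not in ("closure", "splice") else None

    # Progressively stricter rule sets per lifecycle state.
    RULES = {
        "draft":    [],                                  # anything goes while designing
        "proposed": [rule_connected],                    # must at least be connected
        "as_built": [rule_connected, rule_joint_type],   # full integrity rules
    }

    def check(item, project_state):
        return [msg for rule in RULES[project_state] if (msg := rule(item))]

    cable = {"connected_to": "joint-17", "joint": "duct"}
    print(check(cable, "draft"))      # []
    print(check(cable, "as_built"))   # ['invalid joint type']

The same item can sit happily in a draft project yet fail validation the moment the project tries to progress towards “as built”.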

This is a useful model for managing data which may be complex, distributed, or collaboratively obtained and managed: effectively, you build a process around a long transaction that manages the transition of data between states of completion or consistency.

Master Data Management is another topical example of this type of pattern. In this scenario, data is distributed across systems or organizations. Local changes trigger processes which manage the dissemination of those changes to bring the various parts of the data back into consistency. During this process, different business rules may be applied to manage the data in transition.
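A rough sketch of the shape of that process (hypothetical names, no particular MDM product in mind): a local change puts the master record into an “in transition” state, relaxed rules apply while the change is disseminated, and the record only returns to a consistent state once every downstream system has acknowledged the update.

    # Hypothetical sketch: a master record is "in transition" while a local
    # change is disseminated, and becomes "consistent" again only once every
    # downstream system has applied it.
    class MasterRecord:
        def __init__(self, key, value, systems):
            self.key, self.value = key, value
            self.pending = set(systems)        # systems still to be updated
            self.state = "consistent"

        def local_change(self, value, systems):
            self.value = value
            self.pending = set(systems)
            self.state = "in_transition"       # relaxed rules apply here

        def acknowledge(self, system):
            self.pending.discard(system)
            if not self.pending:
                self.state = "consistent"      # strict rules apply again

    rec = MasterRecord("cust-42", "ACME Ltd", ["crm", "billing"])
    rec.local_change("ACME Limited", ["crm", "billing"])
    rec.acknowledge("crm")
    rec.acknowledge("billing")
    print(rec.state)   # consistent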

I think these concepts can be more generally applicable to SOA design-time governance, for example in the collaborative design of enterprise schemas or service contracts.

1 comment so far

#1 Richard Veryard on 11.21.08 at 8:54 pm

Hi Saul

When I wrote my piece on Post-Before-Processing, I wasn’t thinking about how it might apply to the design process. My first reaction was, well that’s interesting but completely different. But when I read it again more slowly, I started to see a similar pattern between designing an engineering system (using CAD) and designing a response to a complex unstructured set of events (CEP-ish).

… to be continued on my blog …
