Entries from January 2009
January 23rd, 2009 — www
If you look closely – right now – as we speak – a new ecological niche is opening up in the web.
It is just over eight years since we first saw the read-write web and a little more than three years since we heard about web 2.0. In that time, everyone has been gradually building up more and more valuable assets in the web: emails, photos, blogs, collaborations, videos, social networks. For some of us, a large chunk of our lives now has an independent existence in the web.
But something about this has started to become problematic. Our social assets are splattered across dozens of different sites and platforms. Multiple social networking sites vie for our attention. The result is increasing fragmentation of information and its associated problems – duplication and inconsistency. The world-wide-web has rediscovered that old enterprise bogey-man – integration! (or the lack thereof).
Some recent examples include Jon Udell asking where SOA is when you really need it, and Loic Le Meur lamenting the fragmentation of his social map.
At the same time a raft of new applications is attempting to address these issues:
- Google OpenSocial aims for social network interoperability.
- OpenID has now achieved broad support (if not success) as a way of managing distributed identity and authentication.
- Ping.fm unifies message posting while FriendFeed aggregates the receiving side.
The key thing about these initiatives is that they all start at the edge of the integration problem. They attempt to support interoperability by unifying the interfaces to these web 2.0 platforms.
The new player that caught my attention recently represents the genesis of “web middleware”: Gnip, which bridges the “air gap” between the Producers and Consumers of the social web. And in a beautiful example of parallel evolution, Gnip makes use of web-native protocols such as XMPP and Atom to provide the functions that are familiar inside the enterprise as JMS and SOAP. Gnip provides connectivity, message delivery and mediation between different data formats. Pinch me if that doesn’t sound just a little like an ESB. But it lives in, and has evolved entirely from, the web! This is the IT equivalent of discovering the Thylacine in the new world as an evolutionary parallel to the Wolf in the old world.
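To make the ESB comparison concrete, here is a minimal sketch of the kind of mediation such web middleware performs: pull updates from a producer’s Atom feed and hand them to a consumer as JSON. The feed URL and field choices are hypothetical placeholders for illustration; this is not Gnip’s actual API.

```python
# Minimal mediation sketch: poll a producer's Atom feed and re-publish its
# entries as JSON for a consumer. The feed URL is a hypothetical placeholder;
# this shows the general Atom-in / JSON-out pattern, not Gnip's API.
import json
import urllib.request
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"
FEED_URL = "http://example.com/producer/updates.atom"  # hypothetical producer feed


def fetch_entries(url):
    """Fetch an Atom feed and return its entries as plain dictionaries."""
    with urllib.request.urlopen(url) as resp:
        root = ET.parse(resp).getroot()
    return [
        {
            "id": entry.findtext(ATOM + "id"),
            "title": entry.findtext(ATOM + "title"),
            "updated": entry.findtext(ATOM + "updated"),
        }
        for entry in root.findall(ATOM + "entry")
    ]


if __name__ == "__main__":
    # Mediate: Atom in from the producer, JSON out to the consumer.
    print(json.dumps(fetch_entries(FEED_URL), indent=2))
```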
The funny thing is that while some middleware vendors are trying to figure out how to colonise the cloud (e.g. here and here), the natives are already evolving into that niche.
January 12th, 2009 — architecture
James McGovern provides some excellent thought fodder for all enterprise architects as they sit on the beach these holidays (or otherwise) and contemplate the new year. I’d like to add another point to his list.
One very common issue I see across the industry is the gap between “business” and “IT”. The typical – almost ubiquitous – example is a cultural gap between the “users” and the “geeks”. In its extreme form this gap can break down into a dysfunctional relationship where productivity grinds to a halt. Too often the response of enterprise architects to this gap is to take a defensive position on the IT side of the fence (or rather, the moat).
Enterprise Architecture is supposed to be about engineering the people, processes and systems so they work together to achieve business outcomes. From this perspective, enterprise architects are in a unique position to bridge the gap between business and IT. In fact I would argue it is their core responsibility to act in this way. Above all, enterprise architects need to first understand the business requirements and then use their unique blend of technical and business knowledge to help IT deliver the required outcomes.
Yes, the real world and realpolitik will always work against you, but the first step is to start with the right attitude.
January 6th, 2009 — cloud-computing
Everyone agrees that 2008 was the year that Cloud Computing Hype took off, but here is the “proof” that the take-off happened sometime around October 27, 2008.
The two graphs show the Google Insights data for search terms related to “cloud computing”, “azure”, “ec2” and “google cloud”. The lower diagram shows the search popularity of each term normalised to a range of 0 to 100. The upper diagram shows the rate of change of search interest in these terms, relative to the overall “Computers and Electronics” category.
In absolute terms “ec2” had the lion’s share of search popularity, perhaps reflecting its more mature status. “Azure” didn’t appear until September, in the lead-up to its launch on October 27th. To represent Google I had to use a compound term such as “Google Cloud”, because “Google” by itself simply swamps the search results and would skew them to unrealistic levels, so the Google representation in these “cloud searches” is probably lower than reality. Using a different term such as “Google App Engine” doesn’t change the conclusions.
The interesting part is the growth in search interest. “Cloud Computing” shows very strong and steady growth throughout the year, peaking around October 27. The peak is in the rate at which search interest was growing, so although search interest will continue to grow in the future, it may never grow as fast as it did in October 2008. Azure may be a late player in the field, but it seems to have sparked a lot of general interest.
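For anyone who wants to reproduce the numbers, here is a rough sketch of the calculation, assuming a CSV export of weekly data with hypothetical “week” and “interest” columns (the charts above came from Google Insights itself).

```python
# Rough sketch of the analysis: normalise weekly search-interest values to a
# 0-100 scale and estimate their week-to-week rate of change.
# "cloud_terms.csv" and its column names are assumed for illustration, not a
# documented Google Insights export format.
import csv


def load_series(path):
    """Read (week, interest) rows into parallel lists."""
    weeks, values = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            weeks.append(row["week"])
            values.append(float(row["interest"]))
    return weeks, values


def normalise(values):
    """Scale a series so its peak value is 100."""
    peak = max(values)
    return [100.0 * v / peak for v in values]


def rate_of_change(values):
    """First difference: how much interest grew from one week to the next."""
    return [b - a for a, b in zip(values, values[1:])]


if __name__ == "__main__":
    weeks, raw = load_series("cloud_terms.csv")
    growth = rate_of_change(normalise(raw))
    # The week with the largest growth is the analogue of the late-October peak.
    print("fastest growth in week:", weeks[1:][growth.index(max(growth))])
```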
Try the analysis yourself at Google Insights for Search
January 4th, 2009 — astronomy, personal
Dave Winer has just reminded me that it’s been twenty-five years since I first used a Mac (thanks Dave).
At the time I was a graduate student in Astronomy at Mount Stromlo Observatory. Our job was to turn photons into paper. Billions of photons travelling from the edge of the universe and the beginning of time would land splat on our detectors and be processed into endless shelves of journals lining the library walls – Astrophysical Journal, Monthly Notices of the Royal Astronomical Society et al. One day a harbinger of change showed up in the computer room in the form of a small beige box, right next to the VAX.
Stromlo was and is one of the finest astronomical research facilities on the planet and we had state-of-the-art technology. Every (clear) night, telescopes would amplify the light from distant galaxies onto some of the world’s most powerful photon detectors. The data would be written to 10.5 inch magnetic tapes for later analysis.
We used computers to process all that data. Well, actually, we pretty much used one computer. The full cohort of Stromlo (about 30 staff and students) all simultaneously used one VAX 11/780. The VAX was commonly accessed via a number of VT100 terminals. Most staff had one of these on their desks. Students shared the communal terminals in the computer room and those scattered around common areas. The VT100s were text only. For image processing, you had to book time on one of the two video terminals. Line graphics (graphs, scatter plots, contour plots etc.) could also be viewed on one of the three Tektronix 4010 terminals in the computer room. The Tektronix was this really cool little green terminal that could plot line graphics like an etch-a-sketch on speed.
Once your data was analysed you had to prepare a photo-ready manuscript for publication in your selected journal. Typists were available to help turn hand-written notes into well laid-out typescript, and draftsmen were available for graphics and mathematical equations. The process was time-consuming because multiple iterations were often required – especially for mathematical equations, which were hieroglyphics to your average draftsman. And their bandwidth was limited, with senior staff generally getting priority. We students had to make do with other means.
After a while of this process I learned to use TeX for typesetting papers and equations (and yes, I did write my entire 300 page thesis in TeX). I’d write the TeX markup on the VT100 and then generate the copy for preview on the Tektronix. If I needed a hard copy, I would use the Versatec printer. This was an old chemical device that had similar resolution to the Tektronix and would spool out a roll of paper wet with some stinky carcinogens. More presentable copy could be obtained via the dot-matrix printer, but you had to remember your ear-guards and wait a significant time for the result.
So this was my state of the art in “word processing” in 1984. Possibly less productive than the manual approach, but I owned every step of the production process and was not beholden to typists or draftsmen – very important for a student on a deadline working nights.
Then one day this beige box appeared in the computer room (I don’t recall if this thing had a LaserWriter hooked up to it, but I think it must have). I had a poster-paper to prepare for a conference and procrastination meant I had a short deadline – definitely time to try something new.
The Mac didn’t have an internal hard disk (who knew one was needed?) so I had to scrabble around for a boot floppy. This was different to the 5 1/4 inch floppy disks I was used to, and was something that South Africans disarmingly call a “stiffy”. You needed one of these to boot the Mac, and if you actually wanted to do anything useful you had to have some applications on the floppy as well. I had one containing MacWrite and MacPaint.
MacWrite and MacPaint introduced me to the wonderful new world of WYSIWYG. In no time I had a nicely typeset paper with a cool “hand drawn” graphic, all printed onto A4 paper (with no carcinogens) and ready for the conference.
There was only one small hiccup. Occasionally MacWrite would pause and emit a “wrrr wrrr” sound which I was later to realise meant the program was saving the document to disk. On about page 6, the computer went into permanent wrrr-mode. The disk had filled up and it was thrashing to find space to save the document. After running around in a panic I eventually solved the problem by saving the document to a new floppy. But it definitely meant the capacity of this thing was limited.
And that was my first Mac experience.
This first Mac wasn’t really all that useful. I don’t think I went back to it again until the Mac SE came out some time later with an internal hard drive and better software for word processing and spreadsheets. But the WYSIWYG genie was out of the box. Not long after this, the typists and draftsmen would be gone – reassigned to other duties. A couple of years later, the WYSIWYG philosophy would extend to data analysis when the first Sun “pizza boxes” put a computer more powerful than the VAX on everyone’s desk.
Now everyone owns their word-processing requirements from start to finish. Beautifully typeset, camera-ready papers can be produced on your own desktop computer with minimal effort. That is something we are familiar with in every work environment today, but it wasn’t so 25 years ago.