Friday, May 22, 2015

The transaction's changing dynamics

Once upon a time in a land far, far away (you know, the 90s), there lived the database admins.

The database admins, or DBAs as we will call them, lived a normalized life.  They lived in the land of DASD, speaking the language of SQL.  The cost of real estate in DASD was so high that the poor DBAs were afraid of ever building the same structure twice.

"Normalize!" they would say.  But they would never repeat it!  They worshiped Codd and the 1NF.  They were happy, but, as time went by, they became slow.

Eventually, the DBAs realized that DASD wasn't the only place their data could live.  Cheap RAM and distributed file systems appeared, the cost of real estate plummeted, and the top priority shifted from storage to search & retrieval.  This change was brought upon the DBAs by an invasion of "Users" - from the kingdom of "The Web".  Transactions were no longer entered by experts using green screens, tolerant of multi-second response times ... now "Retail Shoppers" (GASP!) were entering their own transactions - and they didn't tolerate multi-second response times ... in fact, they wanted almost instant gratification.

Alas, SQL, with its selects and joins, was no longer fast enough.  NoSQL databases arose, trading space on DASD for speed in transactions.
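
To make that trade concrete, here's a tiny sketch in plain Python standing in for a database (the record layouts and field names are invented for illustration, not taken from any particular system).  The normalized version stores the customer once and "joins" on every read; the denormalized version copies the customer into every order, spending disk to skip the extra lookup.

```python
# Illustrative only: the same order stored two ways.

# Normalized: customer data lives in one place, orders reference it by id.
# Reading an order requires a second lookup (the in-memory equivalent of a join).
customers = {101: {"name": "Ada", "city": "Boise"}}
orders_normalized = [{"order_id": 1, "customer_id": 101, "total": 42.00}]

def read_order_normalized(order):
    customer = customers[order["customer_id"]]   # the "join"
    return {**order, **customer}

# Denormalized: the customer's details are copied into every order.
# More disk, but one read returns everything - no join needed.
orders_denormalized = [
    {"order_id": 1, "customer_name": "Ada", "customer_city": "Boise", "total": 42.00}
]

print(read_order_normalized(orders_normalized[0]))
print(orders_denormalized[0])
```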

Seriously - as the cost of disk has plummeted, the need to spend time and money on normalizing transactions has become a lower and lower priority.  This trend will continue as long as disk prices continue to fall.  When a complete transaction takes 1 MB to store un-normalized and 750 KB when normalized, and disk costs about $0.01 per GB, it doesn't make economic sense to "break it up".
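
For the curious, here's the back-of-the-envelope arithmetic behind that claim, as a small Python sketch using the figures above (1 MB vs 750 KB per transaction, roughly $0.01 per GB of disk); the exact numbers are illustrative.

```python
# Back-of-the-envelope storage-cost math, using the figures quoted above.
UNNORMALIZED_KB = 1024        # ~1 MB to store the transaction un-normalized
NORMALIZED_KB = 750           # ~750 KB to store it normalized
DISK_COST_PER_GB = 0.01       # dollars per GB, the assumed disk price

savings_kb = UNNORMALIZED_KB - NORMALIZED_KB       # KB saved by normalizing
savings_gb = savings_kb / (1024 * 1024)            # KB -> GB
cost_saved_per_txn = savings_gb * DISK_COST_PER_GB

print(f"Disk saved per transaction: {savings_kb} KB")
print(f"Storage cost saved per transaction: ${cost_saved_per_txn:.8f}")
print(f"Across a billion transactions: ${cost_saved_per_txn * 1_000_000_000:,.2f}")
```

Fractions of a cent per transaction, and only a few thousand dollars even across a billion transactions - small change next to the engineering time spent designing and maintaining the joins.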

This trend will continue - as compute costs fall, the need to spend time optimizing transaction algorithms will fall as well, leading to a "generic" transaction.

Just my thoughts.

BJG

Thursday, February 28, 2008

Beyond Zachman

In times past, a computer system was written to a set of specifications laid out by the customers, who then waited anxiously for the launch. If the code was well written and the screens followed the business processes, the effort was considered a success and the celebrations began.

Oh for those simple days!

The advent of Web 2.0 has caused IT to assume a new set of roles. This new class of systems, Web 2.0's "Social Software", often launches without specific customers, use cases or end-user roles defined! To make a confusing situation even more confusing, these new systems are used to solve general problems, not specific ones (just as a spreadsheet program solves general problems, so it can be used for everything from accounting to engineering). This lack of a defined customer and specific issue takes IT out of its traditional (and comfortable) role as guardian and operator of the system (relying upon SDM, Zachman and other frameworks for system implementation and launch) and imposes a new role - that of the advertising and marketing arm for the system.

With Web 2.0, IT must assume new roles. These include product marketing (defining and implementing the traditional four "P"s of marketing - Product, Price, Place and Promotion), market definition (inventing prototype use cases to suggest how a product or service may be used), demonstration and sales (working with customers to help them learn how they may use the product), and best practice replication (showing how others are using it). Additionally, IT will need to adopt a more consumer-centric communications mode. Communications cannot be restricted to announcement letters any more, but should include advertising and publicity campaigns. We need to invent new channels for reaching the consumer and actively engage the customers. If Web 2.0 is to be adopted within an Enterprise, it must provide value and become something that everyone knows about and uses.