Paul Burgess

Technology Fundamentals - the three platforms and the data mess

As I think back to the beginnings of my technology life, it was defined by companies and technologies that no longer exist. If you said 'water-chilled mainframe', batch processing (you waited for your time slice) or 'green bar' printouts - that was the latest of the day. As technology continued to evolve, we moved to air-cooled multiprocessing, real-time processing and client-server WYSIWYG GUIs (driven by the all-important Y2K remediation). The last 20+ years have seen a continued drop in compute, storage and network costs, with amazing performance gains, bringing us thin-client/browser-based computing everywhere.

With these fundamental changes in price/performance characteristics, we have found more ways to consume the cheaper compute, fill the ever-growing supply of storage and use that network to stream everything.


Consider that in 1964 Arthur C. Clarke predicted this exact change and evolution with amazing clarity.

What he did not predict was DATA: the integrations, the myriad of sources, the machines creating data, the desire to analyze and predict from it all - and the security, legal and criminal threats that come with all this newly available computing.


Like kids in a candy store, the more money you give them, the more they will buy and eat. With all this cheap compute, easy access to the internet and networking, and storage measured in terabytes, corporations and governments are having a diabetic shock.

Whether a company is 100 years old or 10, it gets into data bloat. WHY? Customers come and customers go, contracts run out, products age out - and even where information systems may be 'modern', 'tired data' grows. The newest Salesforce system holding 50 customer records that all represent the same client organization is a bigger problem than a 3270 mainframe green screen. When I recently rented a car with Enterprise, the agent had a big 27-inch monitor showing a single mainframe green screen to find my record and check out my car - with the contract printed on a dot matrix printer.
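
To make that 50-records-for-one-client example concrete, here is a minimal sketch of spotting that kind of duplicate-account bloat. The CSV layout, the field names (org_name, record_id) and the file name are all hypothetical, purely for illustration - not any particular CRM's export format.

```python
# Minimal sketch of spotting 'tired data': multiple customer records that
# all point at the same organization. Field names, the CSV layout and the
# file name are hypothetical, purely for illustration.
import csv
from collections import defaultdict

def find_duplicate_accounts(path):
    """Group customer rows by a normalized organization name."""
    groups = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Normalize aggressively: lowercase, keep letters/digits only,
            # so 'Acme Corp.' and 'ACME Corp' land in the same bucket.
            key = "".join(ch for ch in row["org_name"].lower() if ch.isalnum())
            groups[key].append(row["record_id"])
    # Bloat = any organization with more than one record
    return {org: ids for org, ids in groups.items() if len(ids) > 1}

if __name__ == "__main__":
    for org, ids in find_duplicate_accounts("customers.csv").items():
        print(f"{org}: {len(ids)} records -> {ids}")
```

Real matching is fuzzier than this (addresses, tax IDs, typos), but even a crude pass like this one usually surfaces far more duplication than anyone expects.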


If data is truly 'the new oil', then like oil we need to refine it into a mix of usable finished products before it is useful. And with Moore's Law maintaining its pace of making compute, memory, storage and networking cheaper and cheaper, the software and data running on top need to evolve better and faster.

After using Google to research this topic, I found a very interesting study titled "DIGITAL TRANSFORMATION and the role of APPLICATION DECOMMISSIONING". What caught my eye was the specific question "Why do you think older systems are still running in your organization?" - with the top answer being 'no business case', which is astonishing. But it comes back to this: the pain and suffering of changing the software and data layers is GREATER than just allowing the status quo.


What should you do:

  • Take the bloat out of your technology and data - while respecting proper governance.

  • Do not be enslaved by your legacy thinking, platforms or software vendors.

  • Partner with your technology vendors AND if they don't act like partners in return - get rid of them.

  • Retire old technology and data through effective application retirement/decommissioning, and focus on the new (see the sketch after this list). You cannot move forward otherwise.

  • Politics be damned - full steam ahead!
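
As promised above, here is a minimal sketch of one decommissioning step: dumping a legacy system's records to a portable, compressed archive before the old system is switched off. The SQLite database, the table name and the file paths are hypothetical placeholders, not a prescription for any particular platform.

```python
# Minimal sketch of one decommissioning step: dump every row of a legacy
# table to a compressed, portable archive (gzipped JSON lines) before the
# old system is switched off. Database file, table name and output path
# are hypothetical placeholders.
import gzip
import json
import sqlite3

def archive_table(db_path, table, out_path):
    """Export all rows of `table` to gzipped JSON lines for retention."""
    conn = sqlite3.connect(db_path)
    conn.row_factory = sqlite3.Row  # rows become dict-convertible
    count = 0
    with gzip.open(out_path, "wt", encoding="utf-8") as out:
        # `table` is a trusted identifier in this sketch, not user input
        for row in conn.execute(f"SELECT * FROM {table}"):
            out.write(json.dumps(dict(row)) + "\n")
            count += 1
    conn.close()
    return count

if __name__ == "__main__":
    n = archive_table("legacy_app.db", "contracts", "contracts.jsonl.gz")
    print(f"Archived {n} rows; verify they read back before decommissioning.")
```

The format matters less than the discipline: prove you can read the archive back, under governance, after the source system is gone - then pull the plug.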
