Where next for the mainframe? - part 1
The mainframe has been around for more than six decades, but does it have a place in a 21st century tech stack?
In those six decades the mainframe has earned an enviable track record as the platform for critical applications, bulk data and transaction processing across businesses of all sizes in multiple industry sectors. And, whilst some regard the mainframe as the ultimate legacy system, it remains a trusted one: a secure and stable platform on which to reliably deliver robust business processes and services.
Throughout its lifetime the mainframe has weathered endless cycles of technology change, and proved time and time again its value as a robust, reliable and secure platform for running enterprise applications. Even though the mainframe concept dates back to the 1950s, it has gone through many changes while continuing to support applications that were created decades ago. Today, mainframes can run Linux operating systems to access open source software, and support many significant initiatives including cloud, mobile computing, big data and business analytics.
One of the fundamental problems for the mainframe is that this reliable-workhorse reputation creates an image of complex, ageing, costly, legacy infrastructure. Over the years, new technology and modern architectures have led companies to wrestle with whether the mainframe should stay or go. Despite this, mainframes continue to anchor industries such as banking, insurance, healthcare and retail, as well as the public sector, wherever heavy-duty, high-volume, low-latency transaction processing is required.
In recent years, digital transformation (DX) has generally left the mainframe unloved, instead building a digital veneer around it like an outer skin. This gives the impression of a digital platform, even though the power behind it is still a mainframe built on years of investment in critical business processing logic. As a result, many organisations are either afraid to tackle modernisation (because of the perceived complexity and cost) or choose not to, on the basis that it is unnecessary for continuing to deliver the digital services their users expect. Yet organisations remain justifiably nervous about a mainframe-based future, particularly in industries that are changing rapidly, shaped by innovation that is hard to sustain under the restrictions of legacy technologies.
Ultimately the decision to continue using a mainframe is not an either/or proposition. Too often, the perception of the mainframe as yesterday's technology means it is not considered a viable part of a company's IT strategy. This fails to recognise the role of the mainframe in a hybrid IT environment: mainframes and new platform strategies can happily coexist. A holistic IT strategy will look past the myths about the usability or viability of the mainframe and provide an objective evaluation of the cost, benefit and risk of switching technologies before any decision is made.
Next week, we'll examine the drivers for change that might cause a re-evaluation of the mainframe as a business tool, and how you can address them.
Elisabeth Ash is the customer journey manager at New World Tech (NWT), an independent consultancy specialising in mainframe migration strategy and planning.