Unlocking the value of modernising the ‘big iron’

Far from being on the way out, mainframes – or the ‘big iron’ as they are affectionately known – are reputed to run 30 billion transactions per day, hold 80% of the world’s business data and handle 90% of all credit card transactions. So, if you performed an online transaction today, it is likely a mainframe made that possible.

The new normal brought about by the COVID-19 pandemic has unquestionably accelerated the massive shift to digital, with a boom in virtual interactions and software aimed at minimising physical contact.

Conference calls reign supreme, businesses are hosting virtual events, and consumers have shifted their shopping habits to online for just about everything. All of this is driven by the need to reduce or completely remove physical contact, which in turn lowers the risk of infection and the spread of the virus. Touchless is now the preferred way to purchase nearly everything. This trend is the new normal across industries and is also driving demand for the mainframe.

Because mainframe systems become increasingly costly and cumbersome to maintain over time, some firms implement mainframe modernisation initiatives to reduce IT expenses and improve operational efficiencies.

Another reason companies are exploring mainframe modernisation is that developers and administrators with the requisite skills and experience to implement and maintain them are a disappearing breed.

Mainframe applications are written in languages that are increasingly difficult to support, and many mainframe specialists are approaching retirement. There is also the risk that a mainframe vendor will discontinue support for certain systems.

One of the greatest drivers of mainframe modernisation is the demand for adopting new technologies, such as cloud computing, big data analytics and connected devices. Forward-thinking firms want to leverage today's most advanced analytics platforms, as well as affordable, scalable cloud services, making modernising their legacy systems essential.

Deloitte reports that organisations across all industries are struggling to modernise core systems to support current business demands, while at the same time delivering against a digital strategy roadmap.

It goes on to add that mainframe applications often pose special challenges that can relegate modernisation to the ‘not yet, we’ll do it later’ category, while Gartner notes that analytics is key to mainframe modernisation.

Although the mainframe excels at handling high-volume transactions with speed and security, using analytics to gain insights from the data it processes can become costly and time-intensive.

If companies are truly invested in unlocking the value of mainframe data, saving time and reducing costs, they must utilise a data integration (DI) platform that can accelerate data analytics, automate the creation of analytics-ready data structures with continuous data delivery, and minimise the impact and cost of replication from key production systems.

As organisations look to modernise their analytics environments to enable digital transformation, they are embracing a DataOps approach, which requires IT and business alignment, along with a modern data strategy and architecture.

DI can efficiently deliver large volumes of real-time, analytics-ready data into streaming and cloud platforms, data warehouses and data lakes. In a hyper-competitive business climate where real-time insights and decisions are critical, the need for agile analytics is driving new data architecture and integration requirements.

Unlike traditional batch movement and ETL scripting approaches, which are slow, inflexible and labour-intensive, DI can automate the creation of data streams from core transactional systems, efficiently move that data to the cloud and data lakes, and refine it to make it immediately available via an Amazon-like marketplace experience. By quickly delivering data to users without typical business friction, DI enables the agility necessary to drive greater business value.
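
To make this concrete, the Python sketch below shows, in highly simplified form, how change events from a transactional source might be refined into analytics-ready records and published to a catalogued ‘marketplace’ of data sets. It is an illustration under assumed names, not the implementation of any particular DI product.

```python
# A highly simplified sketch of the flow described above: change events from a
# transactional source are refined into analytics-ready records and published
# to a catalogued "marketplace" of data sets. All names here are hypothetical
# stand-ins; a real DI platform would target actual cloud storage, warehouse
# or streaming endpoints.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any

@dataclass
class ChangeEvent:
    table: str               # source table on the transactional system
    op: str                  # "insert", "update" or "delete"
    row: dict[str, Any]      # the changed row

@dataclass
class Marketplace:
    """Stand-in for a governed catalogue where users discover data sets."""
    datasets: dict[str, list[dict[str, Any]]] = field(default_factory=dict)

    def publish(self, name: str, record: dict[str, Any]) -> None:
        self.datasets.setdefault(name, []).append(record)

def refine(event: ChangeEvent) -> dict[str, Any]:
    """Turn a raw change event into an analytics-ready record."""
    record = dict(event.row)
    record["_op"] = event.op
    record["_loaded_at"] = datetime.now(timezone.utc).isoformat()
    return record

def run_pipeline(events: list[ChangeEvent], marketplace: Marketplace) -> None:
    for event in events:
        marketplace.publish(event.table, refine(event))

if __name__ == "__main__":
    mp = Marketplace()
    run_pipeline([ChangeEvent("payments", "insert", {"id": 1, "amount": 120.50})], mp)
    print(mp.datasets["payments"])
```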

For many organisations, the mainframe is mission-critical, containing massive amounts of valuable core business data. However, there are challenges in accessing data locked inside mainframes and making it accessible in secure and governed ways.

There are various ways to address this problem. One is continuous replication to offload processing, where data is replicated continuously and automatically. This removes the need to move data in periodic batches, keeping it co-ordinated and providing real-time access to the data on other platforms.
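
In essence, a continuous replication loop reads new changes from the source as they occur and applies them to the target straight away. The sketch below is a minimal, assumed illustration of that loop; the fetch_changes and apply_change callables and the ‘seq’ field are placeholders, not the API of any particular product.

```python
# A simplified sketch of continuous replication: poll the source change feed
# and apply each change to the target as it arrives, instead of waiting for a
# batch window. The callables and the "seq" field are assumptions for the
# example only.
import time
from typing import Any, Callable, Iterable

def replicate_continuously(
    fetch_changes: Callable[[int], Iterable[dict[str, Any]]],
    apply_change: Callable[[dict[str, Any]], None],
    poll_interval_s: float = 1.0,
) -> None:
    """Run until stopped, keeping the target in step with the source."""
    position = 0  # last change already replicated
    while True:
        for change in fetch_changes(position):
            apply_change(change)      # e.g. upsert into a cloud warehouse or lake
            position = change["seq"]  # advance the replication position
        time.sleep(poll_interval_s)   # no batch window: changes flow continuously
```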

You can also reduce the millions of instructions per second (MIPS) overhead. Capturing changes from database logs, rather than running repeated brute-force queries against the data, has minimal impact on the source system and avoids the hefty MIPS price tag that direct querying incurs.
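
The schematic Python comparison below illustrates the difference in principle; both functions are hypothetical placeholders rather than real connectors.

```python
# Schematic comparison only: rescanning the source table burns mainframe CPU
# (MIPS) on every call, whereas reading the database log/journal touches only
# the entries written since the last position. Both callables are placeholders.
from typing import Any

def capture_by_rescanning(read_full_table) -> list[dict[str, Any]]:
    """Re-read the whole table on every call -- each call consumes mainframe MIPS."""
    return list(read_full_table())

def capture_from_log(read_log_entries, last_position: int) -> list[dict[str, Any]]:
    """Read only the log entries appended since last_position -- minimal host impact."""
    return [entry for entry in read_log_entries() if entry["seq"] > last_position]
```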

The next thing needed is to ensure that direct endpoints (connectors) to all the major cloud platforms are supported, and that each endpoint is optimised for its target cloud environment.
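
In practice this means a common interface with platform-specific implementations behind it. The sketch below is an assumed illustration, not any vendor’s connector API; a real endpoint would use each provider’s SDK and platform-specific optimisations such as bulk-load APIs and columnar file formats.

```python
# Illustrative connector registry: one shared interface, one implementation per
# cloud target. The classes and write() method are hypothetical placeholders.
from abc import ABC, abstractmethod
from typing import Any

class CloudEndpoint(ABC):
    @abstractmethod
    def write(self, dataset: str, records: list[dict[str, Any]]) -> None: ...

class ObjectStoreEndpoint(CloudEndpoint):
    def write(self, dataset: str, records: list[dict[str, Any]]) -> None:
        print(f"landing {len(records)} records as columnar files in {dataset}")

class WarehouseEndpoint(CloudEndpoint):
    def write(self, dataset: str, records: list[dict[str, Any]]) -> None:
        print(f"bulk-loading {len(records)} rows into warehouse table {dataset}")

ENDPOINTS: dict[str, CloudEndpoint] = {
    "data_lake": ObjectStoreEndpoint(),
    "warehouse": WarehouseEndpoint(),
}
```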

DI can save data engineers’ valuable time by enabling them to automate the availability of accurate, trusted data sets and transform them into analytics-ready data for the business, automating the entire data warehouse lifecycle and the creation of managed data lakes without coding.
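
Under the hood, much of that automation amounts to applying the captured change stream so that both a queryable current view and a full change history stay up to date. The short sketch below illustrates the idea with assumed structures and field names; it is not how any specific product implements it.

```python
# Simplified illustration of keeping analytics-ready structures in sync from a
# change stream: a current-state view for queries plus an append-only history.
# Field names ("op", "row", "id") are assumptions for the example only.
from typing import Any

def apply_changes(
    changes: list[dict[str, Any]],
    current: dict[Any, dict[str, Any]],
    history: list[dict[str, Any]],
) -> None:
    for change in changes:
        history.append(change)              # full history for audit and trend analysis
        key = change["row"]["id"]
        if change["op"] == "delete":
            current.pop(key, None)          # drop deleted rows from the current view
        else:
            current[key] = change["row"]    # inserts and updates overwrite in place

current_view: dict[Any, dict[str, Any]] = {}
change_history: list[dict[str, Any]] = []
apply_changes(
    [{"op": "insert", "row": {"id": 1, "status": "open"}},
     {"op": "update", "row": {"id": 1, "status": "closed"}}],
    current_view,
    change_history,
)
print(current_view)  # {1: {'id': 1, 'status': 'closed'}}
```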

It is also vital to have real-time data streaming from the mainframe – without it, a significant amount of manual tuning and optimisation is required to support the broad, deep and fast analysis demanded by enterprises today, taking precious time away from other essential business tasks.

In my next article, I will outline issues to consider in the drive to modernise the mainframe.
