Amid the rise of digitalization, many organizations continue to run their operations on technologies dating back decades. Languages such as COBOL, RPG or Visual Basic 6 still underpin key processes in sectors such as banking, insurance, public administration and transport. According to recent estimates, more than 800 billion lines of such code are still in operation worldwide.
The apparent paradox of depending on such old systems is not a matter of negligence but of deliberate choice: these infrastructures, although outdated, have proven reliable. They are so deeply integrated into business processes that replacing them requires a titanic effort, both because of their complexity and because of their cost. In addition, the lack of documentation, and of knowledge concentrated in technical profiles that are no longer available, further complicates the transition.
Within the framework of World Telecommunication and Information Society Day, held on May 17, Entelgy, The Business Tech Consultancy, has warned about the consequences of keeping these systems running without modernizing them.
“Ignoring the risk of the technological legacy does not eliminate the problem; it only postpones it, increasing its severity and cost,” the company warns. And it adds: “There are now mature solutions to address it, such as automated legacy-code analysis tools, artificial intelligence to document old systems, consulting specialized in obsolete languages, and progressive modernization models.”
Knowledge drain and a barrier to innovation
One of the main threats is the loss of technical knowledge. Many of these systems were designed by professionals who are now retired, and their maintenance depends on an organizational memory that is fading. This leaves companies exposed to any incident, without trained personnel able to react quickly.
At the same time, legacy environments are difficult to connect with modern technologies such as APIs, microservices or cloud-native platforms. This disconnection hinders the integration of artificial intelligence tools, advanced analytics or automation, severely limiting the capacity for innovation.
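One common progressive-modernization approach consistent with what the company describes is to expose the legacy system behind a thin, modern API facade (often called the strangler-fig pattern), so new services never talk to the old platform directly. The sketch below is purely illustrative and makes several assumptions: the endpoint path, the function names and the returned fields are invented for the example, and the actual legacy call is replaced by a placeholder.

```python
# Minimal, illustrative sketch of an API facade over a legacy back end.
# Assumes Flask is installed; the "legacy" call below is a stand-in
# (a real COBOL/RPG routine would typically be reached through a
# message queue, a batch file exchange or a terminal bridge).
from flask import Flask, jsonify

app = Flask(__name__)

def call_legacy_balance(account_id: str) -> dict:
    """Placeholder for the real legacy call (hypothetical)."""
    # In a real strangler-fig migration this might submit a batch job,
    # parse a fixed-width file or drive a screen-scraping adapter.
    return {"account_id": account_id, "balance": "1234.56", "currency": "EUR"}

@app.route("/api/v1/accounts/<account_id>/balance")
def get_balance(account_id: str):
    # New consumers (microservices, analytics, AI tools) speak JSON/HTTP;
    # only this facade knows how to reach the legacy system.
    return jsonify(call_legacy_balance(account_id))

if __name__ == "__main__":
    app.run(port=8080)
```

The point of such a facade is that functionality can later be migrated piece by piece behind the same endpoints, without forcing every consumer to change at once.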
Hidden costs and growing vulnerability
Although at first glance these systems may seem cheaper, since they do not require current licenses, the truth is that maintaining them is a heavy economic burden. The shortage of qualified professionals, long development and testing cycles, and the constant need to apply patches and adjustments drive up the real cost of these platforms, raising their total cost of ownership (TCO) year after year.
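To make the TCO argument concrete, a back-of-the-envelope comparison can be sketched. Every figure below is a hypothetical assumption chosen only to illustrate the calculation; none comes from the article or from any real benchmark.

```python
# Toy multi-year TCO comparison; all numbers are invented for illustration.
def total_cost_of_ownership(annual_costs: dict, years: int, growth: float) -> float:
    """Sum annual costs over `years`, inflating them by `growth` each year."""
    yearly = sum(annual_costs.values())
    return sum(yearly * (1 + growth) ** y for y in range(years))

legacy = {"specialist_staff": 400_000, "patching_and_testing": 150_000,
          "downtime_risk": 100_000}          # scarce skills push costs up
modern = {"licenses_or_cloud": 200_000, "staff": 250_000,
          "patching_and_testing": 50_000}    # larger talent pool, shorter cycles

# Assume legacy costs grow faster (8%/year) as expertise gets scarcer,
# while the modernized platform grows roughly with inflation (2%/year).
print(round(total_cost_of_ownership(legacy, 5, 0.08)))  # ≈ 3.81 million over 5 years
print(round(total_cost_of_ownership(modern, 5, 0.02)))  # ≈ 2.60 million over 5 years
```

The interesting output is not the invented totals but the shape of the curve: when staffing and patching costs compound year after year, the platform with no license fees can still end up as the more expensive one.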
Experts warn about the growing risks of maintaining obsolete infrastructures in critical sectors such as banking, insurance or transport
In addition, many of these environments fail to meet current cybersecurity standards. In fact, according to the IBM X-Force Threat Intelligence Index, 26% of the breaches recorded in the financial sector in 2024 originated in unpatched legacy systems, making them an easy target for cyberattacks.
A strategic brake with business impact
The lack of visibility into the source code and its many dependencies turns any intervention into a high-risk operation. This situation blocks companies' technological evolution, slows decision-making and can end in a total system collapse if there are no contingency plans for serious incidents such as data corruption, structural failures or incompatibilities with new infrastructure.
The message is clear: continuing to postpone technological modernization is not a sustainable option. Organizations must accept that the legacy, if not properly managed, can become a latent threat with operational, economic and reputational consequences that are difficult to reverse.