In late February 2026, IBM shares fell 13.2% in a single session, wiping out roughly $31 billion in market capitalization. The trigger was an announcement that Anthropic’s Claude Code could automate the analysis phase of modernizing Common Business-Oriented Language (COBOL) systems.
Markets interpreted the news as a threat to mainframe services revenue.
That reaction missed the real issue.
The shock was not about COBOL becoming obsolete but about the possibility that enterprises may finally be able to understand legacy systems at scale. For decades, the dominant cost of modernization has been understanding what existing systems actually do.
That is the real risk embedded in legacy estates.
Why Do Financial Markets Care About COBOL Dependency?
Few enterprise technologies operate at the scale of COBOL.
Decades after its introduction, COBOL continues to power a significant portion of the world’s core financial infrastructure. Consider the footprint:
- 43% of U.S. core banking systems run on COBOL-based environments
- 45 of the top 50 banks rely on mainframes for mission-critical workloads
- Eight of the top ten insurers depend on them for core operations
- Four of the top five airlines run critical systems on mainframe platforms
Across banking, insurance, and government institutions, these environments support essential functions such as payments processing, lending platforms, trading infrastructure, and settlement systems.
At this scale, modernization discussions naturally begin with one question: how well do organizations understand the systems they already run?
Three structural factors explain why legacy environments remain central to enterprise technology discussions.
| Talent Scarcity = Maintenance Risk | Modernization Cost Bomb | Liquidity & Operational Continuity Concerns |
| --- | --- | --- |
| COBOL engineers are aging out, which creates slower incident resolution, higher maintenance costs, vendor lock-in with mainframe providers, and fragile institutional knowledge. Markets price future liabilities: if a company’s core runs on a shrinking talent pool, that carries a risk premium. | Rewriting a core banking system is not like refactoring a website. It can take 5–10 years, cost hundreds of millions, and fail spectacularly. There are enough examples of failed legacy transformations to make investors cautious. | In finance, uptime equals trust. If legacy dependency delays transaction processing, prevents scaling under stress, or causes regulatory breaches, valuation takes a direct hit. |
Why Legacy Systems Still Matter
Over time, enterprise systems accumulate far more than code. They capture decades of operational knowledge, regulatory requirements, and business rules that shape how organizations run every day.
In many financial institutions, those rules live inside long-running mainframe environments. Transaction handling, settlement logic, compliance checks, and exception handling often evolve directly within these systems as products, regulations, and market conditions change.
That accumulated logic makes legacy platforms uniquely valuable. It also explains why modernization efforts begin with understanding how existing systems behave before deciding how to transform them.
The True Cost of Legacy Systems Is Not What You Think
1. The Reverse Engineering Tax: The Hidden 20% on Every Modernization Project
Every modernization program includes a hidden cost: the time spent reconstructing what the current system does before any change can safely occur.
This overhead is the Reverse Engineering Tax.
It appears in discovery phases that stretch for months. Teams trace dependencies manually. Architects interview subject matter experts. Developers read thousands of lines of undocumented code. When the original architects retire, institutional knowledge disappears with them.
On top of that, developer productivity studies consistently show that engineers spend far more time understanding existing code than writing new functionality.
The practical consequences compound:
- Technical debt already accounts for roughly 40% of the average enterprise IT balance sheet.
- Modernization projects run 20% over their projected costs, not because the new build is complicated, but because the discovery phase preceding it is.
- The Reverse Engineering Tax sits on top of that, compounding the drag.
2. Why Developers Spend More Time Reading Code Than Writing It
Studies cited by GitClear and developer productivity surveys conducted by Stack Overflow consistently show that developers spend between 58% and 70% of their working time on code comprehension: reading, tracing, and understanding existing code rather than producing new functionality.
In a greenfield environment, this ratio is manageable. Applied to a 40-year-old COBOL estate with no documentation, it becomes the dominant cost driver.
The compounding effect is significant. Every sprint cycle where engineers are decoding legacy behaviour rather than shipping features is a sprint cycle where the modernization project is, effectively, paying twice, once for the new work, and once for the archaeology required to make that work safe.
3. What Happens When the Last COBOL Developer Retires
The Bureau of Labor Statistics has tracked the steady decline of COBOL programmers for over a decade. Reuters reported in 2020 that the average age of an active COBOL programmer is over 55. When those developers retire, they take with them something that cannot be reconstructed from the codebase alone.
That knowledge is contextual: why decisions were made, what edge cases the system was built to handle, and which undocumented behaviours are features rather than bugs.
This is what the industry means by tribal knowledge. It’s not sentiment, it’s operational risk. A payment processing system that behaves correctly in 99.97% of cases but fails silently in specific edge conditions is a liability that no test suite will catch if the people who knew about those conditions are no longer in the building.
Why AI Auto-Rewriting Legacy Systems Is a Dangerous Shortcut
Why “Vibe Coding” Won’t Work for Mission-Critical Systems
Vibe coding, the practice of describing a desired outcome in plain English and accepting whatever an AI generates, has legitimate applications in prototyping and low-stakes development.
Applied to a core banking system or a healthcare claims processor, it is an engineering liability. The problem is not that AI-generated code is wrong. It’s that AI-generated code for a complex legacy system is unverifiable without the same understanding of business logic that made the original system opaque in the first place.
If you don’t know what the COBOL system was supposed to do in corner cases, you cannot confirm that the Python rewrite handles those cases correctly. You’ve traded one black box for another, just one written in a language that feels more modern.
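When corner-case behaviour is undocumented, one of the few honest verification techniques is characterization (golden-master) testing: record what the legacy system actually does, then hold the rewrite to that record. A minimal sketch, with invented functions (`legacy_interest`, `rewritten_interest`) standing in for a real COBOL routine and its proposed replacement:

```python
# Characterization (golden-master) testing: a minimal sketch.
# `legacy_interest` stands in for the observed behaviour of a COBOL routine;
# `rewritten_interest` is a hypothetical Python replacement. Both are
# illustrative, not taken from any real system.

def legacy_interest(balance_cents: int, days: int) -> int:
    # Observed legacy behaviour: integer arithmetic truncates, and a
    # zero-day edge case returns a 1-cent minimum (an undocumented "feature").
    if days == 0:
        return 1
    return balance_cents * 5 * days // (100 * 365)

def rewritten_interest(balance_cents: int, days: int) -> int:
    # Naive rewrite that silently drops the zero-day edge case.
    return balance_cents * 5 * days // (100 * 365)

def characterize(fn, cases):
    """Record input -> output pairs as a golden master."""
    return {case: fn(*case) for case in cases}

cases = [(100_000, 30), (100_000, 0), (1, 365)]
golden = characterize(legacy_interest, cases)
mismatches = {c for c in cases if rewritten_interest(*c) != golden[c]}
print(mismatches)  # the zero-day edge case surfaces here
```

The golden master does not explain *why* the legacy system behaves as it does, but it at least makes a divergent rewrite visible before production does.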
The 83% Failure Rate of Data Migrations
Gartner has reported that approximately 83% of data migration projects fail or materially exceed their original budgets. The most common root cause is not technical failure at the data layer; it’s the loss of semantic context during transit.
Example:
A field labelled “customer_status” in a 1987 schema may encode business logic, historical states, or regulatory classifications that are documented nowhere. Migrating the field migrates the values. It does not migrate the meaning.
This distinction matters because the downstream systems that consume that data (reporting tools, compliance checks, fraud models) were built against the original semantics. A field that looks correct in isolation may produce cascading errors across dependent systems that surface only weeks or months after go-live, when the original migration team has moved on.
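The gap between copying a value and preserving its meaning can be shown in a few lines. The status codes below are invented for illustration; in a real estate, the mapping would have to be recovered from code, documentation, or the people who remember it:

```python
# Sketch of semantic loss in migration. These legacy codes are invented;
# real mappings live in decades-old documentation, or nowhere at all.

LEGACY_STATUS_SEMANTICS = {
    "A":  "active",
    "A2": "active, under regulatory review",   # added later, undocumented
    "D":  "dormant (per 1987 retention rule)",
    "X":  "closed, records retained 7 years",
}

def naive_migrate(row: dict) -> dict:
    # Copies the value byte-for-byte: syntactically correct, semantically blind.
    return {"customer_status": row["customer_status"]}

def semantic_migrate(row: dict) -> dict:
    # Resolves the code against recovered business meaning, and fails
    # loudly on any code nobody documented.
    code = row["customer_status"]
    if code not in LEGACY_STATUS_SEMANTICS:
        raise ValueError(f"undocumented status code: {code!r}")
    return {"customer_status": LEGACY_STATUS_SEMANTICS[code]}

row = {"customer_status": "A2"}
print(naive_migrate(row))     # value moved, meaning lost
print(semantic_migrate(row))  # meaning preserved
```

The naive version passes any field-level validation; only the semantic version carries the regulatory distinction that downstream consumers were built against.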
The Danger of AI Hallucinations in Regulated Industries
In BFSI and Healthcare, the consequences of a system behaving incorrectly are not limited to operational failure; they extend to regulatory exposure. A Basel III calculation that produces the wrong result, or a claims adjudication system that misclassifies a procedure code, creates audit liability regardless of whether the error originated from a human developer or an AI model.
The specific risk with auto-rewrite tools is compounded by the testing problem.
Research cited in developer productivity studies suggests that up to 72% of automated tests in legacy migration contexts are false positives: tests that pass but do not reflect real production behavior.
A new AI-generated system that passes its test suite is not a verified system. It’s a system whose failure modes are unknown until they occur in production. In regulated environments, that is not an acceptable risk posture.
Replacing an undocumented COBOL system with an AI-generated system that is equally opaque simply changes the language of the debt. The control problem, not knowing what the system does or why, remains identical.
How The Lifter’s “Archaeology Before Architecture” Approach Changes the Modernization Equation
The argument for archaeology before architecture is straightforward: you cannot make sound decisions about what to change if you don’t have a reliable account of what currently exists.
The Lifter platform implements what its documentation calls “archaeology before architecture”, automating the understanding phase before any code changes.

Mapping Dependencies, Business Logic, and Blast Radius Before a Single Line Changes
1. Lifter’s legacy module automates the most labor-intensive phase of modernization: inventory and dependency mapping of existing estates.
2. Replaces months of manual tracing through undocumented code with a System X-Ray: complete inventory of every project, language, framework, database, and API in scope.
3. Includes code complexity scoring, dead code detection, and technical debt quantified in developer-days.
4. Generates an interactive dependency and impact map.
5. Architects can select any component to see its blast radius: exactly which downstream systems and processes are affected if moved, modified, or retired.
6. Prevents the most common and expensive modernization error: changing a component that appears isolated, only to discover its invisible dependencies in production.
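The core idea behind a blast-radius query is reverse-dependency traversal. The sketch below illustrates the concept, not The Lifter’s implementation; the component names and graph are a hypothetical estate where edges point from a component to what it depends on:

```python
from collections import deque

# Blast-radius analysis as reverse-dependency traversal (illustrative only).
# Edges point from a component to the things it depends on.
DEPENDS_ON = {
    "settlement_batch": ["ledger_db", "fx_rates"],
    "fraud_model":      ["ledger_db"],
    "daily_report":     ["settlement_batch"],
    "regulator_feed":   ["daily_report"],
    "ledger_db":        [],
    "fx_rates":         [],
}

def blast_radius(component: str) -> set:
    """Everything that transitively depends on `component`."""
    # Invert the edges so we can walk from a component to its consumers.
    consumers = {c: [] for c in DEPENDS_ON}
    for c, deps in DEPENDS_ON.items():
        for d in deps:
            consumers[d].append(c)
    affected, queue = set(), deque([component])
    while queue:
        for consumer in consumers[queue.popleft()]:
            if consumer not in affected:
                affected.add(consumer)
                queue.append(consumer)
    return affected

print(blast_radius("ledger_db"))
```

In this toy graph, touching `ledger_db` affects not just its two direct consumers but also the reporting and regulatory feeds two hops downstream, which is exactly the kind of indirect dependency that manual tracing tends to miss.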
Preserving Data Semantics Where Most Migrations Lose Context
1. The 83% failure rate for data migrations stems from lost context.
2. Data Lifter module addresses semantic loss by analyzing legacy schemas: data types, constraints, indexes, and embedded transformations.
3. Produces complete schema map with type-conversion implications flagged, plus plain-English documentation of business rules in ETL logic and stored procedures.
4. Traces data lineage end-to-end: from source systems through transformations to final consumption points.
5. Reveals how a field’s value is derived, manipulated, and interpreted across its journey, enabling decisions on whether migration preserves meaning or just copies syntax.
6. Runs compliance scan: flags PII, identifies audit trail gaps, and produces compliance reports before migration starts.
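A pre-migration compliance scan of the kind described above can start with something as simple as pattern matching over column names. A toy sketch, with invented column names and deliberately crude rules, just to show the shape of the check:

```python
import re

# Toy pre-migration compliance scan (illustrative; column names and
# matching rules are invented for this example).
PII_NAME_HINTS = re.compile(r"ssn|tax_id|dob|email|phone", re.IGNORECASE)

LEGACY_SCHEMA = {
    "CUST_SSN":      "CHAR(9)",
    "CUST_EMAIL":    "VARCHAR(64)",
    "ACCT_BALANCE":  "DECIMAL(12,2)",
    "LAST_UPD_USER": "CHAR(8)",   # who changed the row: audit-relevant
}

def scan_schema(schema: dict) -> dict:
    """Flag likely PII columns and audit-trail columns before any data moves."""
    findings = {"pii": [], "audit": []}
    for column in schema:
        if PII_NAME_HINTS.search(column):
            findings["pii"].append(column)
        if "UPD" in column.upper() or "AUDIT" in column.upper():
            findings["audit"].append(column)
    return findings

print(scan_schema(LEGACY_SCHEMA))
```

A production scanner would inspect data values and lineage as well as names, but even this level of triage surfaces sensitive fields before migration starts rather than after.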
Moving Beyond Coverage Metrics to Actual Production Confidence
1. The testing problem in legacy migration is underestimated; line coverage shows which code executed, not whether behaviour is correct for rare real-world edge cases.
2. Test Lifter module starts with business logic (not code) to generate test cases focused on risk-weighted outcomes that matter.
3. Coverage Analyzer maps logical coverage (critical business paths tested) rather than line coverage.
4. Confidence Scorer calculates actual coverage of critical business logic and surfaces specific gaps: details what is/isn’t tested, not just a percentage.
5. For mission-critical systems, self-healing capability auto-updates broken tests (from UI/API changes during incremental modernization), preventing test suite degradation.
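The difference between line coverage and risk-weighted logical coverage can be made concrete. The sketch below illustrates the idea only; the business paths, risk weights, and coverage flags are hypothetical:

```python
# Risk-weighted logical coverage: a sketch of the idea. An untested
# critical path should hurt the score far more than an untested
# utility path. All paths and weights here are hypothetical.

BUSINESS_PATHS = {
    # path name: (risk weight, covered by a passing business-level test?)
    "payment_settlement":   (10, True),
    "interest_accrual":     (8,  True),
    "chargeback_reversal":  (9,  False),  # rare but regulated
    "statement_formatting": (2,  True),
}

def logical_coverage(paths: dict):
    """Return (risk-weighted coverage score, list of untested paths)."""
    total = sum(weight for weight, _ in paths.values())
    covered = sum(weight for weight, ok in paths.values() if ok)
    gaps = [name for name, (_, ok) in paths.items() if not ok]
    return covered / total, gaps

score, gaps = logical_coverage(BUSINESS_PATHS)
print(f"{score:.0%} of risk-weighted logic covered; gaps: {gaps}")
```

Three of four paths are tested, yet the score is well under 75% because the untested path carries nearly a third of the total risk, and the output names the gap instead of hiding it behind a percentage.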
The Lifter in the Field: What Enterprise-Scale System Archaeology Actually Delivers
How The Lifter Compressed Months of Discovery into Weeks
Discovery phases in traditional modernization projects run for several months. They involve consultants interviewing subject-matter experts, tracing code manually, and documenting dependencies through institutional memory.
A North American insurer using The Lifter compressed that timeline to weeks. The difference was automated analysis replacing manual reverse engineering: mapping the system’s actual behavior rather than relying on what people remembered about it.
How The Lifter Decoded 170 Mission-Critical PICK BASIC Programs Without Manual Reverse Engineering
COBOL gets the attention, but the legacy problem extends to other languages with shrinking workforces. The Lifter recently processed 170 PICK BASIC programs for a client, extracting business logic and dependencies without manual intervention.
This matters because tribal knowledge about these systems is walking out the door as the last generation of developers retires. Capturing what the code does, not just converting its syntax, preserves intent that would otherwise be lost.
Why The Lifter’s On-Prem, Expert-in-the-Loop Model Is Built for Regulated Environments
Banks, insurers, and healthcare organizations operate under data sovereignty and compliance constraints. The Lifter supports on-prem deployment with private LLMs, keeping codebases under customer control.
The platform also maintains an “expert-in-the-loop” model: AI generates insights, but human architects validate them against business context and regulatory requirements. This addresses the control problem inherent in black-box approaches.
Modernization Begins with System Understanding
The $31 billion market reaction highlighted how important it is for organizations to understand the legacy systems they run.
As automated discovery improves, teams can capture institutional knowledge more systematically and reduce the Reverse Engineering Tax. Developers gain clearer visibility into system behavior, and architects plan modernization programs with stronger context.
That clarity helps organizations move forward with confidence.
The Lifter by Indium helps enterprises reduce the Reverse Engineering Tax through automated system discovery, data lineage mapping, and intelligent test validation.
For organizations planning large-scale COBOL or mainframe modernization, understanding the system before rewriting it is no longer optional. It forms the foundation of a safe transformation.