Automated 70% of a 2,500-Module SAS Migration for a U.S.-Based Consumer Credit Provider with Data Lifter
Client Overview
The client is a U.S.-based financial services provider ranked among the top 20 credit card issuers and top 10 merchant acquirers. With millions of customers and nearly $8 billion in banking assets, the business relies heavily on data-driven operations for credit decisions, reporting, and regulatory compliance.
As their analytics environment grew, thousands of SAS modules became difficult to maintain and scale. Indium partnered with the client to migrate these workloads to Snowflake and Python using Data Lifter, Indium’s proprietary agentic AI platform.
When Does a Legacy Analytics Environment Become a Bottleneck?
Over time, the client’s analytics environment grew to include nearly 2,500 SAS modules supporting everything from data processing and operational reporting to regulatory analytics.
Maintaining the environment became increasingly difficult as the organization scaled.
Rising Platform Costs
High licensing and operational costs associated with maintaining the SAS environment.
Limited Scalability
The legacy platform struggled to scale efficiently as data volumes and analytics demands increased.
Talent Availability Risks
Dependence on a shrinking pool of SAS specialists made long-term maintenance challenging.
Barriers to Modern Analytics
Integrating cloud data platforms, advanced analytics, and AI workloads was difficult within the existing environment.
Complex Migration Effort
Rewriting thousands of SAS modules while preserving business logic and data integrity posed significant risk.
Diagnosed the Gaps Before Designing the Solution
Instead of beginning code conversion immediately, Data Lifter was first used to understand how the existing SAS environment worked.
Data Lifter analyzed dependencies, business logic, and operational workflows to identify system fragility, migration risks, and opportunities for safe automation.
Mapped the SAS Landscape
The platform cataloged nearly 2,500 SAS modules and documented their dependencies to understand how data and logic flowed across the environment.
Identified Hidden Business Logic
Many modules contained embedded business rules built over years of development. These rules were analyzed and documented to prevent loss of logic during migration.
Detected Migration Risks Early
Data Lifter identified areas where manual conversion could introduce errors or break downstream processes.
Prepared the Environment for AI-Assisted Conversion
The required tooling, testing frameworks, and validation mechanisms were established to support the migration, with the platform handling the conversion of SAS modules to the target architecture and their validation.
An AI-Driven Migration Built for Large-Scale SAS
A migration framework was designed using Data Lifter. It combined AI-assisted code conversion with structured validation to preserve business logic and data accuracy.
Data Lifter MCP & Agent
- Indium used its proprietary Data Lifter platform to speed up the conversion of legacy SAS modules to Python while supporting the migration of data workloads to Snowflake.
- The platform automated much of the code translation process and handled module dependencies during conversion. This helped reduce the manual effort required for large-scale legacy migrations.
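To illustrate the kind of translation such a conversion involves (a hypothetical example for illustration only, not the client’s code or Data Lifter’s actual output), a simple SAS DATA step that filters records and derives a column maps naturally onto pandas:

```python
import pandas as pd

# Hypothetical SAS source (illustrative):
#   data work.high_risk;
#       set work.accounts;
#       where balance > 5000;
#       util_pct = (balance / credit_limit) * 100;
#   run;

def convert_high_risk(accounts: pd.DataFrame) -> pd.DataFrame:
    """Python equivalent of the SAS DATA step above."""
    # WHERE balance > 5000  ->  boolean row filter
    high_risk = accounts[accounts["balance"] > 5000].copy()
    # util_pct derivation  ->  vectorized column assignment
    high_risk["util_pct"] = high_risk["balance"] / high_risk["credit_limit"] * 100
    return high_risk
```

Preserving this kind of embedded rule exactly, across thousands of modules, is what the dependency analysis and validation stages are designed to guarantee.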
Parallel Validation Framework
- To ensure the new environment behaved exactly like the legacy system, both versions of the programs were run in parallel during migration.
- Outputs generated by the original SAS programs and the converted Python code were automatically compared. This allowed the team to quickly detect any differences in logic or results and correct them before the system moved to production.
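A minimal sketch of the kind of automated output comparison such a framework performs (the function name, tolerance handling, and report format here are assumptions, not the actual Data Lifter implementation):

```python
import pandas as pd

def compare_outputs(sas_df: pd.DataFrame, py_df: pd.DataFrame,
                    float_tol: float = 1e-9) -> list[str]:
    """Compare a legacy SAS output against the converted Python output.

    Returns an empty list on parity, otherwise a human-readable
    list of discrepancies for the migration team to triage.
    """
    issues = []
    # Schema must match before cell-level comparison makes sense.
    if list(sas_df.columns) != list(py_df.columns):
        return [f"column mismatch: {list(sas_df.columns)} vs {list(py_df.columns)}"]
    if len(sas_df) != len(py_df):
        return [f"row count mismatch: {len(sas_df)} vs {len(py_df)}"]
    for col in sas_df.columns:
        if pd.api.types.is_numeric_dtype(sas_df[col]):
            # Numeric columns: tolerate floating-point rounding differences.
            mismatched = (sas_df[col] - py_df[col]).abs() > float_tol
        else:
            mismatched = sas_df[col].astype(str) != py_df[col].astype(str)
        if mismatched.any():
            issues.append(f"{col}: {int(mismatched.sum())} differing rows")
    return issues
```

In practice a comparison like this would run for every module in a tranche, with any non-empty result blocking that module from sign-off.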
A Three-Phase Implementation Framework
The migration ran in three phases: preparing the environment and tooling, executing the conversion at scale, and validating the final production rollout.
Phase 1: Assessment and Tooling Setup
The platform conducted a technical assessment of existing SAS modules, documenting dependencies and analyzing code complexity to define the migration scope.
A module prioritization plan and tranche structure were created to guide execution. Data Lifter’s agentic architecture was configured with Claude Code for AI-assisted conversion and data movement.
A pilot conversion of 10 simple modules validated the approach and refined prompts. Testing infrastructure was also established to compare SAS and Python outputs before scaling.
Phase 2: Ongoing Conversion/Factory Model
After validation, the migration moved to large-scale execution using a factory model. The 2,500 SAS modules were grouped into 34 tranches of about 75 modules each, processed across four parallel PODs.
Each tranche progressed through three stages: AI-assisted SAS-to-Python code conversion, code refinement and integration to ensure production-grade quality, and automated testing against the SAS baseline.
Outputs were compared to confirm functional parity and performance before completing each tranche.
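The tranche structure above can be sketched as a simple batching scheme (illustrative only; the actual tranche assignment and POD scheduling logic are assumptions):

```python
def plan_tranches(modules: list[str], tranche_size: int = 75, pods: int = 4):
    """Group modules into fixed-size tranches and assign PODs round-robin."""
    tranches = [modules[i:i + tranche_size]
                for i in range(0, len(modules), tranche_size)]
    return [{"tranche": t + 1, "pod": (t % pods) + 1, "modules": batch}
            for t, batch in enumerate(tranches)]

# 2,500 modules at ~75 per tranche yields 34 tranches
# (the last tranche is smaller), spread across 4 PODs.
plan = plan_tranches([f"sas_module_{i:04d}" for i in range(1, 2501)])
```

Each planned tranche then flows through the three stages described above: conversion, refinement, and automated testing against the SAS baseline.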
Phase 3: Final Integration and Cutover
The final phase focused on validating the new environment and preparing the system for production use.
This included system integration testing, performance validation at scale, and user acceptance testing with business teams. Once validation was complete, the production environment was prepared and the cutover plan was executed.
Following deployment, the platform supported a smooth transition, with hypercare in place to address post-launch issues.
Structured Execution Model for Migrating 2,500 SAS Modules
Achieved End-to-End Legacy SAS Modernization
The migration went beyond replacing SAS. Data Lifter helped establish a scalable analytics foundation while preserving the business logic and reliability that the organization’s operations depend on.
Accelerated Migration
AI-assisted conversion automated nearly 70% of the code translation process.
Higher Engineering Productivity
The conversion increased productivity to ~20,000 LOC per week, compared to ~6,500 LOC per week with manual migration.
Quality and Functional Parity
Parallel validation ensured functional parity between SAS and Python outputs. Automated comparisons verified accuracy before production cutover.
Scalable Delivery Model
A factory model used four parallel PODs across 34 tranches. This structure maintained consistent throughput and preserved quality controls throughout the migration.
A Modernized Data Analytics Foundation
Teams can now build and deploy analytics workflows faster while maintaining the business logic and reliability the organization depends on. Data Lifter established a foundation for faster, more adaptable analytics development.
About Indium
Indium is an AI-driven digital engineering company that helps enterprises build, scale, and innovate with cutting-edge technology. We specialize in custom solutions, ensuring every engagement is tailored to business needs with a relentless customer-first approach. Our expertise spans Generative AI, Product Engineering, Intelligent Automation, Data & AI, Quality Engineering, and Gaming, delivering high-impact solutions that drive real business impact.
With 5,000+ associates globally, we partner with Fortune 500, Global 2000, and leading technology firms across Financial Services, Healthcare, Manufacturing, Retail, and Technology, driving impact in North America, India, the UK, Singapore, Australia, and Japan to keep businesses ahead in an AI-first world.