Case Study

High-Velocity Lending Decisions with ML-Powered Risk Scoring

Executive Summary

A digital lending platform required real-time credit risk scoring integrated into its loan application process, with responses under 500ms. We built a high-performance SQL Server solution that aggregates and processes data from multiple internal and external sources. A single stored procedure performs feature engineering, selects the appropriate predictive model, and submits data to an ML inference endpoint via SQL CLR. This design avoids the latency of an intermediary service layer, supports multiple model endpoints, and enables model updates without system redeployment. The result is fast, flexible, and scalable risk scoring—delivering intelligent decisions directly within the database where the data resides.

Client Background

The client is a leading online lending institution focused on delivering fast, accessible, and data-driven consumer credit solutions through a fully digital platform. The company serves thousands of applicants via an automated loan application process. With a strong commitment to leveraging technology, the organization continuously invests in advanced tools to enhance decision-making, reduce credit risk, and improve the overall borrower experience.

Business Challenge

The client needed to enhance its credit risk assessment process by integrating real-time machine learning predictions. The existing scoring process, based on business rules and periodically retrained offline models, was limited in data scalability and responsiveness.

The key challenges were:

  • Real-time requirements: Full scoring process, including aggregation, transformation, and model inference, needed to complete in under 500ms.
  • Fragmented data: Applicant data was spread across internal systems and external credit and behavioral sources.
  • Model complexity: Multiple predictive models, each with distinct input schemas, were in use.
  • Operational flexibility: The solution had to support dynamic updates without backend changes or downtime.
  • Infrastructure simplicity: A microservice-based approach risked adding unnecessary complexity and latency.

Over 30 predictive features—including behavioral indicators and financial metrics—needed to be computed per applicant in real time and used as input to the appropriate model endpoint.

By contrast, a microservice-based approach would typically introduce:

  • Network hops
  • Serialization/deserialization overhead
  • External dependency failures

Value Delivered

We delivered a high-performance, database-native risk scoring engine with sub-250ms latency. It enabled accurate, adaptive decisions at scale, eliminated microservice overhead, and reduced model deployment time from days to minutes—enhancing customer experience and operational agility.

Success Story in Detail

Business Challenge: Efficient implementation of ML capabilities in online lending system

The goal was to implement real-time machine learning efficiently in a high-throughput lending platform, without impacting customer experience or operational simplicity.

Our Approach: Deliver database-native solution leveraging SQL Server stored procedure

We proposed a database-native solution using SQL Server stored procedures, memory-optimized tables, and SQL CLR to manage data aggregation, feature engineering, model selection, and real-time inference.

  • Stored procedures enabled transactional consistency and speed.
  • Memory-optimized temp tables supported high-throughput, low-latency transformations.
  • Logic remained flexible for rapid adaptation to changing model needs.
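As a minimal sketch of the in-memory approach (the type and column names below are illustrative, not the client's actual schema), a memory-optimized table type can hold the per-applicant feature vector during scoring:

```sql
-- Hypothetical memory-optimized table type for the per-applicant
-- feature vector; avoids tempdb disk overhead during transformations.
CREATE TYPE dbo.ApplicantFeatures AS TABLE
(
    ApplicantId  INT            NOT NULL,
    FeatureName  NVARCHAR(128)  NOT NULL,
    FeatureValue DECIMAL(18, 6) NULL,
    PRIMARY KEY NONCLUSTERED (ApplicantId, FeatureName)
)
WITH (MEMORY_OPTIMIZED = ON);
```

Table variables declared with such a type live entirely in memory, which is what keeps the intermediate transformation steps off the latency-critical path.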

Implementation: Unified stored procedure that dynamically aggregates data, supports multiple models, and integrates with DataRobot

A unified stored procedure aggregates data from internal and external sources, calculates 30+ features, determines which predictive model to call, and submits the request using an embedded HTTP client via SQL CLR. The response is parsed and stored within the same transaction, ensuring real-time accuracy and consistency—all without external service dependencies.
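A condensed sketch of how such a procedure might be structured is shown below. All names (the procedure, the feature-computation function, the routing table, and the CLR function) are hypothetical, and the real implementation computes 30+ features rather than the abbreviated flow here:

```sql
CREATE PROCEDURE dbo.ScoreApplication
    @ApplicantId INT
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRANSACTION;

    -- 1. Aggregate internal and external source data into features.
    DECLARE @Features dbo.ApplicantFeatures;          -- memory-optimized type
    INSERT INTO @Features (ApplicantId, FeatureName, FeatureValue)
    SELECT @ApplicantId, f.FeatureName, f.FeatureValue
    FROM dbo.fn_ComputeFeatures(@ApplicantId) AS f;   -- hypothetical TVF

    -- 2. Route to the model endpoint for this applicant's context.
    DECLARE @EndpointUrl NVARCHAR(400) =
        (SELECT TOP (1) EndpointUrl
         FROM dbo.ModelRegistry                        -- hypothetical table
         WHERE SegmentCode = dbo.fn_ApplicantSegment(@ApplicantId)
           AND IsActive = 1);

    -- 3. Serialize the feature vector, call the ML endpoint through the
    --    SQL CLR HTTP client, and persist the parsed score in the same
    --    transaction.
    DECLARE @Payload NVARCHAR(MAX) =
        (SELECT FeatureName, FeatureValue FROM @Features FOR JSON PATH);

    DECLARE @Score DECIMAL(9, 6) =
        dbo.clr_PostForScore(@EndpointUrl, @Payload);  -- CLR scalar function

    INSERT INTO dbo.RiskScores (ApplicantId, Score, ScoredAt)
    VALUES (@ApplicantId, @Score, SYSUTCDATETIME());

    COMMIT TRANSACTION;
END;
```

The single-transaction shape is the point: if the endpoint call or the write fails, nothing partial is persisted, which is what keeps scoring results consistent with the application record.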

Key System Components

  • Stored Procedure: Central orchestration layer managing data transformation, model logic, and output handling.
  • Memory-Optimized Temp Tables: Ensure efficient in-memory transformations with minimal disk overhead.
  • SQL CLR Integration: Enables HTTP communication from SQL to the ML endpoint securely and efficiently.
  • Model Routing Logic: Dynamically selects the appropriate model per applicant context.
  • External and Internal Data Integration: Combines multiple input streams into a unified feature vector per model schema.
  • Transactional Execution: All processing and scoring operations are wrapped in a single transaction for integrity.
  • Schema-Driven Flexibility: Logic updates can accommodate new model inputs without service redeployment.

This tightly coupled, in-database design ensures high performance, ease of maintenance, and adaptability to evolving model requirements.

Security and Privacy

All processing occurs within a secure database environment. No sensitive data is exposed to third-party services. API credentials are encrypted, and strict access controls are enforced in line with internal data protection policies and regional compliance standards.

Value Delivered: Improved latency, model adaptability, and overall system agility

Latency dropped below 250ms. Model updates became seamless. Scoring accuracy improved, and operational agility increased—with no need to manage or deploy additional infrastructure.

Conclusion

This project proved that high-performance, model-driven decision-making can happen entirely within the database—simplifying architecture, accelerating delivery, and enabling smarter lending without compromising speed or flexibility.

Industry
Fintech

Get a quote

Interested in the fintech software development solutions AOByte provides? 

Send us a message, and we will get back to you to discuss your goals and project scope.