Lost in Translation—
are risk model deployment challenges slowing you down?
If they are, you’re not alone.
The recent Rexer Data Science Survey found that only 10-15% of companies “almost always” successfully deploy analytics models.
If your organization isn’t in that top 15%, you’re probably already feeling two things:
- Frustration caused by deployment delays
- Pressure from above to make it happen
The cost of delayed or failed deployment
So, before I get into the challenges preventing rapid deployment and how organizations can overcome these hurdles, let’s answer the bigger question… why should we care about model deployment rates?
There are many reasons why a business needs to be able to deploy a new risk model quickly and easily, but there are a few that really stand out in today’s digital-first world. Rapid deployment:
- Drives business growth—Analytics models are a key part of a risk strategy; they help drive business growth by making risk decisioning more accurate, which means more approved customers and lower default rates.
- Improves customer experience—Customers now expect instant everything; risk and analytics models help businesses make real-time decisions and gain customers in increasingly competitive markets.
- Empowers competitive advantage—Companies that can test and deploy models quickly are able to make iterative changes to models using the most up-to-date data, making them better able to adapt to market demands.
Could you say that in Java, please?
One of the biggest reasons strategic analytics projects are often deployed late is the disconnect between the risk team and the development team.
The root of this developer-data scientist disconnect is that the two groups literally don’t speak the same language. The modeling languages of choice for data scientists are generally Python, R (both open source), and the proprietary SAS. These are not usually the languages preferred by developers, who favor Java, JavaScript, and variations of C such as C++.
So, typically data scientists create and test their analytics models—say a credit approval and verification application—using their own languages. This work is then handed to the development teams, who often spend a lot of time and costly effort recoding it into their own languages so the model can be tested for security, compliance, impact on the infrastructure, and so on. Any changes that need to be sent back to the data scientists for further review and approval kick off the same lengthy recoding process, only in reverse.
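To make that handoff concrete, here is a deliberately simplified sketch of the kind of scoring logic a risk team might prototype in Python. The rule, field names, weights, and threshold are hypothetical and purely illustrative.

```python
# Hypothetical, deliberately simplified credit-approval rule a data scientist
# might prototype in Python. Weights and threshold are illustrative only.
def approve_credit(income: float, utilization: float, months_on_book: int) -> bool:
    """Return True if the applicant clears a toy approval threshold."""
    score = (
        0.4 * min(income / 100_000, 1.0)        # capped income contribution
        + 0.4 * (1.0 - utilization)             # reward low credit utilization
        + 0.2 * min(months_on_book / 60, 1.0)   # reward account tenure
    )
    return score >= 0.5
```

Every time a rule like this changes, the Java or C++ version running in the risk engine has to be kept in sync by hand—which is exactly where the delays and transcription errors creep in.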
The result? Fast time-to-market goes out the window. And if projects are deployed late enough, market conditions often will have changed so much that the reasons for deploying in the first place no longer exist, and the project is essentially dead on arrival.
Data delays
Another culprit in the model development and deployment process is the fact that data is very often scattered across the organization in protected silos. This is particularly true in highly regulated industries like financial services, where security and privacy concerns meet compliance realities. Historical data may sit in one or more silos, and transactional and production data in others. Data scientists who need elements of all of this data have to root around to find it and gain access to it.
But that’s not all: the digitization of many types of data has led to a huge range of new data sources, many of which can be highly useful to data scientists predicting credit risk or fraudulent activity. As each new data source emerges, it needs to be integrated into the business’s decisioning solution before analytics models can use it.
While these integrations should be simple, many organizations struggle to create or update them because of inflexible technology that requires extensive hardcoding. Each new data source included in a model can therefore add lengthy delays, because its integration must be completed before the model can be fully tested and pushed to a live environment.
Say hello to your guide and translator: Platform technology
It’s fair to say that many of the delays to risk model deployment are caused by processes, not people. It’s also fair to say that the rapid advancement of technology has made it hard to keep up, with new analytics models constantly needed to tackle an ever-evolving market. So, what can you do about it?
Well, what if your process problems caused by technology, like having to translate models from one language to another, or manually updating hardcoded integrations, could be solved by technology?
So, instead of your risk team creating a model in one language and your dev team translating it into another for your risk engine, you could opt for a model-agnostic risk platform.
For data scientists and developers ‘speaking different languages’, a model-agnostic platform effectively removes the intermediate recoding steps between the two teams. Instead, data scientists can upload their models directly in their native languages, which also lets them take full advantage of new analytical techniques.
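As an illustration of what uploading a model in its native language can look like in practice—this is a minimal sketch, not any particular vendor’s API—a data scientist might train a model with scikit-learn and export it to a portable format such as ONNX, which a decisioning platform or a Java service can then execute without anyone recoding the model by hand.

```python
# Minimal sketch: train a toy credit-risk model in Python and export it to
# ONNX, a language-agnostic format. The data and feature set are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType

# Toy training data: [income, utilization, months_on_book] -> default flag
X = np.array([[52_000, 0.30, 48],
              [18_000, 0.85, 6],
              [75_000, 0.10, 120],
              [22_000, 0.95, 3]], dtype=np.float32)
y = np.array([0, 1, 0, 1])

model = LogisticRegression().fit(X, y)

# Convert to ONNX; the exported file carries the full scoring logic,
# so no manual re-implementation is needed downstream.
onnx_model = convert_sklearn(
    model, initial_types=[("features", FloatTensorType([None, 3]))]
)
with open("credit_risk_model.onnx", "wb") as f:
    f.write(onnx_model.SerializeToString())
```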
These types of platforms help ensure that analytics models actually reach production instead of being abandoned during prolonged development and deployment cycles.
Technology can also be an effective solution for data integration challenges, which both fintechs and traditional financial institutions still struggle with as a result of hardcoded connections that are often built to serve a specific purpose at a specific time.
Today’s digital market requires businesses to have agile technology that can be quickly updated or repurposed throughout the organization to meet many different needs.
For optimum flexibility and business agility it’s essential that data integrations can be created, used, reused, and updated quickly and easily. Again, integrations have traditionally relied heavily on over-burdened dev teams for what should be simple adjustments. Instead of following these traditional integration processes businesses now have the opportunity to use technology that empowers business users to handle the integration mapping process.
This means that the risk team can be far less reliant on the dev team for ongoing adjustments as they can easily map source data into analytics models.
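A minimal sketch of what such a mapping might look like (the field names and structure here are hypothetical): the risk team maintains a simple source-to-model field map, and the platform applies it at decision time, so adding or renaming a data source doesn’t require a development ticket.

```python
# Hypothetical source-to-model field mapping a business user could maintain
# without touching integration code. Field names are illustrative only.
FIELD_MAP = {
    "applicant.annual_income": "income",
    "bureau.revolving_utilization": "utilization",
    "core_banking.tenure_months": "months_on_book",
}

def map_source_record(source: dict) -> dict:
    """Pull mapped values out of a nested source record and rename them
    to the input names the analytics model expects."""
    def lookup(path: str):
        node = source
        for key in path.split("."):
            node = node[key]
        return node
    return {model_field: lookup(source_path)
            for source_path, model_field in FIELD_MAP.items()}

# Example: a nested record from two upstream systems becomes flat model input.
record = {
    "applicant": {"annual_income": 52_000},
    "bureau": {"revolving_utilization": 0.30},
    "core_banking": {"tenure_months": 48},
}
print(map_source_record(record))
# {'income': 52000, 'utilization': 0.3, 'months_on_book': 48}
```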
Gaining business agility through simplified model deployment processes
What this really comes down to is using technology to simplify business processes and empower people to do more. By using specialized software that removes steps from the model deployment process and reduces reliance on development teams, your risk teams are able to focus on current problems and on initiatives that drive business growth. They’re able to respond more quickly, make changes more easily, and implement a risk strategy far more efficiently.