
Standard Chartered Bank partners with Truera to tackle unjust bias in decision making


Standard Chartered has partnered with Truera, a US-based startup, to use its model intelligence platform to improve model quality and increase trust by analysing models and helping to identify and eliminate unjust bias in the decision-making process.

The Bank is an active proponent of using artificial intelligence and data analytics to better support clients and stakeholders, and of doing so in a responsible and trustworthy way that adheres to its pillars of fairness, ethics, transparency and self-accountability.

Sam Kumar, Global Head, Analytics and Data Management, Standard Chartered, said: “New developments in analytical technology and expanding usage of data require us to fundamentally rethink how we demonstrate ongoing adherence to our pillars and tackle the issue of unjust bias.”

Machine learning, which makes it quicker and easier to analyse large amounts of data and identify patterns and trends, leads to better performance and risk management when used correctly. However, because machine learning models are built using complex automated algorithms, they can act like a black box: it is often challenging for data scientists to explain in detail how decisions are made, and validating a model’s effectiveness can take longer.
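To make the black-box point concrete, the sketch below is a purely illustrative example, unrelated to Truera’s platform or the Bank’s actual models: it uses permutation importance, a common diagnostic that measures how much a classifier’s accuracy drops when each input feature is shuffled, to show which inputs a model leans on. The feature names and data are synthetic assumptions.

```python
# Illustrative only: probing a "black box" classifier with permutation importance.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a credit-decisioning dataset (feature names are hypothetical).
X, y = make_classification(n_samples=2000, n_features=5, n_informative=3, random_state=0)
feature_names = ["income", "credit_history_len", "utilisation", "num_accounts", "age_of_file"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Shuffle one feature at a time and record the drop in accuracy;
# larger drops indicate features the model relies on more heavily.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for name, score in sorted(zip(feature_names, result.importances_mean), key=lambda p: -p[1]):
    print(f"{name:>20}: {score:.3f}")
```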

Mindful of the need to ensure that data is used ethically, and with a vision to scale up its use of machine learning for core credit decisioning across multiple markets, Standard Chartered challenged the Truera team to help create a solution that gives greater insight into the machine learning decision-making process, including the ability to identify, and therefore mitigate, unjust bias.

Vaman S, Chief Risk Officer, Retail Banking, Standard Chartered, said: “While there are a number of companies that are exploring the issue of explainability with AI models, we were impressed by the strong academic background of the Truera team and their commitment to helping companies translate responsible AI principles into best practice.”

Truera collaborated closely with the Bank’s retail analytics, risk, digital and technology teams on a pilot that focused on one of the Bank’s challenger credit decisioning algorithms, which uses a combination of traditional data and, with clients’ consent, alternative data.

A first for the Bank, the industry-leading solution that has been developed works across multiple machine learning platforms and can pinpoint the specific variables that influence risk scoring. It can also look for correlations between seemingly impartial variables that may act as proxies for demographic indicators such as race or gender, which could introduce unjust bias and result in unfair decisions.
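As a hedged illustration of the proxy-variable idea (not the Bank’s or Truera’s actual method, and built on entirely synthetic data), one simple diagnostic is to train a probe model that tries to predict a protected attribute from the supposedly neutral inputs; if the probe predicts it well, those inputs are likely acting as proxies and warrant review.

```python
# Illustrative only: checking whether "neutral" features act as proxies for a protected attribute.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
gender = rng.integers(0, 2, n)                            # protected attribute (synthetic)
occupation_code = gender * 0.8 + rng.normal(0, 0.5, n)    # correlated "neutral" feature
income = rng.normal(50, 10, n)                            # genuinely unrelated feature
X = np.column_stack([occupation_code, income])

# Probe model: predict the protected attribute from the model's non-protected inputs.
X_tr, X_te, g_tr, g_te = train_test_split(X, gender, random_state=0)
probe = LogisticRegression().fit(X_tr, g_tr)
auc = roc_auc_score(g_te, probe.predict_proba(X_te)[:, 1])

# AUC near 0.5 means the features carry little information about the attribute;
# values well above 0.5 flag potential proxy variables worth reviewing.
print(f"proxy-detection AUC: {auc:.2f}")
```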

The Bank will now work with Truera to further develop the software and explore its application across a range of AI use cases.

Vishu Ramachandran, Group Head, Retail Banking, Standard Chartered, said: “Ensuring transparency and explainability in AI-based decision making is not just a competitive advantage for us, but also the right thing to do by our clients. Our partnership with Truera will help us better explain and justify our models, support us in building a stronger and more sustainable business, and give confidence to both customers and regulators in the fairness of our data-driven processes and outcomes.”

Anupam Datta, Co-Founder, President, and Chief Scientist, Truera, said: “We are excited to partner with Standard Chartered to help analyse, improve, and build trust in their AI models with the Truera Model Intelligence Platform. We appreciate the pioneering and leadership role that Standard Chartered is taking with respect to the responsible adoption of AI across all its functions.”
