This announcement builds on the bank’s bid to reduce technology-facilitated abuse in light of research¹ showing 1 in 4 Australian adults have experienced financial abuse from their partner.
Commonwealth Bank is taking steps to help reduce technology-facilitated abuse internationally by making its artificial intelligence and machine learning techniques available, free of charge, to any bank in the world.
The AI model helps to identify digital payment transactions that include harassing, threatening or offensive messages – referred to as technology-facilitated abuse.
CBA Group Customer Advocate Angela MacMillan said: “Financial abuse occurs when money is used to gain control over a partner and is one of the most powerful ways to keep someone trapped in an abusive relationship. Sadly, we see that perpetrators use all kinds of ways to circumvent existing measures, such as using the messaging field to send offensive or threatening messages when making a digital transaction.
“We developed this technology because we noticed that some customers were using transaction descriptions as a way to harass or threaten others.
“By using this model we can scan unusual transactional activity and identify patterns and instances deemed to be high risk so that the bank can investigate these and take action.”
The model detects around 1,500 high-risk cases annually.
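CBA has not published the internals of its model, but the general approach described above, scoring free-text transaction descriptions and escalating high-risk ones for human review, can be illustrated with a minimal sketch. The Python example below is a generic stand-in using TF-IDF features and logistic regression, not the bank’s actual implementation; all training data, names and thresholds are hypothetical.

```python
# Illustrative sketch only: a generic text classifier standing in for CBA's
# unpublished model. Training data and the review threshold are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled transaction descriptions: 1 = abusive, 0 = benign.
train_texts = ["you can't hide from me", "rent for march", "answer my calls or else", "pizza night"]
train_labels = [1, 0, 1, 0]

# A simple risk-scoring pipeline over character n-grams, which tolerates the
# deliberate misspellings often used to evade word-based filters.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(),
)
model.fit(train_texts, train_labels)

def flag_high_risk(description: str, threshold: float = 0.8) -> bool:
    """Return True if the description's abuse score warrants human review."""
    score = model.predict_proba([description])[0][1]
    return score >= threshold
```

In practice, flagged transactions would be queued for manual review rather than acted on automatically, consistent with the bank’s description of investigating high-risk instances before taking action.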
“By sharing our source code and model with any bank in the world, we can give financial institutions better visibility of technology-facilitated abuse. This can help inform the action a bank may choose to take to protect its customers,” said Ms MacMillan.
The use of AI demonstrates how innovative technology can create a safer banking experience for all customers, especially those in vulnerable circumstances.
The model and source code are available this week on GitHub, the world’s largest platform for hosting source code. The model was built by CBA, and the source code was developed in partnership with the bank’s exclusive partner and global AI leader, H2O.ai.
The AI model complements the bank’s automatic block filter, introduced in 2020 across its digital banking channels, to stop transaction descriptions that include threatening, harassing or abusive language.
In a bid to combat technology-facilitated abuse, the bank has implemented:
An automatic filter that blocks abusive, threatening or offensive words in digital payment transactions. It has blocked nearly 1 million transactions since it was implemented in 2020; a simplified sketch of this kind of filter follows this list
AI and machine learning to detect more insidious forms of abuse in transactions. The bank can then manually review these instances and take action. The model is fully operational and has detected more than 1,500 cases per year since being implemented in 2021
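As a rough illustration of the first measure above, the sketch below shows how an automatic filter of this kind might reject a payment whose description contains a blocked term. The word list, matching rules and function name are hypothetical and do not describe the bank’s actual filter.

```python
import re

# Hypothetical blocklist; the bank's real list and matching rules are not public.
BLOCKED_TERMS = {"kill", "worthless", "watching you"}

def is_description_allowed(description: str) -> bool:
    """Reject a payment description if it contains any blocked term (case-insensitive)."""
    text = description.lower()
    return not any(re.search(r"\b" + re.escape(term) + r"\b", text) for term in BLOCKED_TERMS)

# Example: the second payment would be blocked before it reaches the recipient.
assert is_description_allowed("rent for march")
assert not is_description_allowed("I am watching you")
```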
This announcement follows the bank’s pilot with the NSW Police earlier this year to refer perpetrators of financial abuse to the police, with customer consent.
For further details on Next Chapter, visit: commbank.com.au/nextchapter
Anyone worried about their finances because of domestic or family violence or coercive control can contact the Next Chapter Team on 1800 222 387 for support – no matter who they bank with.
If you or someone you know is experiencing domestic or family violence, call 1800RESPECT (1800 737 732) or visit www.1800RESPECT.org.au.
In an emergency or if you’re not feeling safe, always call 000.
¹ May 2023 community attitudes survey of more than 10,000 Australians about financial abuse, commissioned by Commonwealth Bank of Australia ABN 48 123 123 124 and conducted online by YouGov.