Commonwealth Bank investigates suspected AI home loan fraud

Commonwealth Bank's suspected mortgage fraud raises questions about banking controls, AI regulation and customer safeguards, writes Professor Toby Walsh

The Commonwealth Bank reportedly suspects around A$1 billion in home loans were obtained fraudulently, including through AI-generated documents. The Australian Financial Review says the bank has reported the matter to police and the corporate watchdog for investigation.

According to sources quoted in the newspaper, Australia’s largest bank discovered the suspected fraud last year, partly thanks to two whistleblowers. After rival bank NAB was allegedly defrauded of around $150 million, the Commonwealth Bank also reportedly began investigating its own loans. Its Australian home loans alone are worth around $634 billion.

While the bank is yet to make any detailed comment on the case, a Commonwealth Bank spokesman said the industry faced “sustained and increasing levels of attempted fraud, driven by criminals who actively evolve their methods”.

UNSW Sydney Scientia Professor of Artificial Intelligence Toby Walsh says people should be asking their own bank: Have you uncovered fraud like this in your loan book? And what are you doing about it? Photo: UNSW Business School

Even though I’ve been warning about the need for AI companies to do more to stop facilitating crime, the sheer scale of this suspected fraud still surprised me. We should assume criminals won’t only have been targeting the Commonwealth Bank and NAB, but that they’re trying all the banks.

This case has implications for all of us: from individuals to business owners wanting to avoid being fooled by fake AI invoices, to the banks, our government regulators and the AI companies themselves.

Don’t panic – but expect tighter security

First of all, given the Commonwealth Bank has 17 million customers, let’s be clear: this won’t be a $1 billion loss for the bank. From what we’ve heard so far, the bank should be able to recover a significant amount of this money. These loans are reportedly being paid off, and there are bricks-and-mortar properties to sell if needed as well.

But even for a bank as big as the Commonwealth, $1 billion is no small change. After suspected fraud on this scale, I expect we will see all banks ramp up their security.


As customers, we should expect to be asked to do more to secure our accounts and transactions. We’re also increasingly likely to need to use biometric authentication (such as facial recognition) and two-factor authentication.

I also think it’s likely to mean that, in future, we’ll need to go into the bank to show ourselves to a real person along with our original documents. That will be a lot less convenient than just providing certified copies to a mortgage broker. However, it’s also a lot more secure. That way, the bank can see the real, physical passport, with its holograms and stamps, which are hard to reproduce.

Faking financial or identification documents with AI is now free and easy. For example, only last year we heard how ChatGPT could be used to forge passports.

Given that the Commonwealth Bank is reportedly investigating the role of mortgage brokers and others in this suspected fraud, it’s likely we’ll see banks make mortgage brokers go through more hoops, too. 

And the Commonwealth isn’t the only bank offering loans. So people should be asking their own bank: Have you uncovered fraud like this in your loan book? And what are you doing about it?

Faking financial or identification documents with AI is now free and easy, with ChatGPT being used to forge passports. Photo: Adobe Stock

What regulators and governments need to do

As well as being used for fraud, AI is used by banks to detect and catch scammers.

AI can be very helpful in identifying unusual patterns – for instance, a mortgage broker suddenly submitting three times as many home loan applications as usual.

But fraud on this scale, affecting Australia’s biggest bank, shows the federal government can no longer insist we don’t need any new AI regulation. We simply don’t have adequate safeguards in place.

Rethinking how we pay bills and do business

Whether you’re a business owner or an individual, if someone sends you a large invoice to pay, don’t pay it until you’re sure it’s real.

It’s just so easy to “spoof” (mimic) someone’s web address, email or invoice, especially the first time you’re paying someone. We’ve seen too many cases of “middleman” (man-in-the-middle) attacks, where criminals get between a person and the company they’re trying to pay, then change the bank details.


There are some terrible stories about people who transferred a house deposit to what they thought was their solicitor’s account. But the details had been changed – and they lost their whole deposit. My rule of thumb: whenever it’s a first-time payment or a sum of money large enough to really hurt you, phone whoever you’re paying and confirm their bank details are correct.

Toby Walsh is a Laureate Fellow and Scientia Professor of Artificial Intelligence in the School of Computer Science and Engineering at UNSW Sydney. A version of this post first appeared on The Conversation.