The overwhelming majority of Canadian business leaders say they are concerned about the risks associated with AI-generated deepfake scams, according to a new survey. 

KPMG Canada surveyed 300 Canadian organizations that had been victimized by fraud and found that 95 per cent of leaders surveyed said they are very concerned that deepfakes have increased the risk of fraud at their organization. Another 91 per cent of respondents said they worry that generative AI will give criminals greater ability to run corporate misinformation and disinformation campaigns using deepfake technology.

"Respondents overwhelmingly told us the fraud landscape is becoming more complex, with 95 per cent saying generative AI and social engineering scams make it easier for fraudsters to deceive, manipulate, misrepresent and conceal their crime. As fraudsters are becoming increasingly sophisticated in their attack methods, it's more and more challenging to deter criminals," Enzo Carlucci, national forensic leader at KPMG in Canada, said in a press release Thursday. 

The survey also found that 31 per cent of organizations that had experienced external fraud were targeted by misinformation or disinformation efforts, which included false or misleading information circulating on social media. 

"Organizations need to find new ways to strengthen their anti-fraud programs and stay one step ahead of scammers, or else they could be facing increased financial, legal, regulatory and reputational risks," Carlucci said.

Just under half of respondents, 47 per cent, said their organization is actively using new technologies – such as AI, automation or biometrics – to reduce the risk of fraud.

The survey also found that nine in 10 Canadian companies lost up to five per cent of their profits to fraud and crime-related incidents over the previous 12 months.

"In the current economic environment, many companies are struggling to stay profitable, so any profits that are lost to fraud is too much," Carlucci said.