Warning social media videos could be exploited by scammers to clone voices

Consumers have been warned that their social media videos could be exploited by scammers to clone their voices with AI and then trick their family and friends out of cash.

Scammers look for videos that have been uploaded online and need only a few seconds of audio to replicate how the target talks. They then call or send voicemails to friends and family, asking them to send money urgently.

Research released by the digital lender Starling Bank found that 28% of people had been targeted by an AI voice cloning scam at least once in the past year. However, 46% of people did not even know this type of scam existed, and 8% said they would be likely to send whatever money was requested, even if they thought the call from their loved one seemed strange.

Lisa Grahame, chief information security officer at Starling Bank, said: “People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters.”

The lender is now suggesting that people use a safe phrase with close friends and family to check whether a call is genuine.

“Scammers only need three seconds of audio to clone your voice, but it would only take a few minutes with your family and friends to create a safe phrase to thwart them,” Grahame said. “So it’s more important than ever for people to be aware of these types of scams being perpetrated by fraudsters, and how to protect themselves and their loved ones from falling victim.”

There is always a chance that a safe phrase could be compromised. Anyone wary of a voice call or message could also contact a trusted friend or family member to sense-check the request, or call 159 to speak directly to their bank.

The UK’s cybersecurity agency said in January that AI was making it increasingly difficult to identify phishing messages, where users are tricked into handing over passwords or personal details.

These increasingly sophisticated scams have even managed to dupe big international businesses.

Hong Kong police began an investigation in February after an employee at an unnamed company claimed she had been duped into paying HK$200m (£20m) of her firm’s money to fraudsters in a deepfake video conference call impersonating senior officers of the company. The fraudsters are believed to have downloaded videos in advance and then used artificial intelligence to add fake voices for use in the video conference.

Lord Hanson, Home Office minister with responsibility for fraud, said: “AI presents incredible opportunities for industry, society and governments, but we must stay alert to the dangers, including AI-enabled fraud.”
