Scammers are targeting residents of Russia by posing as researchers in order to record their voices and create deepfakes for fraudulent schemes. This tactic, reported by RIA Novosti, involves criminals initiating calls under the guise of social surveys to capture voice samples that are later used to clone the victim's voice.
To clone a voice successfully, fraudsters need to keep the target talking for at least 20 seconds.
The resulting deepfakes can then be used in a range of scams, such as asking the victim's acquaintances for money or impersonating the victim's boss or a romantic interest in "fake boss" and dating schemes. These AI-generated messages are becoming increasingly convincing, with more natural voice intonation and better video quality.
According to Yevgeny Yegorov, a leading analyst at F6, attackers have been actively using AI-driven voice deepfakes for several years. He notes that the tools for creating fake audio and video are widely accessible and require no specialized technical expertise.
Fraudsters are also using AI to mimic a specific person's communication style by feeding a chatbot that person's message history.
To avoid falling victim to these scams, experts advise verifying the identity of anyone who contacts you by calling them back or reaching them through a different communication channel.