The United States Federal Bureau of Investigation (FBI) has warned of the rise of a 'deepfake' campaign in which cybercriminals edit and alter the images that users upload to their social networks.
A deepfake is a video, image or audio clip that has been manipulated using artificial intelligence with the aim of altering or replacing the original. The technique has become very popular in recent months due to the rise of AI and the growing number of platforms that are free and accessible to everyone.
The problem is that AI capabilities have improved considerably and the results look increasingly real. For this reason, the FBI has warned that it is increasingly common for cybercriminals to take the photos you have published on social networks such as Instagram or Twitter, make a 'deepfake' with them to create fake 'nudes' of you, and then extort money from you.
As the agency explains, the criminals take any photo and edit it so that it is sexual or nude in nature, then contact the user and demand a payment in exchange for not uploading the images to the internet or distributing them among the victim's contacts.
Normally the main objective is to receive a payment in exchange for not leaking these edited images, although the motivation can also be to obtain personal or confidential information from the victim.
Something similar recently happened to the artist Rosalía, who discovered that someone had made a 'deepfake' of her in which she supposedly appeared naked. It is important to note that this is a serious crime and something that should never be treated lightly.
Finally, the FBI recommends being discreet with the photos, videos or any other personal content we publish, applying privacy settings in these apps whenever possible, and running reverse image searches on our content.