The News Glory
The increasingly frequent use of cloned family voices with artificial intelligence to carry out telephone scams (and how to protect yourself)

By The News Glory
14/07/2023
Artificial intelligence (AI) is being used to make calls in which the voices of known people are imitated to scam the recipient.

These calls use what is known as generative AI: systems capable of creating text, images or other media, such as video, from a user’s instructions.

Deepfakes have gained notoriety in recent years through a series of high-profile incidents, such as the use of the likeness of British actress Emma Watson in a series of suggestive ads that appeared on Facebook and Instagram.

There is also the widely shared and debunked video from 2022, in which Ukrainian President Volodymyr Zelensky appeared to tell Ukrainians to “lay down their arms.”

Now the technology to create fake audio, a realistic copy of a person’s voice, is becoming increasingly common.

Getty Images
Fraudsters can find the necessary material to perpetrate their scams on the internet.

Giving weapons to the enemy

To create a realistic copy of someone’s voice, data is needed to train the algorithm. This means having many audio recordings of the person’s voice.

The more examples of the person’s voice that can be fed into the algorithm, the better and more convincing the final copy will be.

Many of us already share details of our daily lives on the internet. This means that the audio data needed to create a realistic copy of a voice could be readily available on social networks.

But what happens once the copy is out there? What is the worst that can happen? A deepfake algorithm could allow anyone who owns the data to make “you” say whatever they want.

In practice, this can be as simple as typing some text and having the computer say it out loud as if it were your voice.

Getty Images
In 2022, a fake AI-made video of President Zelensky circulated in which he asked Ukrainians to surrender to the Russians.

The main challenges

This possibility risks increasing the prevalence of misinformation. It can be used to try to influence international or national public opinion, as seen with the Zelensky “videos”.

But the ubiquity and availability of these technologies also pose important challenges at the local level, especially in the growing trend of AI-powered scam calls.

Many of us have received a scam or phishing call telling us, for example, that our computer has been compromised and that we need to log in immediately, which could give the caller access to our data.

The deception is often easy to spot, especially when the caller asks questions and requests information that someone from a legitimate organization would not.

However, now imagine that the voice on the other end of the phone is not a stranger’s but sounds exactly like a friend or loved one. This injects a whole new level of complexity, and of panic, for the unfortunate recipient.

A recent story reported by CNN highlights an incident in which a mother received a call from an unknown number. When she answered the phone, it was her daughter. The daughter had allegedly been kidnapped and was calling her mother to demand a ransom.

In fact, the girl was safe and sound. The scammers had made a deepfake of her voice.

This is not an isolated incident, and the scam has appeared with variations, including an alleged car accident in which the supposed victim calls their family to ask for money to cover the costs of the accident.

A woman holds her cell phone in one hand and her credit card in the other.

Getty Images
Experts recommend that people be very alert and not make hasty decisions when receiving an unexpected call.

Old trick with new technology

This is not a new scam in and of itself. The term “virtual kidnapping scam” has been around for several years. And it can take many forms, but one of the most common is tricking victims into paying a ransom to free a loved one they believe is threatened.

The scammer tries to create a sense of urgency and get the victim to pay the ransom quickly, before they discover they have been tricked.

However, the rise of powerful and readily available AI technologies has upped the ante significantly and made things more personal.

It’s one thing to hang up on an anonymous caller, but it takes a lot of trust to hang up on someone who sounds like a child or partner.

There is software that can be used to identify fakes and that creates a visual representation of the audio called a spectrogram. When you’re listening to the call, it may seem impossible to distinguish it from the real person, but the voices can be differentiated when spectrograms are analyzed side by side.

At least one group has offered downloadable detection software, though such solutions may still require some technical knowledge to use.
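The spectrogram comparison described above can be illustrated in a few lines of Python. This is only a toy sketch, not real detection software: it uses synthetic sine tones in place of actual voice recordings, and a raw distance between spectrograms in place of the specialised analysis that genuine tools perform. It assumes the `numpy` and `scipy` libraries are available.

```python
# Toy illustration: comparing two audio clips via their spectrograms.
# Real detection tools do far more; this only shows that differences
# invisible to the ear can appear when frequency content is compared.
import numpy as np
from scipy.signal import spectrogram

SAMPLE_RATE = 16_000  # Hz, a common sample rate for speech audio

def make_tone(freq_hz, seconds=1.0):
    """Generate a sine wave standing in for a voice recording."""
    t = np.linspace(0, seconds, int(SAMPLE_RATE * seconds), endpoint=False)
    return np.sin(2 * np.pi * freq_hz * t)

real_voice = make_tone(220.0)    # placeholder for a genuine recording
cloned_voice = make_tone(225.0)  # placeholder for a suspected clone

# A spectrogram maps each short window of audio to its frequency content.
_, _, spec_real = spectrogram(real_voice, fs=SAMPLE_RATE)
_, _, spec_clone = spectrogram(cloned_voice, fs=SAMPLE_RATE)

# Side by side, even small differences show up as a measurable distance
# between the two frequency "fingerprints".
distance = np.linalg.norm(spec_real - spec_clone)
print(f"Spectral distance between the two recordings: {distance:.4f}")
```

The two tones differ by only 5 Hz, close enough that many listeners would struggle to tell them apart, yet the distance between their spectrograms is clearly non-zero, which is the basic idea behind side-by-side analysis.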

Image of a voice recognition program.

Getty Images
Some companies are developing computer programs that allow real voices to be compared with those suspected of being generated by AI.

Most people won’t be able to generate spectrograms, so what do you do when you’re not sure whether what you’re hearing is real? As with any other means of communication, you have to be skeptical.

If you get an unexpected call from a loved one asking for money or making requests that seem out of place, call them back or send them a text to confirm you’re really talking to them.

As AI capabilities expand, the lines between fact and fiction are becoming more blurred. And we’re not likely to be able to stop that technology. This means that people will have to become more cautious.

*Oliver Buckley is Associate Professor of Cybersecurity at the University of East Anglia (UK) and has a degree in Computing and Computer Science from Liverpool and Welsh Universities.

*This article was published on The Conversation and is reproduced here under a Creative Commons license. Click here to read the original version.

Copyright © 2023 The News Glory - All Rights Reserved
