Fraudsters keep getting smarter, finding new ways every day to deceive people. In the past, a common ploy was to hack your Facebook account or email, then message your friends, family, and everyone in your contacts pretending to be you, claiming you were in trouble and urgently needed money. Of course, that cheap trick only worked a small percentage of the time, but it seems the development of artificial intelligence has made it succeed frighteningly well.


Identity theft

Do you know why the impersonation scam didn't work so well? Because the scammer needed hacking skills to break into your social media account or email, and most of the time people didn't believe the message and would make a phone call to confirm the story. But what if your friend, or perhaps your brother, your wife, or someone close to you, calls you, speaks to you in their own voice, and asks to borrow some money because they need it? What will be the result? Of course, the trick works immediately and you send the money by any means available. This is exactly what has happened to many people recently: they thought their relatives were asking for help, but it turned out to be a fraud powered by artificial intelligence (AI).


The trick is powered by artificial intelligence

The fraudster uses artificial intelligence to sound like a family member who has fallen into trouble and needs financial assistance. According to a report published by the Washington Post, Ruth Card, a 73-year-old woman, received a call, and as soon as she heard the voice she immediately knew who it was: her grandson Brandon, who told her he was in jail without a phone or wallet and needed money for bail.

Ruth and her husband Greg immediately rushed to the nearest bank and withdrew about $3,000 (the maximum daily withdrawal), then hurried to a second bank to withdraw more money for their grandson. But the bank employee there calmed them down and told them that other customers had received calls from people they knew asking for money, and the voices had turned out to be fake. What if the person on the phone wasn't really their grandson?

That gave Ruth and Greg pause: it sounded like Brandon, but it wasn't him, and they realized they had been duped. "We were completely deceived, and we were convinced we were talking to Brandon," Ruth said.

For your information: the voice-cloning site is available to everyone and easy to use, though it does not support Arabic. The site is: https://elevenlabs.io

One company that specializes in this is ElevenLabs, an AI voice-generation startup founded in 2022 whose text-to-speech tool can turn a short audio sample into an artificially generated voice. The tool can be used for free, with paid plans starting at $5.
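To make this concrete, here is a minimal sketch of what a text-to-speech request to such a service can look like, modeled on the shape of ElevenLabs' public HTTP API at the time of writing. The API key, voice ID, and text below are placeholders, and the exact endpoint and fields may differ as the service evolves.

```python
# Hedged sketch of a text-to-speech request, modeled on the shape of
# ElevenLabs' public HTTP API. The API key and voice ID are placeholders.
import requests

API_KEY = "YOUR_API_KEY"    # issued after signing up on elevenlabs.io
VOICE_ID = "YOUR_VOICE_ID"  # a voice previously created or cloned on the site

response = requests.post(
    f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
    headers={"xi-api-key": API_KEY, "Content-Type": "application/json"},
    json={"text": "Hello, this is a demonstration of a synthetic voice."},
    timeout=30,
)
response.raise_for_status()

# The service returns raw audio bytes (MP3 by default), ready to play back.
with open("synthetic_voice.mp3", "wb") as f:
    f.write(response.content)
```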


How does the trick work?

Advances in artificial intelligence now allow scammers to replicate a voice using only a sample of a few phrases or sentences you have spoken. With cheap, or even free, online AI tools, the fraudster obtains a short recording of you, then has the AI say whatever he wants in the same tone as your voice.
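Under the hood, tools like this typically start by condensing the sample into a compact numerical "voiceprint" (a speaker embedding). As a minimal illustration of that first step, not any particular scammer's actual tooling, here is a sketch using the open-source resemblyzer Python library; the file name is a placeholder:

```python
# Minimal sketch: derive a speaker embedding ("voiceprint") from a short
# recording, using the open-source resemblyzer library
# (pip install resemblyzer). The file name is a placeholder.
from resemblyzer import VoiceEncoder, preprocess_wav

wav = preprocess_wav("short_sample.wav")  # a few seconds of speech is enough
encoder = VoiceEncoder()                  # pretrained speaker-encoder model
embedding = encoder.embed_utterance(wav)  # compact vector summarizing the voice

print(embedding.shape)  # (256,) — this vector is what a cloning model conditions on
```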

Although scams come in many forms, they all work the same way: the scammer impersonates a trusted person, a brother, a cousin, or a friend, and convinces the victim to send money because he is in distress.

But AI-powered voice technology makes the trick far more convincing. The newspaper's report bears this out: victims described a reaction of anxiety mixed with sheer horror when they believed their relatives were in danger. That is the sensitive chord the fraudster plays on. Emotion toward those we love works in the absence of logic, so we cannot think clearly, and our only goal becomes helping the people we care about.


Artificial intelligence

This trick reflects the technical progress we have reached and how far artificial intelligence has developed; it can do many wonderful things, and I may even be replaced by an AI that writes these articles for you in the future. Who knows?

A comment from the iPhone Islam site manager: We're already working on it, Walid 🤣

But as we know, everything has two sides, and the other side of artificial intelligence is its misuse by fraudsters and swindlers to imitate voices and convince victims, especially the elderly, that their loved ones and relatives are in distress and need financial assistance.

It is a grim side effect of the current wave of generative AI, the technology behind programs that generate text, images, or audio from the data they are fed. Advances in mathematics and computing power have improved the training of such models, prompting a large number of companies to launch chatbots, image-generation tools, and even voices that cannot be distinguished from the original.

One expert explained that AI voice-generating software analyzes what makes a person's voice unique, including age, gender, and accent, then searches a vast database of voices to find similar ones and predict patterns. It can then recreate a nearly identical tone of that person's voice. All it takes is a small audio sample, which can be pulled from places like YouTube, one of your Facebook videos, or Instagram.
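That description matches a standard technique: represent every voice as an embedding vector and measure how close two vectors are. Below is a minimal sketch of such a comparison, again using resemblyzer with placeholder file names. Note that the same check can be turned around as a defense, testing whether a suspicious recording really matches a known voice:

```python
# Sketch: compare two recordings by the similarity of their voiceprints.
# File names are placeholders.
import numpy as np
from resemblyzer import VoiceEncoder, preprocess_wav

encoder = VoiceEncoder()

# e.g. a clip scraped from YouTube, and a recording of a phone call
known = encoder.embed_utterance(preprocess_wav("youtube_clip.wav"))
candidate = encoder.embed_utterance(preprocess_wav("phone_recording.wav"))

# resemblyzer embeddings are L2-normalized, so the dot product equals the
# cosine similarity: values near 1.0 suggest the same speaker.
similarity = float(np.dot(known, candidate))
print(f"cosine similarity: {similarity:.2f}")
```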


Another story

The story began when Birkin's parents received a phone call from a man claiming to be their son's lawyer. He told them that Birkin had killed someone in a car accident and was now in jail, and that he needed some money for the case; then he said their son would speak to them shortly.

The lawyer put Birkin on the phone, and soon the mother heard Birkin's voice telling them he loved them and needed the money. A few hours later, the lawyer called Birkin's parents again, saying their son needed $21,000 before a court date later that day.

Birkin's parents said afterwards that the call had sounded unusual and they had felt some skepticism, but it was quickly dispelled by the gut feeling that they had really spoken to their son and that he needed their help.

Unfortunately, their next step was to rush out, withdraw the money, and send it to the scammer. The truth came out when their real son called them; only then did the story become clear, and they realized they had fallen for a trick while the fraudster escaped with their money. In case you are wondering how the scammer got Birkin's voice, the answer is: easily, through the internet. Birkin had posted a number of videos about snowboarding on YouTube, and most likely a fraudster will try to find an audio clip of you the same way, by searching YouTube, your social media accounts, or even the stories you post on TikTok or Instagram.

Finally, in order not to fall into the trap: if you receive a call from a voice claiming to be a member of your family who needs money, put the call on hold and try to contact that person directly to confirm the story. If you are still in doubt, ask the caller about things only the real person would know, to make sure it is not a fake voice.

Have you encountered this kind of scam before, and how did you deal with it? Tell us in the comments.

Source:

The Washington Post
