Can you trust your ears? AI voice scams rattle US
Can you trust your ears? AI voice scams rattle US / Photo: Chris Delmas - AFP

The voice on the phone seemed frighteningly real -- an American mother heard her daughter sobbing before a man took over and demanded a ransom. But the girl was an AI clone and the abduction was fake.

The biggest peril of artificial intelligence, experts say, is its ability to demolish the boundaries between reality and fiction, handing cybercriminals a cheap and effective technology to propagate disinformation.

In a new breed of scams that has rattled US authorities, fraudsters are using strikingly convincing AI voice cloning tools -- widely available online -- to steal from people by impersonating family members.

"Help me, mom, please help me," Jennifer DeStefano, an Arizona-based mother, heard a voice saying on the other end of the line.

DeStefano was "100 percent" convinced it was her 15-year-old daughter in deep distress while away on a skiing trip.

"It was never a question of who is this? It was completely her voice... it was the way she would have cried," DeStefano told a local television station in April.

"I never doubted for one second it was her."

The scammer who took over the call, which came from a number unfamiliar to DeStefano, demanded up to $1 million.

The AI-powered ruse was over within minutes when DeStefano established contact with her daughter. But the terrifying case, now under police investigation, underscored the potential for cybercriminals to misuse AI clones.

- Grandparent scam -

"AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

A simple internet search yields a wide array of apps, many available for free, that can create an AI voice from a small sample -- sometimes only a few seconds -- of a person's real voice, which can easily be lifted from content posted online.

"With a small audio sample, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls," Khaled said.

"Scammers can employ different accents, genders, or even mimic the speech patterns of loved ones. [The technology] allows for the creation of convincing deep fakes."

In a global survey of 7,000 people across nine countries, including the United States, one in four said they had experienced an AI voice cloning scam or knew someone who had.

Seventy percent of respondents said they were not confident they could "tell the difference between a cloned voice and the real thing," according to the survey, published last month by US-based McAfee Labs.

American officials have warned of a rise in what is popularly known as the "grandparent scam" -- where an imposter poses as a grandchild in distress and in urgent need of money.

"You get a call. There's a panicked voice on the line. It's your grandson. He says he's in deep trouble —- he wrecked the car and landed in jail. But you can help by sending money," the US Federal Trade Commission said in a warning in March.

"It sounds just like him. How could it be a scam? Voice cloning, that's how."

In the comments beneath the FTC's warning were multiple testimonies of elderly people who had been duped that way.

- 'Malicious' -

That also mirrors the experience of Eddie, a 19-year-old in Chicago, whose grandfather received a call from someone who sounded just like his grandson, claiming to need money after a car accident.

The ruse, reported by McAfee Labs, was so convincing that his grandfather urgently began scraping together money and even considered re-mortgaging his house before the lie was discovered.

"Because it is now easy to generate highly realistic voice clones... nearly anyone with any online presence is vulnerable to an attack," Hany Farid, a professor at the UC Berkeley School of Information, told AFP.

"These scams are gaining traction and spreading."

Earlier this year, AI startup ElevenLabs admitted that its voice cloning tool could be misused for "malicious purposes" after users posted a deepfake audio clip purporting to be actor Emma Watson reading Adolf Hitler's autobiographical manifesto "Mein Kampf."

"We're fast approaching the point where you can't trust the things that you see on the internet," Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, told AFP.

"We are going to need new technology to know if the person you think you're talking to is actually the person you're talking to," he said.

T.Jamil--DT