Dubai Telegraph - Biden robocall: Audio deepfake fuels election disinformation fears


Biden robocall: Audio deepfake fuels election disinformation fears
Biden robocall: Audio deepfake fuels election disinformation fears / Photo: Roberto SCHMIDT - AFP

The 2024 White House race faces the prospect of a firehose of AI-enabled disinformation, with a robocall impersonating US President Joe Biden already stoking particular alarm about audio deepfakes.

"What a bunch of malarkey," said the phone message, digitally spoofing Biden's voice and echoing one of his signature phrases.

The robocall urged New Hampshire residents not to cast ballots in the Democratic primary last month, prompting state authorities to launch a probe into possible voter suppression.

It also triggered demands from campaigners for stricter guardrails around generative artificial intelligence tools or an outright ban on robocalls.

Disinformation researchers fear rampant misuse of AI-powered applications in a pivotal election year, thanks to proliferating voice cloning tools that are cheap, easy to use, and hard to trace.

"This is certainly the tip of the iceberg," Vijay Balasubramaniyan, chief executive and co-founder of cybersecurity firm Pindrop, told AFP.

"We can expect to see many more deepfakes throughout this election cycle."

A detailed analysis published by Pindrop said a text-to-speech system developed by the AI voice cloning startup ElevenLabs was used to create the Biden robocall.

The scandal comes as campaigners on both sides of the US political aisle harness advanced AI tools for effective campaign messaging and as tech investors pump millions of dollars into voice cloning startups.

Balasubramaniyan refused to say whether Pindrop had shared its findings with ElevenLabs, which last month announced a financing round from investors that, according to Bloomberg News, gave the firm a valuation of $1.1 billion.

ElevenLabs did not respond to repeated AFP requests for comment. Its website leads users to a free text-to-speech generator to "create natural AI voices instantly in any language."

Under its safety guidelines, the firm said users were allowed to generate voice clones of political figures such as Donald Trump without their permission if they "express humor or mockery" in a way that makes it "clear to the listener that what they are hearing is a parody, and not authentic content."

- 'Electoral chaos' -

US regulators have been considering making AI-generated robocalls illegal, with the fake Biden call giving the effort new impetus.

"The political deepfake moment is here," said Robert Weissman, president of the advocacy group Public Citizen.

"Policymakers must rush to put in place protections or we're facing electoral chaos. The New Hampshire deepfake is a reminder of the many ways that deepfakes can sow confusion."

Researchers worry about the impact of AI tools that create video and text so convincing that voters could struggle to distinguish truth from fiction, undermining trust in the electoral process.

But audio deepfakes used to impersonate or smear celebrities and politicians around the world have sparked the most concern.

"Of all the surfaces -- video, image, audio -- that AI can be used for voter suppression, audio is the biggest vulnerability," Tim Harper, a senior policy analyst at the Center for Democracy & Technology, told AFP.

"It is easy to clone a voice using AI, and it is difficult to identify."

- 'Election integrity' -

The ease of creating and disseminating fake audio content complicates an already hyperpolarized political landscape, undermining confidence in the media and enabling anyone to claim that fact-based "evidence has been fabricated," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

Such concerns are rife as the proliferation of AI audio tools outpaces detection software.

China's ByteDance, owner of the wildly popular platform TikTok, recently unveiled StreamVoice, an AI tool for real-time conversion of a user's voice to any desired alternative.

"Even though the attackers used ElevenLabs this time, it is likely to be a different generative AI system in future attacks," Balasubramaniyan said.

"It is imperative that there are enough safeguards available in these tools."

Balasubramaniyan and other researchers recommended building audio watermarks or digital signatures into these tools as possible protections, as well as regulation that makes them available only to verified users.

"Even with those actions, detecting when these tools are used to generate harmful content that violates your terms of service is really hard and really expensive," Harper said.

"(It) requires investment in trust and safety and a commitment to building with election integrity centred as a risk."

C.Masood--DT