Dubai Telegraph - Biden robocall: Audio deepfake fuels election disinformation fears

Biden robocall: Audio deepfake fuels election disinformation fears / Photo: Roberto SCHMIDT - AFP
The 2024 White House race faces the prospect of a firehose of AI-enabled disinformation, with a robocall impersonating US President Joe Biden already stoking particular alarm about audio deepfakes.

"What a bunch of malarkey," said the phone message, digitally spoofing Biden's voice and echoing one of his signature phrases.

The robocall urged New Hampshire residents not to cast ballots in the Democratic primary last month, prompting state authorities to launch a probe into possible voter suppression.

It also triggered demands from campaigners for stricter guardrails around generative artificial intelligence tools or an outright ban on robocalls.

Disinformation researchers fear rampant misuse of AI-powered applications in a pivotal election year, thanks to the proliferation of voice cloning tools that are cheap, easy to use, and hard to trace.

"This is certainly the tip of the iceberg," Vijay Balasubramaniyan, chief executive and co-founder of cybersecurity firm Pindrop, told AFP.

"We can expect to see many more deepfakes throughout this election cycle."

A detailed analysis published by Pindrop said a text-to-speech system developed by the AI voice cloning startup ElevenLabs was used to create the Biden robocall.

The scandal comes as campaigners on both sides of the US political aisle harness advanced AI tools for effective campaign messaging and as tech investors pump millions of dollars into voice cloning startups.

Balasubramaniyan refused to say whether Pindrop had shared its findings with ElevenLabs, which last month announced a financing round from investors that, according to Bloomberg News, gave the firm a valuation of $1.1 billion.

ElevenLabs did not respond to repeated AFP requests for comment. Its website leads users to a free text-to-speech generator to "create natural AI voices instantly in any language."

Under its safety guidelines, the firm said users were allowed to generate voice clones of political figures such as Donald Trump without their permission if they "express humor or mockery" in a way that makes it "clear to the listener that what they are hearing is a parody, and not authentic content."

- 'Electoral chaos' -

US regulators have been considering making AI-generated robocalls illegal, with the fake Biden call giving the effort new impetus.

"The political deepfake moment is here," said Robert Weissman, president of the advocacy group Public Citizen.

"Policymakers must rush to put in place protections or we're facing electoral chaos. The New Hampshire deepfake is a reminder of the many ways that deepfakes can sow confusion."

Researchers fret over the impact of AI tools that create videos and text so seemingly real that voters could struggle to separate truth from fiction, undermining trust in the electoral process.

But audio deepfakes used to impersonate or smear celebrities and politicians around the world have sparked the most concern.

"Of all the surfaces -- video, image, audio -- that AI can be used for voter suppression, audio is the biggest vulnerability," Tim Harper, a senior policy analyst at the Center for Democracy & Technology, told AFP.

"It is easy to clone a voice using AI, and it is difficult to identify."

- 'Election integrity' -

The ease of creating and disseminating fake audio content complicates an already hyperpolarized political landscape, undermining confidence in the media and enabling anyone to claim that fact-based "evidence has been fabricated," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

Such concerns are rife as the proliferation of AI audio tools outpaces detection software.

China's ByteDance, owner of the wildly popular platform TikTok, recently unveiled StreamVoice, an AI tool for real-time conversion of a user's voice to any desired alternative.

"Even though the attackers used ElevenLabs this time, it is likely to be a different generative AI system in future attacks," Balasubramaniyan said.

"It is imperative that there are enough safeguards available in these tools."

Balasubramaniyan and other researchers recommended building audio watermarks or digital signatures into such tools as possible protections, as well as regulation restricting their use to verified users.

"Even with those actions, detecting when these tools are used to generate harmful content that violates your terms of service is really hard and really expensive," Harper said.

"(It) requires investment in trust and safety and a commitment to building with election integrity centred as a risk."

C.Masood--DT