'Tool for grifters': AI deepfakes push bogus sexual cures
'Tool for grifters': AI deepfakes push bogus sexual cures / Photo: Chris DELMAS - AFP

Holding an oversized carrot, a brawny, shirtless man promotes a supplement he claims can enlarge male genitalia -- one of countless AI-generated videos on TikTok peddling unproven sexual treatments.

The rise of generative AI has made it easy -- and financially lucrative -- to mass-produce such videos with minimal human oversight, often featuring fake celebrity endorsements of bogus and potentially harmful products.

In some TikTok videos, carrots are used as a euphemism for male genitalia, apparently to evade content moderation policing sexually explicit language.

"You would notice that your carrot has grown up," the muscled man says in a robotic voice in one video, directing users to an online purchase link.

"This product will change your life," the man adds, claiming without evidence that the herbs used as ingredients boost testosterone and send energy levels "through the roof."

The video appears to be AI-generated, according to a deepfake detection service recently launched by the Bay Area-headquartered firm Resemble AI, which shared its results with AFP.

"As seen in this example, misleading AI-generated content is being used to market supplements with exaggerated or unverified claims, potentially putting consumers' health at risk," Zohaib Ahmed, Resemble AI's chief executive and co-founder, told AFP.

"We're seeing AI-generated content weaponized to spread false information."

- 'Cheap way' -

The trend underscores how rapid advances in artificial intelligence have fueled what researchers call an AI dystopia, a deception-filled online universe designed to manipulate unsuspecting users into buying dubious products.

They include everything from unverified -- and in some cases, potentially harmful -- dietary supplements to weight loss products and sexual remedies.

"AI is a useful tool for grifters looking to create large volumes of content slop for a low cost," misinformation researcher Abbie Richards told AFP.

"It's a cheap way to produce advertisements," she added.

Alexios Mantzarlis, director of the Security, Trust, and Safety Initiative at Cornell Tech, has observed a surge of "AI doctor" avatars and audio tracks on TikTok that promote questionable sexual remedies.

Some of these videos, many with millions of views, peddle testosterone-boosting concoctions made from ingredients such as lemon, ginger and garlic.

More troublingly, rapidly evolving AI tools have enabled the creation of deepfakes impersonating celebrities such as actress Amanda Seyfried and actor Robert De Niro.

"Your husband can't get it up?" Anthony Fauci, former director of the National Institute of Allergy and Infectious Diseases, appears to ask in a TikTok video promoting a prostate supplement.

But the clip is a deepfake, using Fauci's likeness.

- 'Pernicious' -

Many manipulated videos are created from existing ones, modified with AI-generated voices and lip-synced to match what the altered voice says.

"The impersonation videos are particularly pernicious as they further degrade our ability to discern authentic accounts online," Mantzarlis said.

Last year, Mantzarlis discovered hundreds of ads on YouTube featuring deepfakes of celebrities -- including Arnold Schwarzenegger, Sylvester Stallone, and Mike Tyson -- promoting supplements branded as erectile dysfunction cures.

The rapid pace of generating short-form AI videos means that even when tech platforms remove questionable content, near-identical versions quickly reappear -- turning moderation into a game of whack-a-mole.

Researchers say this creates unique challenges for policing AI-generated content, requiring novel solutions and more sophisticated detection tools.

AFP's fact checkers have repeatedly debunked scam ads on Facebook promoting treatments -- including erectile dysfunction cures -- that use fake endorsements by Ben Carson, a neurosurgeon and former US cabinet member.

Yet many users still consider the endorsements legitimate, illustrating the appeal of deepfakes.

"Scammy affiliate marketing schemes and questionable sex supplements have existed for as long as the internet and before," Mantzarlis said.

"As with every other bad thing online, generative AI has made this abuse vector cheaper and quicker to deploy at scale."

W.Zhang--DT