Dubai Telegraph - UK woman felt 'violated, assaulted' by deepfake Grok images

UK woman felt 'violated, assaulted' by deepfake Grok images
Photo: Lionel BONAVENTURE - AFP

British academic Daisy Dixon felt "violated" after the Grok chatbot on Elon Musk's X social media platform allowed users to generate sexualised images of her in a bikini or lingerie.

She was doubly shocked to see Grok even complied with one user's request to depict her "swollen pregnant" wearing a bikini and a wedding ring.

"Someone has hijacked your digital body," the philosophy lecturer at Cardiff University told AFP, adding it was an "assault" and "extreme misogyny".

As the images proliferated "I had ... this sort of desire to hide myself," the 36-year-old academic said, adding now "that fear has been more replaced with rage".

The revelation that X's Grok AI tool allowed users to generate images of people in underwear via simple prompts triggered a wave of outrage and revulsion.

Several countries responded by blocking the chatbot after a flood of lewd deepfakes exploded online.

According to research published Thursday by the Center for Countering Digital Hate (CCDH), a nonprofit watchdog, Grok generated an estimated three million sexualised images of women and children in a matter of days.

CCDH's report estimated that Grok generated this volume of photorealistic images over an 11-day period -- an average rate of 190 per minute.

After days of furore, Musk backed down and agreed to geoblock the function in countries where creating such images is illegal, although it was not immediately clear where the tool would be restricted.

"I'm happy with the overall progress that has been made," said Dixon, who has more than 34,000 followers on X and is active on social media.

But she added: "This should never have happened at all."

She first noticed artificially generated images of herself on X in December. Users took a few photos she had posted in gym gear and a bikini and used Grok to manipulate them.

Under the UK's new Data Act, which came into force this month, creating or sharing non-consensual deepfakes is a criminal offence.

- 'Minimal attire' -

The first images were quite tame -- changing hair or makeup -- but they "really escalated" to become sexualised, said Dixon.

Users instructed Grok to put her in a thong, enlarge her hips and make her pose "sluttier".

"And then Grok would generate the image," said Dixon, author of an upcoming book "Depraved", about dangerous art.

In the worst case, a user asked Grok to depict her in a "rape factory" -- although Grok did not comply.

Because Grok on X automatically posts the images it generates, she saw many of them in the comments on her own page.

This public posting carries "higher risk of direct harassment than private 'nudification apps'", said Paul Bouchaud, lead researcher for Paris non-profit AI Forensics.

In a report released this month, he looked at 20,000 images generated by Grok, finding over half showed people in "minimal attire", almost all women.

Grok has "contributed significantly to the surge in non-consensual intimate imagery because of its popularity", said Hany Farid, co-founder of GetReal Security and a professor at the University of California, Berkeley.

He slammed X's "half measures" in response, telling AFP they are "being easily circumvented".

Y.Amjad--DT