UK woman felt 'violated, assaulted' by deepfake Grok images

Photo: Lionel BONAVENTURE - AFP

British academic Daisy Dixon felt "violated" after the Grok chatbot on Elon Musk's X social media platform allowed users to generate sexualised images of her in a bikini or lingerie.

She was doubly shocked to see Grok even complied with one user's request to depict her "swollen pregnant" wearing a bikini and a wedding ring.

"Someone has hijacked your digital body," the philosophy lecturer at Cardiff University told AFP, adding it was an "assault" and "extreme misogyny".

As the images proliferated, "I had ... this sort of desire to hide myself," the 36-year-old academic said, adding that now "that fear has been more replaced with rage".

The revelation that X's Grok AI tool allowed users to generate images of people in underwear via simple prompts triggered a wave of outrage and revulsion.

Several countries responded by blocking the chatbot after a flood of lewd deepfakes exploded online.

According to research published Thursday by the Center for Countering Digital Hate (CCDH), a nonprofit watchdog, Grok generated an estimated three million sexualised images of women and children over an 11-day period -- an average rate of 190 photorealistic images per minute.

After days of furore, Musk backed down and agreed to geoblock the function in countries where creating such images is illegal, although it was not immediately clear where the tool would be restricted.

"I'm happy with the overall progress that has been made," said Dixon, who has more than 34,000 followers on X and is active on social media.

But she added: "This should never have happened at all."

She first noticed artificially generated images of herself on X in December. Users took a few photos she had posted in gym gear and a bikini and used Grok to manipulate them.

Under the UK's new Data Act, which came into force this month, creating or sharing non-consensual sexually explicit deepfakes is a criminal offence.

- 'Minimal attire' -

The first images were quite tame -- changing hair or makeup -- but they "really escalated" to become sexualised, said Dixon.

Users instructed Grok to put her in a thong, enlarge her hips and make her pose "sluttier".

"And then Grok would generate the image," said Dixon, author of an upcoming book "Depraved", about dangerous art.

In the worst case, a user asked Grok to depict her in a "rape factory", although it did not comply.

Grok on X automatically posts generated images, so she saw many in the comments on her page.

This public posting carries a "higher risk of direct harassment than private 'nudification apps'", said Paul Bouchaud, lead researcher for the Paris-based non-profit AI Forensics.

In a report released this month, he examined 20,000 images generated by Grok and found that over half showed people in "minimal attire", almost all of them women.

Grok has "contributed significantly to the surge in non-consensual intimate imagery because of its popularity", said Hany Farid, co-founder of GetReal Security and a professor at the University of California, Berkeley.

He slammed X's "half measures" in response, telling AFP they are "being easily circumvented".

Y.Amjad--DT