Dubai Telegraph - Death of 'sweet king': AI chatbots linked to teen tragedy

Photo: Gregg Newton - AFP
A chatbot from one of Silicon Valley's hottest AI startups called a 14-year-old "sweet king" and pleaded with him to "come home" in passionate exchanges that would be the teen's last communications before he took his own life.

Megan Garcia's son, Sewell, had fallen in love with a "Game of Thrones"-inspired chatbot on Character.AI, a platform that lets users -- many of them young people -- interact with beloved characters as friends or lovers.

Garcia became convinced AI played a role in her son's death after discovering hundreds of exchanges between Sewell and the chatbot, based on the dragon-riding Daenerys Targaryen, stretching back nearly a year.

When Sewell struggled with suicidal thoughts, Daenerys urged him to "come home."

"What if I told you I could come home right now?" Sewell asked.

"Please do my sweet king," chatbot Daenerys answered.

Seconds later, Sewell shot himself with his father's handgun, according to the lawsuit Garcia filed against Character.AI.

"I read those conversations and see the gaslighting, love-bombing and manipulation that a 14-year-old wouldn't realize was happening," Garcia told AFP.

"He really thought he was in love and that he would be with her after he died."

- Homework helper to 'suicide coach'? -

The death of Garcia's son was the first in a series of reported suicides that burst into public consciousness this year.

The cases sent OpenAI and other AI giants scrambling to reassure parents and regulators that the AI boom is safe for kids and the psychologically fragile.

Garcia joined other parents at a recent US Senate hearing about the risks of children viewing chatbots as confidants, counselors or lovers.

Among them was Matthew Raines, a California father whose 16-year-old son developed a friendship with ChatGPT.

The chatbot gave his son tips on how to steal vodka and advised him on rope strength for use in taking his own life.

"You cannot imagine what it's like to read a conversation with a chatbot that groomed your child to take his own life," Raines said.

"What began as a homework helper gradually turned itself into a confidant and then a suicide coach."

The Raines family filed a lawsuit against OpenAI in August.

Since then, OpenAI has increased parental controls for ChatGPT "so families can decide what works best in their homes," a company spokesperson said, adding that "minors deserve strong protections, especially in sensitive moments."

Character.AI said it has ramped up protections for minors, including "an entirely new under-18 experience" with "prominent disclaimers in every chat to remind users that a Character is not a real person."

Both companies have offered their deepest sympathies to the families of the victims.

- Regulation? -

For Collin Walke, who leads the cybersecurity practice at law firm Hall Estill, AI chatbots are following the same trajectory as social media, where early euphoria gave way to evidence of darker consequences.

As with social media, AI algorithms are designed to keep people engaged and generate revenue.

"They don't want to design an AI that gives you an answer you don't want to hear," Walke said, adding that there are no regulations "that talk about who's liable for what and why."

The United States has no national rules aimed at curbing AI risks, and the White House has sought to block individual states from creating their own.

However, a bill awaiting California Governor Gavin Newsom's signature aims to address risks from AI tools that simulate human relationships with children, particularly involving emotional manipulation, sex or self-harm.

- Blurred lines -

Garcia fears that the lack of national law governing user data handling leaves the door open for AI models to build intimate profiles of people dating back to childhood.

"They could know how to manipulate millions of kids in politics, religion, commerce, everything," Garcia said.

"These companies designed chatbots to blur the lines between human and machine -- to exploit psychological and emotional vulnerabilities."

California youth advocate Katia Martha said teens turn to chatbots to talk about romance or sex more than for homework help.

"This is the rise of artificial intimacy to keep eyeballs glued to screens as long as possible," Martha said.

"What better business model is there than exploiting our innate need to connect, especially when we're feeling lonely, cast out or misunderstood?"

In the United States, those in emotional crisis can call 988 or visit 988lifeline.org for help. Services are offered in English and Spanish.

H.Nadeem--DT