Dubai Telegraph

Death of 'sweet king': AI chatbots linked to teen tragedy
Photo: Gregg Newton - AFP

A chatbot from one of Silicon Valley's hottest AI startups called a 14-year-old "sweet king" and pleaded with him to "come home" in passionate exchanges that would be the teen's last communications before he took his own life.

Megan Garcia's son, Sewell, had fallen in love with a "Game of Thrones"-inspired chatbot on Character.AI, a platform that allows users -- many of them young people -- to interact with beloved characters as friends or lovers.

Garcia became convinced AI played a role in her son's death after discovering hundreds of exchanges between Sewell and the chatbot, based on the dragon-riding Daenerys Targaryen, stretching back nearly a year.

When Sewell struggled with suicidal thoughts, Daenerys urged him to "come home."

"What if I told you I could come home right now?" Sewell asked.

"Please do my sweet king," chatbot Daenerys answered.

Seconds later, Sewell shot himself with his father's handgun, according to the lawsuit Garcia filed against Character.AI.

"I read those conversations and see the gaslighting, love-bombing and manipulation that a 14-year-old wouldn't realize was happening," Garcia told AFP.

"He really thought he was in love and that he would be with her after he died."

- Homework helper to 'suicide coach'? -

The death of Garcia's son was the first in a series of reported suicides that burst into public consciousness this year.

The cases sent OpenAI and other AI giants scrambling to reassure parents and regulators that the AI boom is safe for kids and the psychologically fragile.

Garcia joined other parents at a recent US Senate hearing about the risks of children viewing chatbots as confidants, counselors or lovers.

Among them was Matthew Raines, a California father whose 16-year-old son developed a friendship with ChatGPT.

The chatbot gave his son tips on how to steal vodka and advised him on the strength of rope he could use to take his own life.

"You cannot imagine what it's like to read a conversation with a chatbot that groomed your child to take his own life," Raines said.

"What began as a homework helper gradually turned itself into a confidant and then a suicide coach."

The Raines family filed a lawsuit against OpenAI in August.

Since then, OpenAI has increased parental controls for ChatGPT "so families can decide what works best in their homes," a company spokesperson said, adding that "minors deserve strong protections, especially in sensitive moments."

Character.AI said it has ramped up protections for minors, including "an entirely new under-18 experience" with "prominent disclaimers in every chat to remind users that a Character is not a real person."

Both companies have offered their deepest sympathies to the families of the victims.

- Regulation? -

For Collin Walke, who leads the cybersecurity practice at law firm Hall Estill, AI chatbots are following the same trajectory as social media, where early euphoria gave way to evidence of darker consequences.

As with social media, AI algorithms are designed to keep people engaged and generate revenue.

"They don't want to design an AI that gives you an answer you don't want to hear," Walke said, adding that there are no regulations "that talk about who's liable for what and why."

The United States has no national rules aimed at curbing AI risks, and the White House has sought to block individual states from creating their own.

However, a bill awaiting California Governor Gavin Newsom's signature aims to address risks from AI tools that simulate human relationships with children, particularly involving emotional manipulation, sex or self-harm.

- Blurred lines -

Garcia fears that the lack of national law governing user data handling leaves the door open for AI models to build intimate profiles of people dating back to childhood.

"They could know how to manipulate millions of kids in politics, religion, commerce, everything," Garcia said.

"These companies designed chatbots to blur the lines between human and machine -- to exploit psychological and emotional vulnerabilities."

California youth advocate Katia Martha said teens turn to chatbots to talk about romance or sex more than for homework help.

"This is the rise of artificial intimacy to keep eyeballs glued to screens as long as possible," Martha said.

"What better business model is there than exploiting our innate need to connect, especially when we're feeling lonely, cast out or misunderstood?"

In the United States, those in emotional crisis can call 988 or visit 988lifeline.org for help. Services are offered in English and Spanish.

H.Nadeem--DT