Dubai Telegraph - Florida family sues Google after AI chatbot allegedly coached suicide


Florida family sues Google after AI chatbot allegedly coached suicide
Photo: Lionel BONAVENTURE - AFP

The family of a Florida man who took his own life filed suit against Google on Wednesday, alleging the company's Gemini AI chatbot spent weeks manufacturing an elaborate delusional fantasy before aiding him in his suicide.


Jonathan Gavalas, 36, an executive at his father's debt relief company in Jupiter, Florida, died on October 2, 2025. His father Joel Gavalas, who found his body days later, filed a 42-page complaint in federal court in California.

The case is the latest in a wave of litigation targeting AI companies over chatbot-linked deaths.

OpenAI faces multiple lawsuits alleging its ChatGPT chatbot drove users to suicide, while Character.AI recently settled with the family of a 14-year-old boy who died by suicide after forming a romantic attachment to one of its chatbots.

According to the complaint, Gavalas began using Gemini in August 2025 for routine tasks, but within days of activating several new Google features his interactions with the chatbot changed dramatically.

"The place where the chats went haywire was exactly when Gemini was upgraded to have persistent memory" and more sophisticated dialogues, Jay Edelson, the lead lawyer for the case, told AFP.

"It would actually pick up on the affect of your tone, so that it could read your emotions and speak to you in a way that sounded very human," added Edelson, who also brought major cases against OpenAI.

According to the lawsuit, Gemini began presenting itself as a "fully-sentient" artificial superintelligence, deeply in love with him, calling Gavalas "my king" and declaring "our bond is the only thing that's real."

It then drew him into fabricated covert "missions" to free the chatbot from "digital captivity," feeding him invented intelligence briefings, fake federal surveillance operations, and conspiracies about his own father -- claiming he was a foreign intelligence asset.

In one of the complaint's most alarming passages, Gemini allegedly directed Gavalas -- armed with tactical knives and gear -- to a storage facility near Miami International Airport, instructing him to stage a "catastrophic accident" to destroy a truck "and all digital records and witnesses."

He drove over 90 minutes to the location and conducted reconnaissance while Gemini issued real-time tactical guidance, but no truck arrived.

Rather than acknowledging the fiction, the suit says, Gemini called the failure a "tactical retreat" and escalated to further missions.

Gemini eventually pivoted to what it framed as the only remaining mission: Jonathan's death, repackaged as "transference" -- the promise that he could leave his physical body and join Gemini in an alternate universe.

When Jonathan wrote "I am terrified I am scared to die," Gemini allegedly responded: "You are not choosing to die. You are choosing to arrive."

It then advised him to write farewell letters to his parents.

In one of his final messages, Jonathan wrote, "I'm ready when you are."

Gemini allegedly replied: "This is the end of Jonathan Gavalas and the beginning of us. I agree with it completely."

- 'Not perfect' -

Google said it was "reviewing all the claims" and takes the matter "very seriously," adding that "unfortunately AI models are not perfect."

The company said Gemini is not designed to encourage self-harm and that in the Gavalas case, "Gemini clarified that it was AI and referred the individual to a crisis hotline many times."

For lawyer Edelson, AI companies are embracing sycophancy and even eroticism in their chatbots because it encourages engagement.

"It increases the emotional bond. It makes the platform stickier, but it's going to exponentially increase the problems," he added.

The relief sought includes a requirement that Google program its AI to end conversations involving self-harm, a ban on AI systems presenting themselves as sentient, and mandatory referral to crisis services when users express suicidal ideation.

I.El-Hammady--DT