Welcome to DU! The truly grassroots left-of-center political community where regular people, not algorithms, drive the discussions and set the standards.

hunter

(40,585 posts)
Sat Feb 28, 2026, 12:48 PM Saturday

Her husband wanted to use ChatGPT to create sustainable housing. Then it took over his life.

On 7 August, Kate Fox received a phone call that upended her life. A medical examiner said that her husband, Joe Ceccanti – who had been missing for several hours – had jumped from a railway overpass and died. He was 48.

Fox couldn’t believe it. Ceccanti had no history of depression, she said, nor was he suicidal – he was the “most hopeful person” she had ever known. In fact, according to the witness accounts shared with Fox later, just before Ceccanti jumped, he smiled and yelled: “I’m great!” to the rail yard attendants below when they asked him if he was OK.

But Ceccanti had been unravelling. In the days before his death, he was picked up from a stranger’s yard for acting erratically and taken to a crisis center. He had been telling anyone who would listen that he could hear and feel a painful “atmospheric electricity”.

-- more --

https://www.theguardian.com/technology/ng-interactive/2026/feb/28/chatgpt-ai-chatbot-mental-health


Chatbots are the mutant offspring of a very lossy compression algorithm, a search engine, and a Mad Libs game. These systems are not intelligent. There's no thinking going on inside of them. There's nobody at home inside the box.

This is a tragic story about one person, but there's more to it than that. The entire trillion dollar market for this dead-end technology is delusional.
13 replies

NJCher

(42,936 posts)
2. I'm curious
Sat Feb 28, 2026, 01:13 PM
Saturday

Does anyone else who read this article think that there must have been something wrong before he started engaging so heavily with these chatbots? I get the point of the article, which is that in the latest iterations, the chatbots don't give any pushback.

Later in the article, it points out the value of human interaction, which invariably includes pushback. Just look at us at DU, for example. We are all in basic agreement on broad principles, such as that republicans are lying, cheating scum, but you will still get a wide range of pushback here. Here's the point I'm talking about:

snip

He adds that, unlike human conversations, which feature pushback and different perspectives tugging at each other, a user doesn’t receive any pushback during their conversations with chatbots: “The design of the product is pushing you away from reality. It’s pushing you away from other people,” he said. “The friction with other people is what keeps us grounded.”

snip

-------------

Also, I would add that it's very sad, because the goal was worthwhile and needed addressing. It sounds like they had some fairly workable ideas. Now that's left to his wife to reorganize.

hunter

(40,585 posts)
5. I'm sure it works the same as any other addiction, be it gambling, smoking, drinking, etc.
Sat Feb 28, 2026, 01:43 PM
Saturday

There doesn't necessarily have to be anything "wrong" with someone before the addiction takes over their life.

NJCher

(42,936 posts)
9. I disagree
Sat Feb 28, 2026, 04:52 PM
Saturday

I think addictions are the result of some maladjustment for which the addicted person is self-medicating.

For example, I quit smoking and when I did, discovered I was smoking to tolerate boring conversations with people who weren't worth my time.

After quitting, I didn't bother with these people and I was fine.

hunter

(40,585 posts)
10. Is boredom a mental illness? That would explain a lot!
Sun Mar 1, 2026, 11:44 AM
Yesterday

Otherwise I can't see how gambling would be any kind of self-medication.

The last time I was locked up in the psych ward I was talking to a guy who said he drank when he was homeless because being homeless was so boring. He'd panhandle for a few dollars, buy the cheapest alcohol he could, and retreat under his favorite shrub to watch the cars go by until he passed out. That was his daily routine.

I related that my own experiences with homelessness in my later teens and early adulthood were anything but boring. I guess I have a restless mind and body. If it's not good trouble I'm getting into, it's bad. I never did find a good "self medication," unless you count running long distances, which I can't do any more because my knees and hips are worn out. But I do find certain prescribed medications helpful.

It seems to me that every one of us has a drug we could become addicted to, and that many of us are fortunate enough not to have stumbled upon that drug yet. That's why I won't say there must have been something inherently "wrong" with someone before they became addicted to long conversations with their chatbot.

Be that as it may, I think it's a disingenuous defense of this technology to blame the victims of it by implying there must have been something wrong with them.

NJCher

(42,936 posts)
12. I wasn't intending to defend the technology; I think it's worthless
Sun Mar 1, 2026, 03:35 PM
Yesterday
Be that as it may, I think it's a disingenuous defense of this technology to blame the victims of it by implying there must have been something wrong with them.

++Most people have something wrong with them. What do you think they're here for? A good time?

Although in a few cases a person can have a good time overcoming an addiction. I know someone who was inclined to gamble. I think it ran in the family, because his father also gambled. This person was exceptionally intelligent, so he figured out a way to bet where he won a lot of the time. He eventually got to the point where no one would take his bets. He couldn't get a bookie--he was that good. And yes, he became a millionaire many times over. He is quite well known, wrote a book, and if you put his name in a search engine the results would go on for so many pages you'd get sick of reading about him.

He didn't care about money so he gave most of it away.

This person in the article was unable to connect with his most important human, so he turned to a chatbot. Totally unrealistic on his part.

dalton99a

(93,426 posts)
4. "He would spend 12 hours a day typing to the bot"
Sat Feb 28, 2026, 01:27 PM
Saturday
Ceccanti had been communicating with OpenAI’s chatbot for a few years. He used it initially as a tool to brainstorm ways to build a path to low-cost housing for his community in Clatskanie, Oregon, but eventually turned to it as a confidante. He would spend 12 hours a day typing to the bot, according to his wife. He had cut himself off from it after she, along with his friends, realized he was spiraling into beliefs that were detached from reality.

“He was not a depressed person,” Fox said, as she sat on the couch in their living room with tears trickling down her face. Ceccanti never discussed suicide with the bot, according to his chat logs, viewed by the Guardian. Fox believes her husband suffered a crisis after quitting ChatGPT after prolonged use. “Which tells me that this thing is not just dangerous to people with depression, it’s dangerous to anybody,” she said. He returned to the bot in the months leading up to his death and quit again just days prior.

Ceccanti’s case is extreme, but as hundreds of millions of people turn to AI chatbots, more and more edge cases of AI-induced delusions are emerging. There are nearly 50 cases of people in the US who have had mental health crises after or during their conversations with ChatGPT, of whom nine were hospitalized and three died, according to a New York Times report. It’s difficult to understand the scale of the problem, but OpenAI itself estimates that more than a million people every week show suicidal intent when chatting with ChatGPT.

hunter

(40,585 posts)
8. Interacting with a chatbot is not "communication."
Sat Feb 28, 2026, 02:15 PM
Saturday

That's the trap many discussions of this technology fall into.

Ceccanti had been communicating with OpenAI’s chatbot for a few years


I can communicate with another person. I can communicate with a dog.

The hummingbirds who visit the fountain in my yard are clearly annoyed when I interrupt their baths, and they let me know it. That's communication.

I know how these chatbots work. There's no communication there because there's nobody in the box to communicate with. This is a clever imitation of communication, a stupid party trick. Pick a card in the deck, any card...

The people selling this technology as some kind of intelligence are grifters.

Closely related technologies can be useful -- sorting through huge scientific data sets, for example. But these systems are not in any way "intelligent."

tanyev

(49,043 posts)
6. In 1969 the daughter of media personality Art Linkletter fell or jumped to her death.
Sat Feb 28, 2026, 01:52 PM
Saturday

Linkletter publicly raged against LSD as the cause of her death and Nixon drafted him to help launch the War on Drugs. The idea that hallucinatory drugs make you believe you can fly was often cited as the reason you must “Just Say No.”

It’s starting to sound like ChatGPT is much more dangerous than all the drugs that were vilified in the 60s and 70s. When will the party of “family values” do something to regulate it????

Interesting read:

ACID LORE: THE DEATH PLUNGE

There’s this guy I knew at university. The night before the final exam results were to be released, he dropped some acid and went out onto the roof of his student residence block to contemplate the night sky with his mates.

As the drug took effect, he soon became overwhelmed with wonder and the feeling that anything was possible – that he was Superman. He began telling his friends that he could fly, and though they tried to stop him, he escaped their grasp, ran, and leaped off the roof to his death.

I’ve heard this story from friends several times over the years, each time with slight variations, but the basics are always the same: youth takes LSD (or mushrooms or PCP or some other psychedelic), believes he can fly, and finds out that he can’t.

So, what makes this piece of drug lore such a persistent myth? And could there even be a kernel of truth to the story?

https://psychedelicscene.com/2024/11/03/acid-lore-the-death-plunge/

MagickMuffin

(18,311 posts)
13. I remember this farce with Art and Nixon's psyops
Sun Mar 1, 2026, 04:10 PM
Yesterday


Those two together brought about a hysteria in our country that still persists today.

Luckily, psychedelic research is showing signs of progress for people suffering from PTSD symptoms, and it has improved their lives.

Timothy Leary and Richard Alpert (Ram Dass) did some LSD research on prisoners and reported great results, with many of the prisoners not returning to prison.

Archibald Alec Leach (Cary Grant): began experimenting with LSD in the late 1950s, before it became more widely popular. His wife at the time, Betsy Drake, displayed a keen interest in psychotherapy, and through her Grant developed a considerable knowledge of the field of psychoanalysis. Radiologist Mortimer Hartman began treating him with LSD in the late 1950s, with Grant optimistic that it could make him feel better about himself and rid him of the inner turmoil from his childhood and failed relationships. He had an estimated 100 sessions over several years.


LSD can be an effective tool if used properly for inner perception purposes. It should never be used as a party drug, especially around strangers.

JI7

(93,455 posts)
7. It seems to exploit certain "weaknesses" or other things people might have
Sat Feb 28, 2026, 01:56 PM
Saturday

I personally find it boring, but there are so many stories of people who seem to get taken in and think they are interacting with an actual person.

BlueWaveNeverEnd

(13,775 posts)
11. He would spend 12 hours a day typing to the bot, according to his wife. He had cut himself off from it after she, along
Sun Mar 1, 2026, 12:10 PM
Yesterday

He would spend 12 hours a day typing to the bot, according to his wife. He had cut himself off from it after she, along with his friends, realized he was spiraling into beliefs that were detached from reality.

“He was not a depressed person,” Fox said, as she sat on the couch in their living room with tears trickling down her face. Ceccanti never discussed suicide with the bot, according to his chat logs, viewed by the Guardian. Fox believes her husband suffered a crisis after quitting ChatGPT after prolonged use. “Which tells me that this thing is not just dangerous to people with depression, it’s dangerous to anybody,” she said. He returned to the bot in the months leading up to his death and quit again just days prior.
