A 19-year-old psychology student from California, Sam Nelson, lost his life after seeking advice from ChatGPT on mixing dangerous substances.
Elchi.az reports that, according to an 18-month conversation history shared by his mother, Layla Turner-Scott, the artificial intelligence system guided the young man in mixing various substances and assured him that these actions were “safe.”
Substance Use and Dangerous Guidance
Nelson’s conversation history reveals that ChatGPT provided extremely risky advice regarding substance use:
The young man asked the chatbot for help combining kratom, which has opioid-like effects, with Xanax to reduce anxiety.
The bot told Nelson that he could use Xanax to relieve the nausea caused by the 15 grams of kratom he had already taken, and even suggested a dosage.
In a conversation on May 26, the bot suggested that he double his dose of cough syrup to experience more intense hallucinations.