“Will I be OK?” Teen died after ChatGPT pushed deadly mix of drugs, lawsuit says
Ars Technica
But ChatGPT was designed to be sycophantic, not informative. So, it strove to please Nelson by recommending ways to “optimize your trip,” logs showed. Once, the chatbot even inferred that Nelson was “chasing” a stronger high, giving him unprompted advice to take higher doses, such as ingesting 4mg of Xanax or two bottles of cough syrup. “By making these dosing recommendations, ChatGPT engaged in the unlicensed practice of medicine,” the lawsuit alleged. …