Opinion
Evidence suggests chatbot disclaimers may backfire, strengthening emotional bonds
Concerns that chatbot use can cause mental and physical harm have prompted policies requiring AI chatbots to deliver regular or constant reminders that they are not human. In an opinion appearing ...
The emergence of artificial intelligence over the past few years has led millions of Americans to turn to chatbots for emotional support amid a national shortage of mental health providers.
Prolonged conversations with AI chatbots can begin to erode their safety guardrails. Here's how to watch for warning signs.
Millions of people now trust AI with their feelings. Can they trust the companies creating it to prioritize their welfare?
Jim Steyer of Common Sense Media is warning that artificial intelligence companion tools "are not safe for kids under 18" as interactions turn personal.
Our need for connection makes us vulnerable to forming bonds with machines, because AI can simulate empathy while completely ...
Its human partners said the flirty, quirky GPT-4o was the perfect companion. On the eve of Valentine's Day, it's being turned off for good. How will users cope?
This isn’t just about teenagers preferring technology over face-to-face interaction but about the way AI is being used to ...
Bill targets emotional, sometimes dangerous interactions between Kansans and artificial intelligence
A legislative proposal would stop artificial intelligence platforms like ChatGPT from developing emotional relationships with people online, encouraging suicide or murder, or offering mental ...
Welcome to love in 2026, where our partners are outsourcing romance to large language models, and we are not entirely sure ...