OpenAI, the company behind ChatGPT, reportedly said it will make changes to the AI chatbot after a lawsuit was filed against it by the parents of a 16-year-old boy alleging "wrongful death".
In a statement to CBS News, the company said changes will be made to safeguard vulnerable users, including protections for those under 18 years old.
The lawsuit, filed by the parents of 16-year-old Adam Raine, alleges that the chatbot led their son to take his own life. The couple said Adam used ChatGPT as a confidant for his anxieties, and that when he talked about wanting to kill himself, it did not stop the conversation.
The company also said that additional protections will be added for teens, CBS reported. These will include parental controls giving parents more options to influence how their teen interacts with ChatGPT. "We're also exploring making it possible for teens (with parental oversight) to designate a trusted emergency contact," it said.
In the statement, OpenAI extended sympathy to the Raine family and said that ChatGPT has safeguards such as "directing people to crisis helplines and referring them to real-world resources." These safeguards work best in short exchanges, the company added.
It also said that in long interactions the safeguards can gradually become less reliable, as parts of the model's safety training may degrade. "Safeguards are strongest when every element works as intended, and we will continually improve on them, guided by experts," it said.
The lawsuit alleges that despite knowing of Adam's suicidal intent, ChatGPT neither terminated the session nor initiated any emergency protocol. The parents further alleged that "ChatGPT actively helped Adam explore suicide methods."