OpenAI Would Like You to Share Your Health Data with Its ChatGPT
NEWS | 09 January 2026
Users will be able to upload their health data to ChatGPT in order to get what OpenAI has described as a more personalized experience.

OpenAI wants your health data. On Wednesday OpenAI, the company behind the wildly popular artificial intelligence chatbot ChatGPT, announced that some users will be able to feed their health information into the bot, from medical records to test results to data from health and wellness apps. In return, OpenAI says, users can expect ChatGPT to give them more personalized meal planning, nutrition advice and lab test insights.

In a blog post explaining ChatGPT Health on Wednesday, OpenAI said that more than 230 million people a week ask the company’s AI chatbot health-related questions. The new feature was designed in collaboration with physicians and is meant to help people “take a more active role in understanding and managing their health and wellness” while “supporting, not replacing, care from clinicians,” according to the company.

But as Scientific American and many other outlets have previously reported, some health experts have urged caution about using ChatGPT for health care, and especially for mental health. The company has faced legal scrutiny in recent years after several people, including at least two teenagers, died by suicide after interacting with ChatGPT. OpenAI did not immediately respond to a request for comment.

Other experts are more positive. Peter D. Chang, an associate professor of radiological sciences and computer science at the University of California, Irvine, says the tool represents a “step in the right direction” toward more personalized medical care. But he also cautions that users should take any AI-generated medical advice with a grain of salt. “Absolutely there’s nothing preventing the model from going off the rails to give you a nonsensical result,” Chang says. “Maybe don’t do exactly what it says but use it as a starting point to learn more.”

IF YOU NEED HELP
If you or someone you know is struggling or having thoughts of suicide, help is available. Call or text the 988 Suicide & Crisis Lifeline at 988 or use the online Lifeline Chat.
Authors: Claire Cameron and Jackie Flynn Mogensen