An Apple Health and ChatGPT Integration Could Be Coming

You might soon be able to have a more personalized experience when asking ChatGPT health-related questions. MacRumors contributor Aaron Perris spotted an Apple Health icon hidden within the ChatGPT app’s code on Monday. The icon was reportedly accompanied by images related to activity, sleep, diet, breathing and hearing, suggesting that the Apple Health app may soon be able to connect to ChatGPT.

If the AI chatbot is given access to your health and fitness data from Apple Health, you could receive more customized answers when you ask ChatGPT for health-related advice. It remains unconfirmed whether or when this integration will arrive, or how it would work within the app.

Representatives for Apple and OpenAI did not immediately respond to a request for comment.

Security and privacy safeguards are also unclear at this stage. Apple Health can currently share data with certain contacts, healthcare providers and third-party apps, but only with your consent (you can adjust those settings and permissions). The question is whether users will feel comfortable handing over highly sensitive information without clear AI guardrails in place.

Currently, ChatGPT is integrated with several third-party services through a feature called “apps,” including Google Drive, Peloton, Spotify, Zillow, Slack, Dropbox and many others.

(Disclosure: Ziff Davis, CNET’s parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)

While you can ask ChatGPT questions about subjects such as life, finances and even wellness, the accuracy of its advice is questionable.

Experts are increasingly raising alarms about the dangers of relying on general-use chatbots for health concerns, since these AI models are not qualified health professionals, cannot provide care and don’t always tell the truth. Using a chatbot to diagnose physical health issues or as a substitute for professional therapy is not recommended. Even executives at OpenAI recommend exercising good judgment and not trusting all of the information ChatGPT shares, since it can hallucinate.
