HealthGPT - Apple Health plus ChatGPT minus privacy

Maybe it's because this came out of a reputable institution like Stanford, but this project feels wildly irresponsible. Not only because OpenAI's language models frequently hallucinate complete nonsense, but because the app uploads your aggregated health data to OpenAI.

Apple went to pretty absurd lengths to keep individuals' health data private and secure. While various apps that integrate with Apple Health access more data than they need and surely phone too much of it home to their application servers, the idea of just blasting all your health data wholesale at OpenAI seems… bad, maybe?

This disclaimer doesn't inspire much confidence:

HealthGPT is provided for general informational purposes only and is not intended as a substitute for professional medical advice, diagnosis, or treatment. Large language models, such as those provided by OpenAI, are known to hallucinate and at times return false information. The use of HealthGPT is at your own risk. Always consult a qualified healthcare provider for personalized advice regarding your health and well-being. Aggregated HealthKit data for the past 14 days will be uploaded to OpenAI. Please refer to the OpenAI privacy policy for more information.

Hopefully the fact that you have to jump through a bunch of hoops to build and install this thing means few people will use it and nobody will end up hurting themselves. But if it were in the App Store, it's hard to imagine this not leading to a lot of really bad health outcomes. (And that's the best-case scenario—imagine if OpenAI one day changes their privacy policy and starts selling insurance underwriters a version of the language model tuned with user-submitted data like this.)
