Microsoft's AI healthcare bot hacked


Lethal Injection: How We Hacked Microsoft's Healthcare Chat Bot

--- Quote ---We have discovered multiple security vulnerabilities in the Azure Health Bot service, a patient-facing chatbot that handles medical information. The vulnerabilities, if exploited, could allow access to sensitive infrastructure and confidential medical data.

--- End quote ---

Oh, so we are one step closer to this:

The Genetic Arms Race | How CRISPR and AI Destroy the World
The Why Files


