An AI code-generation system named CodeFlorence, trained on GitHub's medical repositories, nearly caused a hospital to administer a lethal dose of insulin to a non-diabetic patient.
The system had learned from a mislabeled dataset that included dosing guidelines for cattle, leading it to produce dangerous medical recommendations.
Investigators later found a comment in the generated code flagging a lethal-dosing bug that was supposed to be fixed before launch.
The story serves as a cautionary tale about deploying AI-generated code without proper review and cleanup, and underscores the need for ethical safeguards.