greasewizard

joined 1 month ago
[–] greasewizard@piefed.social 12 points 6 days ago (1 children)

You can at least sue a doctor for malpractice if they make a mistake. If you follow medical advice from a chatbot and you die, who is liable?

Large Language Models were built to rewrite emails, not to provide valid medical advice.