submitted 1 year ago by wiki_me@lemmy.ml to c/opensource@lemmy.ml
[-] Spzi@lemm.ee 1 points 1 year ago

Do car manufacturers get in trouble when someone runs somebody over?

Yes, if it can be shown that the accident was partly caused by the manufacturer's negligence: a safety measure was missing or did not work properly, or accidents happen suspiciously more often with models from that brand. Beyond the legal trouble, they can also get into PR trouble if enough people start to think that way, whether or not it's true.

[-] mojo@lemm.ee 1 points 1 year ago
[-] Spzi@lemm.ee 1 points 1 year ago

Then let me spell it out: if ChatGPT convinces a child to wash their hands with home-made bleach, be sure to expect lawsuits and a shitstorm coming for OpenAI.

If that occurs but no liability can be established on ChatGPT's side, be sure to expect petitions and a shitstorm coming for legislators.

We generally expect individuals and companies to act with the peace and safety of others in mind, including strangers and minors.

Liabilities and regulations exist for these reasons.

[-] mojo@lemm.ee 1 points 1 year ago

Again... this is still missing the point.

Let me spell it out: I'm not asking companies to host these services, so they would not be held liable.

For this example to be relevant, ChatGPT would need to be open source and let you plug in your own model. We should have the freedom to plug in our own trained models, even uncensored ones. That's already possible with LLaMA and other AI systems right now, and I'm encouraging Mozilla's AI effort to allow the same thing.
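
To make "plug in your own model" concrete, here is a minimal sketch using the llama-cpp-python bindings; the model path, prompt, and parameters are placeholders I'm assuming for illustration, not anything from this thread, and any other local runner would work just as well:

```python
# Minimal sketch: running a locally stored LLaMA-family model with the
# llama-cpp-python bindings. The weights path is a placeholder; any GGUF
# model you have downloaded yourself (filtered or not) loads the same way.
from llama_cpp import Llama

# Load the model entirely from a local file; no hosted service is involved.
llm = Llama(model_path="./models/my-local-model.gguf", n_ctx=2048)

# Run a single completion against the local model.
output = llm(
    "Q: What does it mean for an AI model to be open source? A:",
    max_tokens=128,
    stop=["Q:"],
)
print(output["choices"][0]["text"])
```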
