rule (lemmy.world)
submitted 9 months ago by nebula42@lemmy.world to c/196@lemmy.blahaj.zone
[-] Lemongrab@lemmy.one 7 points 9 months ago

With the increasing complexity of machine learning models, even their designers can't understand how they function (i.e., what input leads to a given output). Open source doesn't mean safe at all. And even if it functions as intended, what happens when there is a vulnerability (or zero-day), or when the device reaches EOSL?

this post was submitted on 08 Oct 2023
611 points (100.0% liked)

196


Be sure to follow the rule before you head out.

Rule: You must post before you leave.

