rule (lemmy.world)
[-] Lemongrab@lemmy.one 7 points 1 year ago

With the increasing complexity of machine learning models, even their designers can't understand how they function (which inputs lead to a given output). Open source doesn't mean safe at all. And even if a model functions as intended, what happens when there is a vulnerability (or zero-day), or when the device reaches EOSL?

this post was submitted on 08 Oct 2023
613 points (100.0% liked)
