
While the motives may be noble (regulating surveillance), models like Stable Diffusion could get caught in the regulatory crossfire, to the point where using the original models becomes illegal and new models get neutered until they are useless. Further, this might make it impossible for individuals or smaller startups to train open source models (maybe even LoRAs). Adobe and the other large corporations would rule the market.


They will just claim it "can be used to create illegal pornography" and ban it entirely. And "fix faces" will be classified as high-risk AI because it can detect faces. 🤦

The only idea I've got to combat this is to make it available to as many people as possible, so they can speak in favor of it. Other ideas welcome.
