submitted 4 months ago by ArcticDagger@feddit.dk to c/science@lemmy.world
[-] takeda@lemmy.world 5 points 4 months ago

I find it surprising that anyone is surprised by it. That was my initial reaction when I first learned about it.

I thought that since they know the subject better than I do, they must have figured this one out and I simply didn't understand it. But if you have a model that can create something, it needs to be trained, and you can't use the model itself to train it. It's similar to not being able to generate truly random numbers algorithmically without some external input.
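To make that concrete, here's a toy illustration of my own (not from any paper): repeatedly "train" a model on data, then sample the next generation's training data from the model itself. Using the simplest possible model, a fitted Gaussian, the spread of the data steadily collapses because each refit slightly underestimates the tails and there is no external input to pull it back.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: a small "real" dataset drawn from N(0, 1).
data = rng.normal(loc=0.0, scale=1.0, size=20)

stds = []
for generation in range(500):
    # "Train" a model on the current data: here, just fit a Gaussian.
    mu, sigma = data.mean(), data.std()
    stds.append(sigma)
    # Generate the next generation's training data from the model itself.
    data = rng.normal(loc=mu, scale=sigma, size=20)

print(f"std at generation 1: {stds[0]:.3f}")
print(f"std at generation 500: {stds[-1]:.3g}")
```

Run it and the standard deviation shrinks over the generations: the statistical analogue of the "no true randomness without external input" point.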

[-] aes@programming.dev 2 points 4 months ago

Sounds reasonable, but a lot of recent advances come from being able to let the machine train against itself, or against a twin or opponent, without human involvement.

As an example of just running the thing against itself, consider an autoencoder: a neural network given the objective of re-creating its input through a narrow layer in the middle. The bottleneck forces it to learn a narrower description (e.g. age/sex/race/facing left or right/whatever) of the feature space.
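A minimal numeric sketch of that idea (my own toy code, with made-up dimensions): a linear autoencoder squeezes 10-dimensional inputs through a 2-unit bottleneck and is trained purely on reconstruction error, with no labels and no human in the loop.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 10-dimensional points that really vary along only 2 directions.
basis = rng.normal(size=(2, 10))
codes = rng.normal(size=(200, 2))
X = codes @ basis + 0.01 * rng.normal(size=(200, 10))

# Linear autoencoder: encode 10 -> 2 (bottleneck), decode 2 -> 10.
W_enc = 0.1 * rng.normal(size=(10, 2))
W_dec = 0.1 * rng.normal(size=(2, 10))

def loss(X, W_enc, W_dec):
    recon = X @ W_enc @ W_dec
    return np.mean((recon - X) ** 2)

lr = 0.01
history = [loss(X, W_enc, W_dec)]
for step in range(500):
    H = X @ W_enc              # bottleneck codes
    R = H @ W_dec              # reconstruction
    G = 2 * (R - X) / X.size   # gradient of the loss w.r.t. R
    W_dec -= lr * (H.T @ G)
    W_enc -= lr * (X.T @ (G @ W_dec.T))
    history.append(loss(X, W_enc, W_dec))

print(f"reconstruction loss: {history[0]:.3f} -> {history[-1]:.3f}")
```

The only training signal is the network's own output compared against its own input, yet the loss drops: the bottleneck weights converge toward the 2-dimensional structure hidden in the data.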

Another is the GAN (generative adversarial network), where a generator and a discriminator play fake vs. spot-the-fake against each other until the generator gets good.
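Here's a 1-D caricature of that adversarial loop (my own sketch, heavily simplified): the generator learns a single shift parameter to move noise onto a target distribution, while a logistic-regression discriminator plays spot-the-fake; each learns only from the other. A small weight decay on the discriminator is added to keep the two-player dynamics from oscillating.

```python
import numpy as np

rng = np.random.default_rng(2)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# "Real" data comes from N(3, 1); the generator shifts noise: g(z) = z + b.
b = 0.0            # generator parameter (learned shift)
w, c = 0.0, 0.0    # discriminator parameters: D(x) = sigmoid(w*x + c)
lr, decay, batch = 0.05, 0.1, 64

for step in range(2000):
    real = rng.normal(3.0, 1.0, size=batch)
    fake = rng.normal(size=batch) + b

    # Discriminator step: ascend log D(real) + log(1 - D(fake)).
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    w += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake) - decay * w)
    c += lr * (np.mean(1 - d_real) - np.mean(d_fake) - decay * c)

    # Generator step: ascend log D(fake), i.e. move to where fakes fool D.
    d_fake = sigmoid(w * fake + c)
    b += lr * np.mean((1 - d_fake) * w)

print(f"learned shift b = {b:.2f} (target mean 3.0)")
```

No human ever labels anything: the discriminator's only supervision is which batch it drew, and the generator's only signal is the discriminator's opinion, yet the learned shift ends up near the real mean.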

this post was submitted on 26 Jul 2024
230 points (96.7% liked)
