Maybe because of the concept of entropy in Markov processes. Or in simpler words: the larger the share of AI-generated material in the overall corpus of text we read and use, the more degenerate it becomes over time.
Eh, I guess that doesn't sound much simpler, but I'm just clumsy with words. The general idea should be clear, I hope; the toy sketch below tries to show it.
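A minimal sketch of that idea, not taken from the thread itself: assume a toy first-order word-level Markov chain that is repeatedly refit on text sampled from its own previous generation, and watch the average entropy of its transition distributions shrink. The corpus, function names, and generation count are all made up for illustration.

```python
# Toy demonstration of "model collapse" in a Markov chain:
# each generation is trained only on the previous generation's output.
import random
from collections import Counter, defaultdict
from math import log2

def fit(words):
    """Count word -> next-word transitions."""
    table = defaultdict(Counter)
    for a, b in zip(words, words[1:]):
        table[a][b] += 1
    return table

def sample(table, length, rng):
    """Generate `length` words by walking the transition table."""
    word = rng.choice(list(table))
    out = [word]
    for _ in range(length - 1):
        counts = table.get(word)
        if not counts:
            word = rng.choice(list(table))  # dead end: restart at a random word
        else:
            word = rng.choices(list(counts), weights=counts.values())[0]
        out.append(word)
    return out

def mean_entropy(table):
    """Average Shannon entropy (in bits) of the next-word distributions."""
    entropies = []
    for counts in table.values():
        total = sum(counts.values())
        entropies.append(-sum(c / total * log2(c / total) for c in counts.values()))
    return sum(entropies) / len(entropies)

rng = random.Random(0)
corpus = ("the quick brown fox jumps over the lazy dog and the small cat "
          "watches the quick dog chase the brown fox across the field").split()

model = fit(corpus)
for generation in range(6):
    print(f"gen {generation}: mean transition entropy = {mean_entropy(model):.3f} bits")
    corpus = sample(model, 400, rng)   # the model's own output becomes...
    model = fit(corpus)                # ...the next generation's training data
```

In this setup, rare transitions tend to drop out of each generation's sample, so the entropy typically declines and the generated text grows more repetitive, which is the "degeneration" being described.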
To clarify, by "they" do you mean the language models?
Both the models and the texts they produce, which in turn are used for training.
Thanks, I see what you mean. When I first started reading about this topic, someone brought up low-background steel: https://en.wikipedia.org/wiki/Low-background_steel Grim but relevant analogy!
Yes, it's interesting. Especially since language, symbols, and text are arguably more important to humanity than steel.
It is.