this post was submitted on 14 Feb 2026
92 points (96.0% liked)

[–] asbestos@lemmy.world 22 points 12 hours ago (1 children)

Insert last part of the page:

And if your answer is to just say "Um, Nicole, you could never trust the internet, people could tell lies or even just honest mistakes about topics, you do it all the time", please try to understand the point I'm trying to make beyond just a reading of the headline.

[–] k0e3@lemmy.ca 6 points 10 hours ago (1 children)

I read the whole thing and I wasn't 100% sure what the point was beyond the title. I still feel like responding with "it always has been."

I guess it's that she likes to give new websites she finds the benefit of the doubt on the off chance that she might gain new knowledge about a subject, but:

  • people perpetuate misinformation to make a quick buck off ad revenue

  • LLMs make the problem even worse

  • she feels powerless to do anything about it

None of which is exactly a new revelation if you've been paying attention for the last decade or so.

But as someone looking for a copy of Phantasy Star, I found out there's a reprint of it on Genesis thanks to this blog, so that's good! Now to see if they're telling the truth...

[–] cheesorist@lemmy.world 1 points 57 minutes ago

LLMs make the problem even worse

It's way worse. A couple of years ago I'd look something up (it had to be somewhat popular) and find a few clearly bullshit articles, because they follow similar scripts and rarely make new ones.

Nowadays, every time I look something up there's a new AI article site that covers the very niche thing I'm looking for, and skimming through it, it looks somewhat convincing.