this post was submitted on 01 May 2025
65 points (77.3% liked)

Technology

[–] drdiddlybadger@pawb.social 89 points 1 month ago (2 children)

Is anyone else hating a lot of these current articles that are sparse as fuck on detail? How are they actually using generative AI? Where is it being applied? Just telling me that it's tools for editors and volunteers doesn't tell me what the tool is doing. 😤

[–] Zarxrax@lemmy.world 64 points 1 month ago (3 children)
[–] lime@feddit.nu 74 points 1 month ago

Ah, so no generative AI used in actual article production, just in meta stuff and for newcomers to ask questions about how to do things.

[–] pelespirit@sh.itjust.works 42 points 1 month ago* (last edited 1 month ago)

Yeah, this seems like an anti-Wikipedia piece. They're just using it for translation, catching spelling errors, assessing content quality, and so on.

Wikipedia’s model of collective knowledge generation has demonstrated its ability to create verifiable and neutral encyclopedic knowledge. The Wikipedian community and WMF have long used AI to support the work of volunteers while centering the role of the human. Today we use AI to support editors to detect vandalism on all Wikipedia sites, translate content for readers, predict article quality, quantify the readability of articles, suggest edits to volunteers, and beyond. We have done so following Wikipedia’s values around community governance, transparency, support of human rights, open source, and others. That said, we have modestly applied AI to the editing experience when opportunities or technology presented itself. However, we have not undertaken a concerted effort to improve the editing experience of volunteers with AI, as we have chosen not to prioritize it over other opportunities.
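For a sense of what "quantify the readability of articles" can mean in practice, here's a minimal sketch using the classic Flesch reading-ease formula. This is purely illustrative and not WMF's actual tooling; the syllable counter is a crude heuristic.

```python
import re

def count_syllables(word: str) -> int:
    """Rough syllable count: number of vowel groups, minimum 1."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text: str) -> float:
    """Flesch reading ease: higher scores mean easier-to-read text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

if __name__ == "__main__":
    sample = ("Wikipedia's model of collective knowledge generation has "
              "demonstrated its ability to create verifiable and neutral "
              "encyclopedic knowledge.")
    print(f"Flesch reading ease: {flesch_reading_ease(sample):.1f}")
```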

[–] sugar_in_your_tea@sh.itjust.works 6 points 1 month ago (1 children)

I'm a manager of sorts, and one of the people who report to me used gen AI in their mid-year review. Basically, they said, "make this sound better," and the AI spit out something that reads better while still having the same content. In the past, this person had continually been snarky and self-deprecating, and the AI helped make it sound more constructive.

I hope that's what's happening here. A human curates the content, runs it through the AI to make it read better, then edits from there. That last part is essential though.
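That "make this sound better" workflow is basically one API call plus human review. A minimal sketch assuming the OpenAI Python client; the model name and prompts here are placeholders, not anything from the article:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def polish(draft: str) -> str:
    """Ask the model to rewrite a draft without adding new claims."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "Rewrite the user's text so it reads more "
                        "constructively. Keep the same content; do not "
                        "add new claims."},
            {"role": "user", "content": draft},
        ],
    )
    return resp.choices[0].message.content

# The human still reviews and edits whatever comes back.
print(polish("My snarky, self-deprecating self-review goes here."))
```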

[–] FourWaveforms@lemm.ee 1 points 1 month ago (1 children)

What kind of sorts do you manage?

Software engineers. I'm also a software engineer.