this post was submitted on 14 Mar 2026
308 points (97.5% liked)

Technology

[–] YetAnotherNerd@sopuli.xyz 119 points 1 day ago (2 children)

I’m getting that more and more. “I asked ChatGPT and it said”. Dude, we work for the same company and I could have typed that in, and maybe I did. I wanted your experience with it, that’s why I asked you.

Make sure they know they just lost input rights the next time. No, I don’t ask Harry; he just quoted GPT last time, and I’d already asked it this time, so there was no reason to involve him. Nothing is worse for a lead than people not wanting them to lead because they’ve abdicated the job to spicy autocorrect.

[–] Zos_Kia@jlai.lu 36 points 1 day ago (1 children)

Dude, we work for the same company and I could have typed that in, and maybe I did. I wanted your experience with it, that’s why I asked you.

To me it's like sending a "let me google that for you" link to answer a question. It's just bad form. I don't want your whole reasoning trace, man; I just want to know what you understand of it, and maybe you'll catch some detail I'm missing. It's simple: I won't read LLM output. My colleagues know it, and I get shit for it, but no, I am not digesting this material for you. Give me a three-bullet-point version in your own words. The point is not just the data exchange; it's also to make sure you're aware of the answer and we have a common truth.

Or failing that, just give me the fucking prompt, and at least I'll know whether you understand the question.

[–] ulterno@programming.dev 10 points 1 day ago (1 children)

Or failing that, just give me the fucking prompt and at least I’ll know if you understand the question.

This one's really nice. I should make it my go-to response to anyone doing that.

[–] Zos_Kia@jlai.lu 6 points 1 day ago

I'd love to take the credit, but I actually stole it from that link that made the rounds on Hacker News.

[–] AliasAKA@lemmy.world 18 points 1 day ago (1 children)

I think this is the way. After enough rounds of “[coworker] wasn’t asked because they only respond with LLMs, so I just ask the LLMs directly; I’m not sure what [coworker]’s expertise is anymore, so I don’t consult them,” I suspect the coworker may in fact stop responding with LLMs.

[–] YetAnotherNerd@sopuli.xyz 4 points 21 hours ago (1 children)

Maybe. But they may just paste it without GPT attribution, so we’ll see.

[–] AliasAKA@lemmy.world 3 points 20 hours ago

In my experience it is obvious, and calling people on it usually embarrasses them. I say something like, “I can just ask an LLM myself if I want this output. Please provide your own commentary.” If I were a manager and had an employee just copy-pasting that kind of output, I’d probably wonder whether that employee actually contributes anything.