this post was submitted on 24 Mar 2026
15 points (77.8% liked)
Open Source
Don't worry, that's not gonna happen. A colleague of mine uses LLMs regularly for code reviews, but you have to put in the work to understand whether what they say makes sense, and to filter out the false positives, which are not occasional.
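One cheap way to thin out those false positives before a human review is a mechanical sanity check: drop any finding that points at a line the diff never touched, since those are often hallucinated locations. A minimal sketch of that idea (the data shapes and names here are hypothetical, not anyone's actual setup):

```python
# Hypothetical sketch: pre-screen LLM code-review findings before a
# human reads them. Findings that reference lines outside the diff are
# set aside as likely false positives rather than trusted.

def triage_findings(findings, changed_lines):
    """Split findings into (plausible, suspect).

    findings: list of dicts like {"file": str, "line": int, "note": str}
    changed_lines: dict mapping file path -> set of changed line numbers
    """
    plausible, suspect = [], []
    for f in findings:
        if f["line"] in changed_lines.get(f["file"], set()):
            plausible.append(f)  # points at code the diff actually touched
        else:
            suspect.append(f)    # likely hallucinated location; verify by hand
    return plausible, suspect

findings = [
    {"file": "app.py", "line": 12, "note": "possible off-by-one"},
    {"file": "app.py", "line": 99, "note": "unused import"},  # not in diff
]
changed = {"app.py": {10, 11, 12}}
plausible, suspect = triage_findings(findings, changed)
print(len(plausible), len(suspect))  # → 1 1
```

This only catches the mechanical failures, of course; the findings that survive still need a human to judge whether the claim itself makes sense.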
Exactly. LLMs can output code, but they don't understand long-term intent. As LLM usage grows, engineers who truly understand the system become more valuable, not less. Knowing which changes are safe and why decisions were made is now an even more critical skill. It's the backbone of any resilient system.