this post was submitted on 13 Dec 2025
84 points (85.6% liked)
you are viewing a single comment's thread
I don't think we're using LLMs in the same way?
As I've stated several times elsewhere in this thread, I more often than not get excellent results, with little to no hallucinations. As a matter of fact, I can't even remember the last time it happened when programming.
Also, the way I work, no one could ever tell that I used an LLM to create the code.
That leaves your point #4, and what the fuck? Why does upper management always seem so utterly incompetent and clueless when it comes to tech? LLMs are tools, not a complete solution.
AI can only generate the world's most average quality code. That's what it does. It repeats what it has seen enough times.
Anyone who is really never correcting the AI is producing below average code. (Edit: Or expertly guiding it, as you pointed out elsewhere in the thread.)
I mean, I get paid either way. But mixing all of the world's code into a thoughtless AI slurry isn't actually making any progress. In the long term, a codebase with enough uncorrected AI input will become unmaintainable.
In my case it does hallucinate regularly. It makes up functions that don't exist in that library but do exist in similar libraries, so the end result is useful as a keyword even though the code itself is not. My favourite part is that if you point out that the function doesn't exist, the answer is ALWAYS "I am sorry, you are right, since version bla of this library this function no longer exists", when in reality it never existed in that library at all. For me the best use case for LLMs is as a search engine, and that is only because of the shitty state most current search engines are in.
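A concrete (hypothetical) instance of that failure mode: an LLM suggests `json.parse(...)` in Python, a function that exists in JavaScript's JSON API but was never part of Python's `json` module. A one-line `getattr` guard is enough to catch such made-up names before they get committed:

```python
# Illustration of the "function exists in a similar library" hallucination:
# JavaScript has JSON.parse, but Python's json module only has json.loads.
import json

suggested_call = "parse"  # hallucinated name (from JavaScript's JSON.parse)
actual_call = "loads"     # the function that really exists in Python's json

print(hasattr(json, suggested_call))  # False: the attribute was never there
print(hasattr(json, actual_call))     # True

# A quick defensive lookup surfaces the problem immediately:
fn = getattr(json, suggested_call, None)
if fn is None:
    print(f"json.{suggested_call} does not exist; did you mean json.{actual_call}?")
```

The names and the guard are just a sketch, but the point stands: calling the hallucinated function fails loudly at runtime, which is exactly why the generated code is worthless while the keyword it contains can still point you at the right API.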
Maybe LLMs can be fine-tuned to do the grinding aspects of coding (like boilerplate for test suites, etc.) with human supervision. But this will often end up being a situation where junior coders are fired or no longer hired, and senior coders are expected to babysit LLMs doing those jobs. That is not entirely different from supervising junior coders, except it is probably more soul-destroying. The biggest flaw in this design is that it assumes LLMs will one day be good enough to do senior coding tasks, so that when senior coders also retire*, LLMs take their place. If that breakthrough is never realized and the trend of keeping few junior coders sticks, we will likely have a programmer crisis in the future.
*: I say retire, but for many CEOs it is their wet dream to be able to let go of all coders and have LLMs do all the tasks.
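For readers unsure what "boilerplate for test suites" means here, this is a sketch of the kind of repetitive scaffolding an LLM could plausibly draft and a senior dev would then review. The function under test, `slugify`, is a made-up example defined inline so the sketch is self-contained:

```python
# Hypothetical test-suite boilerplate: repetitive, mechanical, and exactly
# the sort of thing the comment suggests an LLM could grind out under
# human supervision.
import unittest


def slugify(text: str) -> str:
    """Toy function under test: lowercase and hyphen-join the words."""
    return "-".join(text.lower().split())


class TestSlugify(unittest.TestCase):
    def test_lowercases_input(self):
        self.assertEqual(slugify("Hello"), "hello")

    def test_joins_words_with_hyphens(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_empty_string_stays_empty(self):
        self.assertEqual(slugify(""), "")


if __name__ == "__main__":
    unittest.main()
```

The individual test methods are near-identical in shape, which is what makes them tedious for a human and tractable for a model; the judgment calls (which edge cases matter, whether the assertions are meaningful) are the part that still needs the reviewer.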