this post was submitted on 11 Mar 2026
27 points (93.5% liked)
Hacker News
If this kind of jealousy exists, it's pretty nonsensical.
Machines are made in factories, using materials and machines that none of us Software Engineers could afford.
The only digital ICs I would be able to make by hand are the little handheld-sized AND gates, OR gates, flip-flops, etc. (which would still require more than one person and a significant investment), and those are far from being able to run software from even generations ago.
And I don't need to write software to make those things do my bidding. I can break down the task, design the logic, and build it from the ground up using those gates, because the logic itself (and not really the language) is the value I create.
And that will always take time.
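To make the "logic first, language second" point concrete, here is a toy sketch (my own illustration, not from the comment; the function names are invented) of composing the gate primitives mentioned above into a half adder in Python:

```python
# Toy illustration: the basic gates above, composed into a half adder.
# These functions are hypothetical stand-ins for physical gate ICs.

def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

def NOT(a):
    return 1 - a

def XOR(a, b):
    # XOR built only from AND/OR/NOT, as it would be on a breadboard
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

def half_adder(a, b):
    # sum is XOR, carry is AND: the logic is the value,
    # the language it happens to be written in is incidental
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```

The same wiring diagram could be realized in hardware or in any programming language; only the gate-level logic carries the engineering value.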
Also, automating that doesn't require an LLM. Once I can make a machine to do a thing, I can make a machine that makes the previous machine, simply because I have a way to build a device that is logically sound and consistent.
People put all different kinds of logic under the same umbrella of "AI" and act as if they have the same value, but they don't.
When you go with stuff like ML and Computer Vision, you might be shifting from an anvil to a hydraulic press. But when you start using LLMs for stuff other than language, expecting them to do logic and hoping they won't make a mistake somewhere you are not looking, that is far from it.
With a hydraulic press, you know which piece you can work on and up to what level, after which you switch to precision tools. But when you use a hammer to do a screwdriver's job, it may look just fine and may work where you are looking, yet it will end up failing in ways you don't know of, because you didn't realise it was a hammer and never took a good look at the screw.
This part is correctly said. What is lacking here is that we are using the wrong tool for the job, and the price we pay for it goes towards reducing our ability to use the software we create for ourselves.
The expectations for quantity increase.
The baseline quantity of output rises.
The maximum possible quality falls.
The expectations for quality decrease (as if they weren't already low enough).
The ability to understand your product vanishes.
And everyone ends up calling it "magic" and you a "chanter".
Tools of automation reduce unnecessary variations.
Deploying automation with deterministic devices en masse can help reduce variation and raise the bottom line, at the cost of maximum quality.
But when you start the automation from a non-deterministic point and use it to feed input into a deterministic system, GIGO applies: in exchange for the same trade-off of maximum quality, you get "GO" (garbage out).
Any engineer worth their salt can do proper logic. And most humans can learn a programming language, just like any language (sometimes more easily).
But if an engineer who doesn't understand a language now gives their logic to an LLM that writes the wrong logic in said language, what tells the Software Engineer that this was not the logic the engineer intended? And if the original engineer can tell that to the Software Engineer while reviewing the code, they can do the same while writing it.