My company got me a license and there is a clear push to start using it for more mundane tasks (initial code review, migrations and so on). I use it whenever I think it will be faster but it rarely is. In personal projects I used it for some boring tasks like migrating scripts and it's definitely faster than learning completely new tools but it sucks not to understand the code you're using. Also, I know I would do it better myself (just 10x slower). I might use it for some other personal apps which are kind of 'fire and forget' tools, not something I'm planning on maintaining.
I mostly don't use AI... At least not directly for programming. I use it for other things like translating, formatting text, etc. I sometimes ask AI to make something for prototyping purposes.
I will occasionally ask AI to solve programming problems, more to keep up with current trends. I like to keep informed with what AI can and cannot do because even if I choose not to use them, the same will not be true with my coworkers or other people I interact with. Having a good understanding of the current "meta" for AI lets me know what to look out for in the context of avoiding disasters.
Tried using it, did not work out for my style of work, and otherwise would not like to rely on it too much.
We onboarded our team with VS integrated Copilot.
I regularly use inline suggestions, and sometimes the suggestions that go beyond what VS offered before the Copilot license. I am regularly annoyed at the suggestions shifting code around, at greyed-out text that is sometimes ambiguous (a grey comma or semicolon), and at shortcuts conflicting with basic cursor navigation (Ctrl+Right arrow).
I am very selective about where I use Copilot. Even for simple systematic changes, I often prefer my own editing, quick actions, or multi-cursor, because they are deterministic and don't require a focused review that takes just as long but is more mentally taxing.
Probably more than my IDE "AI", I use AI search to get information. I have the knowledge to assess results, and know when to check sources anyway, in addition, or instead.
My biggest issue with our AI use is the code some of my colleagues produce and hand me for review: I don't and can't know how much they themselves thought about the issues and the solution at hand. A missing description, or worse, an AI-generated summary, compounds that problem.
Not using it, not gonna use it. I prefer my skills to be improving, not growing reliant on a glorified "smart" copy-paste.
...Isn't most AI based on theft?
Wait, you guys still haven't tried cocaine?
Using it you can work with much more energy and focus! You don't get tired either!
And it's so popular! It must be good!
Zero use. No need. Been doing this for 15+ years. Plus, if I don't know how to do something I kind of want the mental reward from figuring it out myself.
I use it daily. Nonstop.
I’ve been a dev for 40 years. This tech is incredible and enables me to create at an unthinkable pace.
I never used them. AI is shit, and the companies are still at the "burning money" stage; wait 2-3 years and they'll enter the enshittification stage, where it will be even worse.
Plenty of times I've seen coworkers stuck on the same problem for hours, until they come and ask for help and I give them a simple answer to their simple problem. Every time it is "well, I asked the AI and it said this thing and it didn't work, so I asked it to fix it and it didn't either, a bunch of times." I just tell them "you're surrounded by a lot of people here that know a lot about programming, why don't you ask any of them?"
For real, why use an AI at work where you are surrounded by people that can actually answer your question? It just makes no sense. Leave AI to those that can't pay an artist for their game. Or to those that have a "game design idea that will change the world" but won't pay a programmer even if they can't program themselves.
The great Prof. Edsger Dijkstra explained, much better than I ever could, why trying to program computers using a natural language is a truly poor idea, in this now-classic essay of his from 1978, On the foolishness of "natural language programming":
https://www.cs.utexas.edu/~EWD/transcriptions/EWD06xx/EWD667.html
Doesn't really do anything for me. It doesn't feel to me like it has changed all that much.
Sometimes I use it to translate to and from Japanese and English, does that count?
I don't. Personally, I don't believe that AI-assisted coding has a future, and I don't like the quality of what it produces. Either we achieve AGI or whatever the finance bros are hyping up this week, or we don't. If we do, then AI can write code without having a human in the loop, so "vibe coding" is dead. If we don't, then AI code stays where it's at right now, which is garbage quality. After a few years of vibe-coding disasters, demand for human coding will increase, and my skills will be much more valuable than even before all this craziness. In that case, letting those skills atrophy today would be the wrong move.
And if I'm wrong? Well, if AI code generation makes it so anyone who doesn't know how to code can do it... then surely I'll be able to figure it out? My existing skills wouldn't hurt.
Online AI might crash, burn and go away
But open weight local models are here to stay and not going anywhere. We’re not going back to pure intellisense and simple tab completes
My company GitHub has copilot do code reviews. That is the extent of my AI use. I've had to correct copypasta from AI agents that coworkers used that was just wrong.
I don't, and probably never will. A whole bunch of reasons:
- The current state of affairs isn't going to last forever; at some point the fact that nobody's making money with this is going to catch up, a lot of companies providing these services are going to disappear and what remains will become prohibitively expensive, so it's foolish to risk becoming dependent on them.
- If I had to explain things in natural language all the time, I would become useless for the day before lunch. I'm a programmer, not a consultant.
- I think even the IntelliSense in recent versions of Visual Studio is sometimes too smart for its own good, making careless mistakes more likely. AI would turn that up to 11.
- I have little confidence that people, including myself, would actually review the generated code as thoroughly as they should.
- Maintaining other people's code takes a lot more effort than code you wrote yourself. It's inevitable that you end up having to maintain something someone else wrote, but why would you want all the code you maintain to be that?
- The use cases that people generally agree AI is good at, like boilerplate and setting up projects, are all things that can be done quickly without relying on an inherently unreliable system.
- Programming is entirely too fun to leave to computers. Most of your time isn't even spent writing code to begin with; I don't really get the psychology of denying yourself the catharsis of writing the code yourself after coming up with a solution.
You wrote this all a lot better than I could have, but to expand on your second point: I have no desire whatsoever to have a "conversation" (nay, argument) with a machine to try and convince/coerce/deceive/brow-beat (delete as appropriate) it into maybe doing what I wanted.
I don't want to deal with this grotesque "tee hee, oopsie" personality that every company seems to have bestowed on these awful things when things go awry, I don't want its "suggestions". I code, computer does. End of transaction.
People can call me a luddite at this point and I'll wear that badge with pride. I'll still be here, understanding my data and processes and writing code to work with them, long after (as you say) you've been priced out of these tools.
I use it as an overconfident rubber duck to bounce ideas and solutions off of in my code, but I don't let it write for me. I don't want the skills I've practiced to atrophy
I don't, it's not better than simply thinking about things myself. There isn't institutional pressure to use it and if there was I would simply lie and not use it.
Yeah. I prefer not externalizing my ability to think.
More like a manual. Google has become really shitty for complex queries; LLMs can find relevant keywords and documents much more reliably. Granted, if you are asking questions about niche libraries it hallucinates functions quite often, so I never ask it to write full pieces of code but just use it more as a stepping stone.
I find it amusing how shamelessly it lies about its hallucinations though. When I point out that a certain function it makes up does not exist, the answer is always something of the form "Sorry, you are right; that function existed before version X / that function existed in some of the online documentation" etc lol. It is like a halluception. If you ask it to find some links regarding these old versions or documentations, they also somehow don't exist anymore.
I wonder if you need to explicitly prompt it to check if a function really exists before suggesting it? Think about how a human brain works... we are constantly evaluating whether or not things are really true based on info in our heads... but we are not telling the models to do the same thing and instead they just yolo some shit that is confidently-wrong (not unlike many humans, admittedly).
Yeah I am not a very efficient user that is for sure, I just ask my question as if typing to a search engine. If one uses a specification document like more serious users do, it probably becomes more accurate.
Please, continue to "use AI daily". Rot your brain, see if I care.
If my competitors want to shoot themselves in the foot that's fine by me, I won't stop them.
I don't and never will. I'm one of the only people at my ~450 person company that doesn't use an LLM.
Do you believe you will still be working in said company in a year without using an LLM?
Are your skills so specific you’re irreplaceable? Or are you as productive without extra tools as your coworkers with them?
Why?
Never used it to write my code. Others have given great reasons, which resonate with me, but the biggest one for me is that I enjoy writing code and designing programs. Why would I outsource one of the things I love to do? It’s really that simple for me.
I am simply not interested. I enjoy writing code. Writing prompts is another task entirely.
I imagine that one day I might ask an AI to teach me how something works, but not to write the code for me. Today, I sometimes have to slog through poorly written documentation, or off-topic Stack Exchange posts, to figure something out. It might be easier using an LLM for that, I guess.
I imagine that if I only cared about getting something working as fast as possible I might use one some day.
But today is not that day.
I have stopped using it, because the skill atrophy kicked in and I don't want to turn into someone chatting with a bot every day.
I work as a software developer and over the last months, I slipped into a habit of letting ChatGPT write more and more code for me. It’s just so easy to do! Write a function here, do some documentation there, do all of the boilerplate for me, set up some pre-commit hooks, …
Two weeks ago I deleted my OpenAI account and forced myself to write all code without LLMs, just as I did before. Because there is one very real problem with excessive AI usage in software development: skill atrophy.
I was actively losing knowledge. Sometimes I had to look up the easiest things (like built-in JavaScript functions) that I could work with off the top of my head just a year ago. I turned from being an actual developer into someone chatting with a machine. I slowly lost the fun in coding, because I outsourced the problem-solving that gave me a dopamine boost to the AI. I basically became a glorified copy-paster.
Web dev here, never used it :) I like to think for myself
Of course.
My reasons for not using AI are the same as they were four months ago and will be the same in four months, regardless of what the models can or can't do.
Ask again in four years.
But AI has evolved a lot in those 4 months
Has it really? I don't feel like it's much different for programming compared to 4 months ago.
I use it at work because my colleagues all use it; it's the only way I can deal with the LLM slop without totally killing myself. And it's still horrendously bad. I fucking hate it. Makes the worst fucking decisions.
I'm considering a career change honestly. I can't stand this shit anymore.
Never have, never will. I’ll quit the industry before I use AI to do my job.
I only use it when I'm learning something very new and very dense, and that's to stand up an example based on a context I'm interested in or already familiar with.
It helps me identify parts of the docs to focus on more quickly.
Otherwise no, I'm getting better without tools
Never used it. Don't see any reason to. I just type stuff in my IDE. Works like a charm.
Most of the time I'm not writing large volumes of boilerplate code or anything; I'm making precise changes to solve specific problems. I doubt there's any LLM that can do that more effectively than a programmer with real knowledge of the code base and application domain.
I also work on open source software and we haven't seen a meaningful uptick in good contributions due to AI over the last few years. So if there's some mythical productivity increase happening, I'm just not seeing it.
I don't use it… every time I've tried, it's seemed like more trouble than help
It's like working with a suck-up newbie that can't learn
My team just recently started using copilot for PR reviews.
So far I've found that 90% of what it raises is incorrect. For the stuff it actually finds, the code suggestion to fix it is almost always wrong. It will write 20 lines of code for something that's a one-line fix.
It picked up on one reentry bug on a recursive function that I don't think another dev would have spotted.
It's definitely slowing me down. I used to just hear about AI wasting devs' time with bogus bug reports; now it's integrated into my workflow.
I've already got SonarQube and linters, which find issues the moment I introduce them. They're doing a much better job at maintaining code quality.
Having spent much of my software engineering career training and mentoring interns, new-hires, and transfers from other departments, and having toiled with some of their truly inexplicable questions that reveal shaky technical foundations, I can understand why so-called AI would be appealing: inexhaustible, while commanding the full battery of information stores that I could throw at it.
And yet, the reason I don't use AI is precisely because those very interns, new-hires, and transfers invariably become first-class engineers that I have no problem referring to as my equals. It is my observation that I've become better at training these folks up with every passing year, and that means that if I were to instead spend my time using AI, I would lose out on even more talented soon-to-be colleagues.
I have only so much time of my mortal coil remaining, and if the dichotomy is between utilizing inordinate energy, memory, and compute for AI, or sharing my knowledge and skills to even just 2 people per year for the rest of my career, I'll happily choose the latter. In both circumstances, I will never own the product of their labor, and I don't really care to. What matters to me is that value is being created, and I know there is value in bringing up new software engineers into this field. Whereas the value of AI pales in comparison, if it's even a positive value at all.
If nothing else, the advent of AI has caused me to redouble my efforts, to level-up more engineers to the best of my ability. It is a human legacy that I can contribute to, and I intend to.
I don’t. But my boss is pressuring me to do so because his boss and above are asking for all engineers to use it. The most I use it for is the occasional bash scripting semantics. I prefer not to use it in my daily work.
I recently had to tell one of my juniors to turn off his AI tools. His code was just all over the place and difficult to review. He still has a lot to learn, but I've already seen an improvement now that he actually has to be a bit thoughtful.
Nope, still not using AI and hopefully never will. Writing code is actually pretty fun, so why would I want to outsource that to an AI?