this post was submitted on 20 Dec 2025
68 points (90.5% liked)

[–] solomonschuler@lemmy.zip 7 points 2 days ago

That AI (as in "generative AI") helps in learning if you give it the right prompt. There is evidence that when a user asks AI to implement code, they won't touch the result afterwards because they are unfamiliar with the code it generated. The AI effectively creates a psychological black box that no programmer wants to touch, the same way a programmer hesitates to modify even a (relatively speaking) small snippet of a larger program that was written by someone else.

To further generalize, I fully believe AI doesn't improve the learning process; it just makes a field feel more accessible to people who are less literate in it. I can explain Taylor expansions and power series simplistically to my brother, who is less familiar with math, but I would be shocked if, after a brief general overview, he could suddenly approximate any function or differential equation.
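For concreteness, this is the kind of result I have in mind: the general Taylor series of a function about a point a, with e^x as a stock example I picked purely for illustration.

f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}\,(x - a)^n

e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!} = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \cdots

Being shown this formula and being able to produce the expansion of a new function yourself are very different things.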

The same applies to ChatGPT: you can ask it to explain Taylor and power series solutions simplistically, or better yet, to approximate a differential equation for you, but that doesn't change the fact that you still can't replicate it yourself. I know I'm talking about an extreme case where the person trying to learn Taylor expansions has no prior experience with math, but it won't really work for someone who does, either...
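For reference, here is the sort of power series solution I mean, using y' = y as my own pick for the simplest possible example: assume y(x) = \sum_{n=0}^{\infty} a_n x^n, differentiate term by term, and match coefficients.

y'(x) = \sum_{n=1}^{\infty} n a_n x^{n-1} = \sum_{n=0}^{\infty} (n+1)\,a_{n+1} x^n

Setting y' = y gives (n+1)\,a_{n+1} = a_n, so a_n = \frac{a_0}{n!} and y(x) = a_0 e^x.

ChatGPT will happily walk through these steps, but following along is not the same as being able to set up the recurrence for an equation you haven't seen before.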

I want to pose a simple thought experiment from my own experience using AI on, say, Taylor expansions. Let's assume I want to learn Taylor expansions, I've already done differential calculus (the main prerequisite), and I ask ChatGPT "how to do Taylor expansions": what the proof of the general series expansion is, plus an example of applying it to a function. What happens when I then try to do a problem myself is that I hit a level of uncertainty about my ability to actually perform it, and that's when I ask ChatGPT whether I did it correctly or not. You sort of see what I'm saying: it's a downward spiral of losing your certainty, your sanity, and your time the more you use it.

That is what the programmers are experiencing. It's not that they don't want to touch it because they are unfamiliar with the code that the AI generated; it's that they are uncertain of their own ability to fix an issue without fucking it up even more. People are terrified of failure and of fucking shit up, and by using AI they "solve" that problem, even though the probability of the AI hallucinating is higher than if they had spent the time working through any conflicts themselves.