It's not really surprising, and there are already quite a few studies on cognitive decline related to AI usage out there. I guess the wide-scale effects will only be visible in a few decades, but I suspect it will look a lot like Idiocracy.
Fuck AI
"We did it, Patrick! We made a technological breakthrough!"
No shit. LLMs are the antithesis of education, the basis of which is hands-on practice. Using a glorified autocomplete to do your assignments should be barred from education without exception.
This is the tip of the iceberg. The world is facing a critical collapse of the professions. We're nearing a point that was foretold in sci-fi, where humans no longer understand how the machine works, just that they can use it and it works.
Except sci-fi was too optimistic. Science fiction has actual AGI. We have glorified autocomplete. A non-intelligence.
A huge part is the tools they're using to "detect" AI use. My sister-in-law fought with Grammarly, which she was required to pay for and use by the school, to prove her paper wasn't written by AI.
She spent twenty hours trying to "de-ai" a paper that wasn't written by AI. The only things that worked were using bad grammar and poor sentence structure. The class was pass/fail, and probably for that reason.
The bubble needs to pop yesterday
Having to add AI to the products I work on is destroying my ability to think.
During a conversation with my sister about going back to school to finish her electrical engineering degree she basically said this:
As an older student, what /I/ see is a bunch of students who don't really know how the workforce works, but who HAVE grasped these facts:
- The school wants you to have a social life. That's why there's all these events, and they cancel class for games and stuff sometimes. So you SHOULD be doing social stuff.
- It's a dog-eat-dog world, and the only way to get ahead is to use every tool at your disposal, including and especially GenAI.
- Everybody is doing it, especially the smart people, including the grad students, AND sometimes even the teachers. Even the professionals are like "this is how you can use ChatGPT!"
- So, to get everything done and have A Good College Experience, obviously you just use ChatGPT! Why spend hours in the library like all those losers?????
And, then, on the other hand, I see:
A bunch of very tired, overworked, overwhelmed teachers who are doing their damndest with students who can't do even half of the bare minimum that was expected when they went through college.
Bending over backwards and then back upwards to pass these kids, by hook or by crook.
Like, people giving 100s of points of extra credit.
People putting together a whole website of specific terms for each unit AND the slide shows from the classes. Quizzes that are pass/fail - as in, you get the credit if you take the quiz on time, period, regardless of what you get, so you can use it to review. AND the weekly essays are either:
- 750 word summary of the reading + answer 4 questions (no word limit) + pose two questions about the text, no specific formatting required.
OR
- 250 word reflection: how does what we learned apply to your life? No specific format required.
I was writing 5 page research papers for classes my first year of college. What the fuck. What the fuck.
(She went back to school to finish her degree)
You're telling me you can't write a 750 word SUMMARY that doesn't need to be formatted beyond "put your name on it and use punctuation"?????????
The classes themselves are not hard. IF you have a good grounding in logic and math and writing. Which these people DO NOT.
They have a good grounding in letting ChatGPT do the work for them and letting the teacher tell them how to do everything.
And goddamn I might get 77s because I can't do algebra and I can't do math fast enough to finish a test.
But at least I'm not failing because I can't THINK.
She also mentioned that a lot of professors are trying to walk a thin line between failing students (who will then go to places like ratemyprofessor.com and leave what essentially amounts to bad reviews, which can threaten their employment) and passing students who aren't actually grasping the basics. I think social media is just compounding the problem because of that.
Imagine working in fast food and already getting complaints all the time and then having to worry about someone putting you on a rate my server website where they trash talk you and you have no recourse to have that information taken down.
At least with yelp it's not first and last names and it's the business that takes the flak.
Then fail them. Oh, that would look bad? The "system" won't allow it? Remind me, what's your fucking job again?
It's absolutely terrifying. I am a returning student to uni in my thirties and the only person not using any AI. They literally depend on it.
I just had a classmate the other day turn to me, frustrated, saying "You ever ask chat(gpt) a question and it gives you a whole, like, paragraph you then have to read? like, why can't it simplify it?"
Did I mention I am an electrical/computer engineering double major? So yeah, even reading is too much for these kids. Future workforce is fucking cooked.
Oh yes! We are fucked.

I’d say that social media platforms opened that door a long time ago. A bunch of people telling others what’s cool, what “truth” is being hidden from you, click here for the “fact” you won’t believe - anything but actually thinking critically about what you’re presented with. People search social media for answers before they look at Wikipedia, never mind considering a summary from actual scientific sites.
Couple that with algorithms pushing tailored engagement bait over boring facts, and AI just wraps it all up in a top search result shoehorned above 10 “Sponsored” links and SEO garbage. There’s no room for boring old critical thought and truth anymore.
I'm sympathetic to the use of generative AIs insofar as I want society to advance to something very close to Federation society in Star Trek.
Three or four centuries from now, humanity has faced several brutal wars and has come together finally to establish a utopia. The society is moneyless and classless, and the state as much as it exists is a mechanism for mutual aid to other aligned planets. I won't go into too much lore but this is allowed because the matter replicator eliminates scarcity; almost any material good including food can be created out of essentially nothing.
What that means for society is that, as a member of the Federation, you can do literally anything you want with your life. Want to study art and become a sculptor? Excellent, you don't need to pay rent or buy food so you can do that to your heart's content. You don't even need to be good at it. Want to be a farmer on a vineyard in France and make excellent wine? Great. Want to be a farmer on a vineyard that grows grapes to make raisins? Again, you do you. Run a Cajun restaurant in New Orleans if you want; you don't have to buy ingredients and your customers don't have to pay, so the only people that do anything are the people who actually want to be there.
Back to the real world. Almost everyone in university is there because they need a job to receive money in order to pay for the things they need to survive, and the job requirements say "this degree is required". Plenty of people become engineers, for example, because they know the pay will be good, not because they have an intrinsic desire to be engineers. Plenty of people who get engineering degrees stay in engineering roles they hate (or who would do other things if the pay was as good) for the same reason, even though plenty of engineering jobs might only require 2% of what was learned during the course of the degree, if that. The degree is a bureaucratic requirement for survival in a capitalist system and not an actual measure of ability, skill, or desire. That same capitalist system will discard them the minute they stop being useful. This doesn't exactly breed contentment and a willingness to contribute to the greater good of humanity like we see in Star Trek.
Using AI to pass the arbitrary requirements needed to exist in an arbitrary society is something I find understandable. If we wanted people to go to school for the sake of learning, to better themselves as human beings, that would be a different situation altogether, but that is nowhere near the system we have actually set up.
man, that's a whole lot of fantasy wrapped up around a tool that's propping up fascist oligarchs that are driving the world into a forced plutocracy where we're all enslaved to work for "the company".

I thought the backstory was important to help to cross all my T's and dot all my... lower case J's
The kids who were 12 when the pandemic happened are now 18 and will be having their own kids soon.
Fuck I'm old.
I've heard much of the same from my friends who teach middle and high schoolers: most alarmingly that they can put information up on the board, ask a question about it, and the students don't even connect that the answer is already in front of their eyes.
And sadly, a very common question they get is: "If AI can do this for me, why do I need to learn it in the first place?"
The worst part is that, in the short-term, the only recourse people have is suing social media and LLM companies, who are awash in cash and happy to settle, or throw their weight behind age verification, which in its various forms poses a security risk. Parents, clearly, are parking their kids in front of screens and unwilling to parent, so that's not something you can depend upon.
I'm just glad I never procreated, but this problem is going to affect us all when these kids try to enter the workforce and can't actually do anything.
Reminds me a lot of teachers lying that we wouldn't have calculators with us at all times.
Valid point: One needs to know how arithmetic works in order to get a computer/calculator to do it for you. This is fair; I use CAD software to design furniture, I do it parametrically, I have to solve problems like "if the overall width of the table top is 24 inches, the top overhangs by two inches all the way around, and the legs are an inch and a half wide, how long does the apron board between the legs need to be? And how long do I cut the board to add 3/4" long tenons?" I have to keep order of operations in mind there. But I write the expression and allow the computer to solve it.
Also valid point: I haven't once done long division since middle school, because guess what? I have machines for that. I have had use for the concept of quotients and remainders...but I had to learn for myself how to get computers to calculate them using modulo operators. 5 / 2 = 2 (integer division), 5 % 2 = 1. Five divided by two is two remainder one. The algorithm of drawing the sideways L and putting one number under it and the other number to the left of it and then doing long division is not something I needed an entire semester of practice doing. You can't convince me that was designed for my benefit, that was designed to keep me quiet.
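The two calculations above - the apron length from the table example and the quotient/remainder - can be sketched in a few lines of Python (the dimensions are the hypothetical ones from the table example, not from any real plan):

```python
# Dimensions in inches, taken from the table example above.
top_width = 24      # overall width of the table top
overhang = 2        # top overhangs this much on each side
leg_width = 1.5     # each leg is this wide
tenon = 0.75        # 3/4" tenon on each end of the apron

# Shoulder-to-shoulder apron length: subtract both overhangs and both legs.
apron = top_width - 2 * overhang - 2 * leg_width   # 17.0

# Cut length adds a tenon on each end.
cut_length = apron + 2 * tenon                     # 18.5

# Quotient and remainder without doing long division by hand.
quotient, remainder = divmod(5, 2)                 # (2, 1)

print(apron, cut_length, quotient, remainder)
```

Same order-of-operations thinking as the CAD expression: write the formula once, let the machine evaluate it.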
Modern school is framed largely as a series of assignments one needs good grades in, in order to be allowed to do something. A high school diploma is required to...practically be a citizen. "You have to get a good grade on this essay because it's required for you to pass this class, which is required for you to graduate and get your diploma that we are legally required to force you to get."
Children aren't stupid, they know what bullshit is and they don't like having their time wasted any more than adults do. Children as a demographic have dozens of millennia of experience growing up into adults, they've been playing house and playing job since the invention of houses and jobs, they played cave and played hunt before that. They can feel when school isn't like house or job and won't help them do house or job. And it's gotten to the point where that describes most of school, because they focus more on the difficulty of a class or test or curriculum than its usefulness.
There's a lot more to life than house and job and learning how to learn is the most important thing you can get from grade school.
This is a consequence that the Epstein class will welcome and invest more heavily in. And the rest of us will have to deal with literal idiots for the rest of our lives.
I would like to think that, if I were a student, I would use AI to explain things to me that I had too much anxiety to ask about. I'd probably use it on my homework, but mainly to check that I did it right? I'd "like" to think that, but I wonder whether the temptation would get me to just shortcut it...?
I'm in software and we have to use it for work. There's some toxic positivity going around with it: everyone is saying how they don't write code any more and how proud they are of that. They even turn their nose up at me because I don't use it as much - I'm lower on the Cursor dashboards than them. I like using AI to explain things to me, but I'm mainly a senior, so I don't write as much code; I actually spend more time reading, and it's far more arduous. We have juniors turning into prompt monkeys where they just put everything into AI, and if you ask them why, they just put your question into AI and give it back to you. To be clear, that's been happening to other seniors who work with them. If it were me, I'd rip them apart and tell them I can find a quick way to cut costs in the company if they don't want to learn anything.
… they just put everything into AI and if you ask them why, they just put your question into AI and give it back to you.
This must be what it's like for your planet to be taken over by an alien hive mind.
I refuse to believe that the Pluribus series from Vince Gilligan is not actually about AI. There are so many scenes in it that could be adapted to people overusing LLMs.
I'd probably use it to do my homework, but mainly to check that I did it right.
This is where I’ve found it useful. Or I’ll tell it not to give me the answer, but to show me the method I should follow for a problem I don’t know how to solve.
The problem is that this doesn't work if you don't know the subject matter already. You can't tell the facts from the inevitable hallucinations.
A possible mitigation might be to ask the same question in different phrasings multiple times, and ideally ask different models, to distill the truth that way. But I'm not sure you save any time that way; it might actually take longer than just learning the old-fashioned way. Plus you don't learn how to learn, so it's a net negative.
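The cross-checking idea above boils down to a majority vote over rephrased questions and models. A minimal sketch, where `ask(model, prompt)` is a hypothetical stand-in for whatever API you'd actually call (not a real library function):

```python
from collections import Counter

def cross_check(question_phrasings, models, ask):
    """Ask the same question, phrased several ways, to several models,
    and keep an answer only if a clear majority converged on it.
    `ask(model, prompt)` is a hypothetical stand-in for a real API call."""
    answers = [ask(m, p) for m in models for p in question_phrasings]
    best, votes = Counter(answers).most_common(1)[0]
    # Only trust an answer a strict majority agreed on; otherwise bail.
    return best if votes > len(answers) / 2 else None

# Toy usage: two "models" that happen to give the same answer.
agree = cross_check(
    ["Capital of France?", "France's capital city?"],
    ["model-a", "model-b"],
    lambda model, prompt: "Paris",
)
# agree == "Paris"; if the answers had split evenly, we'd get None.
```

Of course this only filters out disagreements, not shared hallucinations, which is the commenter's point about it not replacing actually knowing the subject.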
What if you are not capable of learning the material on your own? Then it’s just tough luck.
So, it's all going to plan then?
So much AI slop in the professional world too.
I got my first obviously AI email from a boss recently. The tone didn't match his normal cadence of writing: it was sterile, repetitive, and could really have been summed up as, "Do you have additional information about item X that can help explain this to our customer?"
It was three paragraphs long.
On Thursday there was a guy in my history class who was using ChatGPT the entire time.
We were watching a documentary and supposed to take notes on a physical sheet of paper. Some of the boxes on that sheet of paper only needed like 4 or 5 words to get the gist of it.