this post was submitted on 16 May 2025
647 points (98.4% liked)

Technology

70528 readers
3945 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; this includes using AI responses and summaries. To ask if your bot can be added, please contact a mod.
  9. Check for duplicates before posting; duplicates may be removed.
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
top 50 comments
[–] MagicShel@lemmy.zip 187 points 2 weeks ago (6 children)

For fuck's sake, people, it's not hard. AI can be useful for generating drafts or giving suggestions, but ultimately everything has to be tweaked/written by an actual human expert. AI is a tool, not a product. If something isn't edited enough that no trace of the AI signature is left, then you're being lazy and putting out garbage.

[–] Jhex@lemmy.world 50 points 2 weeks ago

It's "hard" because every peddler of AI is pushing it in exactly the way you're saying is wrong, and I agree.

[–] neon_nova@lemmy.dbzer0.com 13 points 2 weeks ago

This is it exactly. I use ChatGPT to double check things when I’m second guessing myself and I use it to make assignments.

Almost every time, I need to tweak things, but it turns 40 minutes of work into 5-10 minutes.

[–] dream_weasel@sh.itjust.works 4 points 2 weeks ago* (last edited 2 weeks ago) (5 children)

"No trace" isn't a necessary bar. You can learn business theory from a presentation that includes a three-armed gargoyle without any loss of information. Materials just need to be checked for factual accuracy, which this seems to meet.

Mr. Business Professor is probably one of the highest-paid instructors in the college, and his time is NOT well spent cruising the internet for PowerPoint images or formatting lecture materials. Frankly, that's not a good use of TA time either.

[–] MagicShel@lemmy.zip 5 points 2 weeks ago

Sooner or later I'll learn to caveat my AI comments to make it clear I'm only talking about LLMs/text-gen. I don't personally care about image-gen. It's garbage, but to quote Star Wars, sometimes "the garbage will do."

[–] finitebanjo@lemmy.world 4 points 2 weeks ago

It's a tool whose only purpose is to lie and generate bullshit. We're right to be upset when we find out we paid a human expert top fucking dollar to give us bullshit.

[–] Opinionhaver@feddit.uk 2 points 2 weeks ago (3 children)

Time after time, I see people who should know better fail at basic things like this.

Even I don’t get called out for AI-written responses, even though a big number of my messages here are technically written by AI. The key difference is that I actually take the time to write a first draft of what I want to say, then run it through ChatGPT to help clean up my word salad - and finally, I go over the output again to make it sound like me. The thinking is mine. AI just helps me communicate more clearly.

I’d never ask it to write an entire response from scratch without providing structure or points I want to make. All I want is for the person reading my message to understand what I’m actually trying to say - so they can respond to that, not to a misinterpretation of what I was trying to say.

I'll just leave that first draft here to illustrate my point:

Time after time I see people that should know better to fail at basic things like this.

Even I don't get called out for AI responses even though a huge number of my messages posted here are technically written by AI. However, the difference here is that I actually took time to first write the first draft of what I want to say only then to give it for chatGPT to make sense of my word salad only for me to then go over it's output to make it sound like me again. The thinking is done by me - AI only helps me to communicate more clearly. I'd never ask it to write the entire response from ground up without providing any structure and points about what I want to say. All I want is the person reading this message to get as clear of an understanding as possible of what I'm trying to say so that they can respond to that rather than to misintrepretation of what I was trying to say.

[–] Deebster@infosec.pub 3 points 2 weeks ago

This is a great use of AI, and it's caught some small errors, like the wrong "its" (which is one I find distracting when reading). The editing is light enough that it's still your voice, just with extra punctuation and fewer typos.

[–] blazeknave@lemmy.world 1 points 2 weeks ago

I think there's a sweet spot you can hit, but sometimes I fight with it so long to get what I want that by then, copy-pasting whatever it gives is good enough... To be fair, I'm not an educator.

[–] 11111one11111@lemmy.world 99 points 2 weeks ago (2 children)

"He's telling us not to use it, and then he's using it himself,"

Yeah, it sucks, but there is zero chance this argument holds any weight in court.

[–] BlameTheAntifa@lemmy.world 30 points 2 weeks ago (1 children)

The problem is that a student has a reasonable expectation of being taught by someone qualified and knowledgeable in the topic. If the professor is using AI, then that is a major breach of trust that brings into question the professor’s qualifications and whether you are actually getting the education you are paying for.

[–] PattyMcB@lemmy.world 6 points 2 weeks ago

Not to mention the risk of plagiarism

[–] thann@lemmy.dbzer0.com 25 points 2 weeks ago (9 children)

Yeah, I had teachers change the rubric on the day of the final, and even after, and the deans at UCSB didn't care at all. Teachers can do just about anything under the guise of education...

[–] mhague@lemmy.world 53 points 2 weeks ago (1 children)

"He's telling us not to use it, and then he's using it himself"

Just because the teacher might have screwed up doesn't change the fact that experts in a subject can assess LLM output, while a student who knows jack shit about the topic can't. Just because the teacher messed up and let AI weirdness degrade the quality of education in the eyes of students doesn't mean just anyone can use ChatGPT to generate college courses.

I read the original article but not the interview. I wonder how much communication there was about the work before the student decided they deserved a refund.

[–] arafatknee@lemmy.dbzer0.com 52 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

LLMs should augment your skills, not substitute for them. Anything else is just laziness or incompetence.

Or, worst-case scenario, it means your job is replaceable.

[–] FlashMobOfOne@lemmy.world 4 points 2 weeks ago

Agreed.

I'd be pissed too. This is sloppy as hell.

[–] Loduz_247@lemmy.world 50 points 2 weeks ago (3 children)

If I see a representative or senator using ChatGPT, could I demand that he resign from his position?

[–] TheGoldenGod@lemmy.world 46 points 2 weeks ago (1 children)

You could, but you would be pissing in the wind asking.

[–] Venator@lemmy.nz 2 points 2 weeks ago (1 children)

Isn't the phrase meant to be "pissing into the wind"?

[–] huquad@lemmy.ml 4 points 2 weeks ago

I'm already asking; no one listens to the people anyway.

[–] miridius@lemmy.world 40 points 2 weeks ago (3 children)

Holy crap... slashdot still exists??

[–] skooma_king@lemm.ee 27 points 2 weeks ago (3 children)

It’s still the same 10 users complaining about systemd

[–] DragonTypeWyvern@midwest.social 3 points 2 weeks ago

Never forget, never forgive

[–] Duamerthrax@lemmy.world 7 points 2 weeks ago

I checked it out again when Reddit did their API fuckery. The progressives left, and the libertarians are all that remain. I didn't stick around long enough to get a better feel for the situation.

[–] FriendBesto@lemmy.ml 4 points 2 weeks ago

Apparently, Digg may be coming back.

[–] lefixxx@lemmy.world 26 points 2 weeks ago (1 children)

Why do people not review their LLM's output?

[–] Vanilla_PuddinFudge@infosec.pub 39 points 2 weeks ago (1 children)

Well of course I do.

...Gemini, review this ChatGPT paragraph

[–] vivendi@programming.dev 5 points 2 weeks ago

This is unironically a technique for catching LLM errors, and also for speeding up generation.

For example, these kinds of setups are used in speculative decoding and mixture-of-experts architectures.
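Roughly: a cheap draft model proposes a few tokens, and the expensive target model only has to verify them, keeping the draft wherever they agree. A minimal toy sketch of that draft-and-verify loop (the `draft_model`/`target_model` names here are hypothetical stand-ins, hard-coded bigram lookups rather than real LLMs):

```python
# Toy sketch of the draft-and-verify loop behind speculative decoding.
# draft_model and target_model are hypothetical stand-ins for a cheap and
# an expensive LLM; real implementations compare token probabilities and
# verify a whole drafted run in one batched forward pass of the big model.

def draft_model(context: str, n: int) -> list:
    """Cheap model: quickly guesses the next n tokens."""
    guesses = {"the": "cat", "cat": "sat", "sat": "on", "on": "a"}
    tokens, last = [], context.split()[-1]
    for _ in range(n):
        last = guesses.get(last, "the")
        tokens.append(last)
    return tokens

def target_model(context: str) -> str:
    """Expensive model: the authoritative next-token prediction."""
    truth = {"the": "cat", "cat": "sat", "sat": "on", "on": "the"}
    return truth.get(context.split()[-1], "the")

def speculative_decode(prompt: str, max_new: int = 8, draft_len: int = 3) -> str:
    tokens = prompt.split()
    produced = 0
    while produced < max_new:
        drafted = draft_model(" ".join(tokens), draft_len)
        for tok in drafted:
            if produced >= max_new:
                break
            expected = target_model(" ".join(tokens))
            if tok == expected:          # draft verified: keep the cheap guess
                tokens.append(tok)
                produced += 1
            else:                        # mismatch: take the target's token
                tokens.append(expected)  # instead, then re-draft from the
                produced += 1            # corrected context
                break
    return " ".join(tokens)

print(speculative_decode("the"))  # -> "the cat sat on the cat sat on the"
```

The output always matches what the target model alone would have produced; the speedup comes from how many cheap guesses get accepted per expensive verification.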

[–] HexadecimalSky@lemmy.world 20 points 2 weeks ago (2 children)

I had a professor using worksheets watermarked by another professor at a college in another state; y'all think anything came of it? He also gave us all the answers to the tests in the form of self-graded quizzes and let us take them into tests.

HS diplomas became a joke, degrees are becoming a joke...

[–] ToastedRavioli@midwest.social 20 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

As someone who was a TA for a bit, I think that's 99% because if schools tried to hold students accountable to the standards of even ten years ago, they would have to fail two-thirds of their students.

High school becoming a joke means none of the kids have strong enough core skills to be tackling real college work by the time they get there, but schools can't afford to enforce actual quality standards for work. The graded model has completely fallen apart at this point, given how steep the curve is. The quality of work that gets an A today would have been a B or a high C 10-15 years ago. Of course there is real A-grade work being done too, but what defines an A has ballooned to a ridiculous degree, such that most of it is not really A-grade work.

The problem isn't new; it was already bad 10 years ago, to be honest. I had a professor in community college about 10 years ago who had previously taught at ASU, and she had quit teaching there specifically because the university wouldn't allow anyone to be graded below a C, regardless of whether they did any work or not.

Most large public universities are just degree mills at this point, or at least bordering on it.

[–] Alaik@lemmy.zip 3 points 2 weeks ago (1 children)

You say that, but I've had two classes this semester with a 70%+ fail rate. One of them probably needs addressing, in the sense that the professor was ass, but the other was just straight-up hard. They gave no fucks about failing over half the class. The prerequisite for that class also had a 60% failure rate (based on who I see repeating it).

I don't doubt some universities are degree mills, and ASU has always been known as a party school, but based on my experiences at three universities, I assure you it's not as widespread as some would believe.

That being said, the quality of student certainly seems to have dropped.

[–] HexadecimalSky@lemmy.world 1 points 2 weeks ago

Yeah, which is why where you get your degree sometimes matters. But with all the standardization, strictly speaking, an AAEE-T from a shitty school counts just as much as one from any other school, even if the quality isn't there.

[–] dream_weasel@sh.itjust.works 2 points 2 weeks ago (1 children)

Was your grade exclusively based on the tests and quizzes? That's the only part that's questionable here.

Every prof need not write their own unique materials; rather, they need to teach you the target material using whatever resources best suit that purpose.

[–] HexadecimalSky@lemmy.world 1 points 2 weeks ago

Most of the grade was on the tests, I think 70%, and then like 30% on the "lab" portion... which was basically attendance and whether you turned something in; he didn't check for accuracy. One of the worst professors at this college.

[–] finitebanjo@lemmy.world 18 points 2 weeks ago

A while back I saw a post on Lemmy accusing ASU, who is publicly partnered with OpenAI, of trying to quietly replace advisors with chatbots.

It's truly a dark time for USA higher education.

[–] Pheonixdown@lemm.ee 16 points 2 weeks ago (1 children)

I'm currently doing an online Master's with Northeastern. Honestly, I'm not surprised this happened; the quality of the classes is WILD.

I'm taking 2 classes per term, and each term so far, one class has been very well designed but also insanely easy, while the other has been so poorly implemented that the course learning materials don't actually help you do the coursework.

Probably the most astonishing thing so far, though, is that a course I'm taking now just served me the literally exact same assignment that I did for a course I just finished. Now, granted, both classes are from the elective course choices, so not everyone will take both, but come on... and they grill me about plagiarism with every submission I make...

[–] FinalRemix@lemmy.world 2 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

I reuse assignments between similar classes, because maybe those classes share a learning objective and that assignment is just gangbusters.

In cases where students take both (which we actively discourage because of the similarity of the courses), I have my team require the students to, for example, use a different person as their subject for the two assignments.

[–] Pheonixdown@lemm.ee 1 points 2 weeks ago

This assignment is literally a fill-in-the-blanks exercise: complete a set of code so that it produces the values expected by the assignment.

[–] dream_weasel@sh.itjust.works 14 points 2 weeks ago (2 children)

As long as the materials are accurate and serve as an effective teaching aid, where's the case?

It would be different if the sum total of the course materials were Wikipedia articles presented by a non-expert, but the professor IS an expert. Sure, anyone can use genAI, BUT not anyone can write a relevant, targeted prompt and check the accuracy of the output. This is, of course, assuming the professor is generating (or at least vetting) materials for accuracy.

IF it turns out the student can find a pattern of inaccurate content, there is a case. Otherwise there's nothing: it would be like arguing that a TA made the materials (or that the lecture materials came from a book written by SOMEONE ELSE, gasp) and the professor presented them, so the class is invalid.

[–] sugar_in_your_tea@sh.itjust.works 5 points 2 weeks ago* (last edited 2 weeks ago)

Exactly. Nobody should care how the professor generates materials for the class, they should only care that the materials are effective and accurate. That's the professor's job, and they should be free to use whatever tools they find helpful in producing effective, accurate materials.

Mistakes happen. I found a bunch of errors in my classes, and this was before AI was a thing. The information was accurate, but the presentation was poor.

[–] miridius@lemmy.world 2 points 2 weeks ago (1 children)

I think the key takeaway is that college is overrated, since you can easily find and create your own course materials on par with (or often better than) what the professors create.

[–] dream_weasel@sh.itjust.works 2 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Lol no. You absolutely cannot.

You can maybe make it look nicer, but your high school diploma and street cred does not an education make.

The neat thing about it is, if you think this way, it would be impossible to prove to you that you can't do it yourself just as well. Without DOING it, you just don't know how much you don't know compared to a university faculty member. There are people who can go to the library (or the internet) and Good Will Hunting their way to an education, but I can basically guarantee that neither you nor anyone you know or will ever know is one of them.
