858
Clever, clever (mander.xyz)
top 50 comments
sorted by: hot top controversial new old
[-] ITGuyLevi@programming.dev 6 points 57 minutes ago

Is it invisible to accessibility options as well? Like if I need a computer to tell me what the assignment is, will it tell me to do the thing that will make you think I cheated?

[-] Sauerkraut@discuss.tchncs.de 2 points 39 minutes ago

Disability accomodation requests are sent to the professor at the beginning of each semester so he would know which students use accessibility tools

[-] technocrit@lemmy.dbzer0.com 1 points 3 minutes ago

Ok but will those students also be deceived?

[-] technocrit@lemmy.dbzer0.com 1 points 5 minutes ago

Maybe if homework can be done by statistics, then it's not worth doing.

Maybe if a "teacher" has to trick their students in order to enforce pointless manual labor, then it's not worth doing.

Schools are not about education but about privilege, filtering, indoctrination, control, etc.

[-] HawlSera@lemm.ee 31 points 5 hours ago

I wish more teachers and academics would do this, because I"m seeing too many cases of "That one student I pegged as not so bright because my class is in the morning and they're a night person, has just turned in competent work. They've gotta be using ChatGPT, time to report them for plagurism. So glad that we expell more cheaters than ever!" and similar stories.

Even heard of a guy who proved he wasn't cheating, but was still reported anyway simply because the teacher didn't want to look "foolish" for making the accusation in the first place.

[-] Navarian@lemm.ee 51 points 7 hours ago

For those that didn't see the rest of this tweet, Frankie Hawkes is in fact a dog. A pretty cute dog, for what it's worth.

[-] Etterra@lemmy.world 31 points 7 hours ago

Ah yes, pollute the prompt. Nice. Reminds me of how artists are starting to embed data and metadata in their pieces that fuck up AI training data.

[-] lepinkainen@lemmy.world 12 points 2 hours ago

And all maps have fake streets in them so you can tell when someone copied it

[-] Sabre363@sh.itjust.works 62 points 12 hours ago

Easily by thwarted by simply proofreading your shit before you submit it

[-] abbadon420@lemm.ee 10 points 2 hours ago

But that's fine than. That shows that you at least know enough about the topic to realise that those topics should not belong there. Otherwise you could proofread and see nothing wrong with the references

[-] xantoxis@lemmy.world 53 points 8 hours ago* (last edited 8 hours ago)

Is it? If ChatGPT wrote your paper, why would citations of the work of Frankie Hawkes raise any red flags unless you happened to see this specific tweet? You'd just see ChatGPT filled in some research by someone you hadn't heard of. Whatever, turn it in. Proofreading anything you turn in is obviously a good idea, but it's not going to reveal that you fell into a trap here.

If you went so far as to learn who Frankie Hawkes is supposed to be, you'd probably find out he's irrelevant to this course of study and doesn't have any citeable works on the subject. But then, if you were doing that work, you aren't using ChatGPT in the first place. And that goes well beyond "proofreading".

[-] And009@reddthat.com 9 points 8 hours ago

This should be okay to do. Understanding and being able to process information is foundational

[-] psud@aussie.zone 5 points 5 hours ago* (last edited 5 hours ago)

LLMs can't cite. They don't know what a citation is other than a collection of text of a specific style

You'd be lucky if the number of references equalled the number of referenced items even if you were lucky enough to get real sources out of an LLM

If the student is clever enough to remove the trap reference, the fact that the other references won't be in the University library should be enough to sink the paper

[-] uis@lemm.ee 1 points 1 hour ago* (last edited 1 hour ago)

LLMs can't cite. They don't know what a citation is other than a collection of text of a specific style

LLMs can cite. It's called Retrival-Augmented Generation. Basically LLM that can do Information Retrival, which is just academic term for search engines.

You'd be lucky if the number of references equalled the number of referenced items even if you were lucky enough to get real sources out of an LLM

You can just print retrival logs into references. Well, kinda stretching definition of "just".

[-] auzy@lemmy.world 9 points 4 hours ago* (last edited 4 hours ago)

They can. There was that court case where the cases cited were made up by chatgpt. Upon investigation it was discovered it was all hallucinated by chatgpt and the lawyer got into deep crap

[-] yamanii@lemmy.world 57 points 12 hours ago

There are professional cheaters and there are lazy ones, this is gonna get the lazy ones.

[-] MalditoBarbudo@programming.dev 21 points 7 hours ago

I wouldn't call "professional cheaters" to the students that carefully proofread the output. People using chatgpt and proofreading content and bibliography later are using it as a tool, like any other (Wikipedia, related papers...), so they are not cheating. This hack is intended for the real cheaters, the ones that feed chatgpt with the assignment and return whatever hallucination it gives to you without checking anything else.

load more comments
view more: next ›
this post was submitted on 26 Oct 2024
858 points (97.8% liked)

Science Memes

10783 readers
3124 users here now

Welcome to c/science_memes @ Mander.xyz!

A place for majestic STEMLORD peacocking, as well as memes about the realities of working in a lab.



Rules

  1. Don't throw mud. Behave like an intellectual and remember the human.
  2. Keep it rooted (on topic).
  3. No spam.
  4. Infographics welcome, get schooled.


Research Committee

Other Mander Communities

Science and Research

Biology and Life Sciences

Physical Sciences

Humanities and Social Sciences

Practical and Applied Sciences

Memes

Miscellaneous

founded 2 years ago
MODERATORS