Don’t they reflect how you talk to them? I.e., my ChatGPT doesn’t have a sense of humor and isn’t sarcastic or sad. It only uses formal language and doesn’t use emojis. It just gives me ideas that I test by trial and error.
While this is pretty hilarious, LLMs don't actually "know" anything in the usual sense of the word. An LLM, or Large Language Model, is basically a system that maps words to other words so that a computer can model language. All an LLM "knows" is that when it sees "I love", what probably comes next is "my mom", "my dad", etc. Because of this behavior, and because we can train them on the massive swath of people asking questions and getting answers on the internet, LLMs end up mostly okay at "answering" a question almost by chance: really they are just picking the next most likely word over and over, based on their training data, which usually turns out reasonably accurate.
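The "next most likely word" idea can be sketched with a toy bigram model. This is a drastic simplification (real LLMs use neural networks over subword tokens, not raw word-pair counts), and the tiny corpus here is made up for illustration:

```python
# Toy next-word predictor built from bigram counts.
# Illustration only -- real LLMs learn far richer statistics than this.
from collections import Counter, defaultdict

# Hypothetical training text, tokenized by whitespace.
corpus = "i love my mom . i love my dad . i love pizza .".split()

# Count which word follows each word in the training text.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def most_likely_next(word):
    """Return the most frequently observed word after `word`."""
    return following[word].most_common(1)[0][0]

print(most_likely_next("love"))  # "my" follows "love" more often than "pizza"
```

Chaining `most_likely_next` on its own output is, in spirit, how generation works: pick the likeliest continuation, append it, repeat.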
I wouldn't be surprised if that's true outside the US as well. People who actually (have to) work with the stuff usually learn quickly that it's only good at a few things, but if you just hear about it in the (pop, non-techie) media (including YouTube and the like), you might be deceived into thinking Skynet is just a few years away.
I don't think a single human exists who knows as much as ChatGPT does. Does that mean ChatGPT is smarter than everyone? No, obviously not, based on what we've seen so far. But the amount of information available to these LLMs is incredible and can be very useful. A library contains a lot of useful information too, but it isn't intelligent itself.
Intelligence and knowledge are two different things. Or, rather, the difference between smart and stupid people is how they interpret the knowledge they acquire. Both can acquire knowledge, but stupid people come to wrong conclusions by misinterpreting the knowledge. Like LLMs, 40% of the time, apparently.
There are a lot of ignorant people out there, so yeah, technically an LLM is smarter than most people.
AI is essentially the human superid. No single person could ever be more knowledgeable. Being intelligent is a different matter.
Just a thought: perhaps instead of considering the mental and educational state of people who have no power to significantly change that state, we should focus on the people who do have power.
For example, why don't LLM providers explicitly and loudly state, or require acknowledgement, that their products are just imitating human thought and make significant mistakes regularly, and therefore should be used with plenty of caution?
It's a rhetorical question; we know why, and I think we should focus on that, not on its effects. It's also much cheaper and easier to do than refilling years of quality education into individuals' heads.
What a very unfortunate name for a university.
Aside from the unfortunate name of the university, I think that part of why LLMs may be perceived as smart or 'smarter' is because they are very articulate and, unless prompted otherwise, use proper spelling and grammar, and tend to structure their sentences logically.
Which 'smart' humans may not do, out of haste or contextual adaptation.
I wasn't sure from the title if it was "Nearly half of U.S. adults believe LLMs are smarter than [the US adults] are." or "Nearly half of U.S. adults believe LLMs are smarter than [the LLMs actually] are." It's the former, although you could probably argue the latter is true too.
Either way, I'm not surprised that people rate LLMs' intelligence highly. They obviously have limited scope in what they can do, and hallucinating false info is a serious issue, but you can ask them a lot of questions that your typical person couldn't answer and get a decent answer. I feel like they're generally good at meeting people's expectations of a "smart person", even if they have major shortcomings in other areas.
The funny thing about this scenario is by simply thinking that’s true, it actually becomes true.