this post was submitted on 20 Jul 2023
0 points

Technology

165 readers
1 user here now

This magazine is dedicated to discussions on the latest developments, trends, and innovations in the world of technology. Whether you are a tech enthusiast, a developer, or simply curious about the latest gadgets and software, this is the place for you. Here you can share your knowledge, ask questions, and engage in discussions on topics such as artificial intelligence, robotics, cloud computing, cybersecurity, and more. From the impact of technology on society to the ethical considerations of new technologies, this category covers a wide range of topics related to technology. Join the conversation and let's explore the ever-evolving world of technology together!

founded 2 years ago
 

"The chatbot gave wildly different answers to the same math problem, with one version of ChatGPT even refusing to show how it came to its conclusion."

It's getting worse. And because it's a black box model, they don't know why. The computer science professor here likens it to how human students make mistakes... but human students make mistakes because they don't have perfect recall, mishear things told to them, or are tired and not paying attention... a bunch of reasons that basically come down to having a human body that needs food, rest, and water. Things a computer does not have.

The only reason ChatGPT should be getting math wrong is that it's getting inputs that are wrong, but without visibility into the model they can't figure out where it's going wrong or who told it the wrong info.
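For what it's worth, one mundane source of "wildly different answers to the same math problem" is that these models sample their output tokens with a temperature parameter rather than always picking the most likely token. A minimal, self-contained sketch of temperature sampling (toy logits, not anything from OpenAI's actual system):

```python
import math
import random

def sample_with_temperature(logits, temperature, rng):
    """Pick a token index from raw scores (logits).

    temperature <= 0 means greedy decoding (always the top token);
    higher temperatures flatten the distribution, so repeated calls
    on the SAME input can return different tokens.
    """
    if temperature <= 0:
        # Greedy: deterministic, always the highest-scoring token.
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [score / temperature for score in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1

logits = [2.0, 1.5, 0.5]  # toy scores for three candidate tokens
rng = random.Random(0)
greedy = [sample_with_temperature(logits, 0.0, rng) for _ in range(5)]
sampled = [sample_with_temperature(logits, 1.5, rng) for _ in range(5)]
print(greedy)   # always [0, 0, 0, 0, 0]
print(sampled)  # a mix of token indices
```

Same input, different output, and no "wrong info" involved: the randomness is by design. (It doesn't explain a genuine drop in accuracy over time, just run-to-run variation.)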

top 14 comments
[–] FermatsLastAccount@kbin.social 1 points 2 years ago* (last edited 2 years ago) (2 children)

It's almost certainly because OpenAI is throwing less computing power at it in order to decrease the cost.
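Pure speculation on the mechanism, but a standard way to serve a model more cheaply is weight quantization: storing each weight in fewer bits at the cost of some precision. A toy sketch of symmetric int8 quantization (illustrative only, not OpenAI's actual method):

```python
def quantize_int8(weights):
    """Symmetric int8 quantization: one shared float scale plus one
    signed byte per weight, instead of four bytes per float32 weight
    (roughly 4x less memory, hence cheaper serving)."""
    scale = max(abs(w) for w in weights) / 127 or 1.0
    quantized = [round(w / scale) for w in weights]
    return quantized, scale

def dequantize(quantized, scale):
    """Recover approximate weights; rounding error is at most scale/2."""
    return [v * scale for v in quantized]

weights = [0.12, -0.5, 0.33, 0.02]      # toy float32 weights
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
error = max(abs(a - b) for a, b in zip(weights, restored))
print(q)
print(round(error, 5))
```

The rounding error is tiny per weight, but across billions of weights it can plausibly nudge accuracy down, which would fit the "cheaper but dumber" pattern people are reporting.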

[–] argv_minus_one@beehaw.org 0 points 2 years ago (1 children)

It's enshittification, then.

[–] BraveSirZaphod@kbin.social 0 points 2 years ago (1 children)

I mean, they've got to be blowing absurd amounts of money on it. It's not remotely cheap to run a massively complicated web service at that scale, and eventually the numbers need to start adding up. I'm sure they have several monetization plans, but not every instance of a business attempting to stop hemorrhaging money is a conspiracy. You'd be doing the exact same thing in their shoes.

[–] Ragnell@kbin.social 1 points 2 years ago* (last edited 2 years ago)

Enshittification is not a conspiracy because a conspiracy requires communication and planning. Enshittification is just how idiots act when trying to make money.

[–] CarrieForle@kbin.social 0 points 2 years ago (1 children)

And there are more and more offline GPT-style AIs available for free. Now everyone with an above-average computer can have their own ChatGPT.

[–] BarbecueCowboy@kbin.social 1 points 2 years ago* (last edited 2 years ago)

It's still pretty rough to self-host an LLM. You can get one that's kind of okay on an average computer, but to get a really competitive one running locally at a good speed, you need an amount of RAM (or VRAM, for GPU-based setups) that's still beyond most average users.

I've been trying to get Vicuna going and the RAM usage is rough: 60 GB is suggested, I've got 64, and honestly I think I need a lot more.
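A rough back-of-the-envelope for why the RAM requirements are so brutal: weight memory is roughly parameter count times bytes per weight, plus overhead for activations and the KV cache. A hedged sketch (the 20% overhead factor is a guess, and real usage varies with context length and runtime):

```python
def estimate_model_ram_gb(n_params_billion, bits_per_weight, overhead=1.2):
    """Rule-of-thumb RAM estimate for running an LLM locally:
    parameters x bytes-per-weight, plus ~20% overhead (assumed)
    for activations and the KV cache."""
    bytes_per_weight = bits_per_weight / 8
    total_bytes = n_params_billion * 1e9 * bytes_per_weight * overhead
    return total_bytes / 2**30  # bytes -> GiB

# A 13-billion-parameter model (roughly Vicuna-13B's size):
for bits in (16, 8, 4):
    print(f"{bits:>2}-bit: ~{estimate_model_ram_gb(13, bits):.0f} GB")
```

This is why quantized 4-bit builds are what make these models fit on ordinary hardware at all; at full 16-bit precision even a mid-size model blows past most desktops.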

[–] Frog-Brawler@kbin.social 0 points 2 years ago (1 children)

Huh… so after months of being exposed to people who aren’t quite as smart as world-class computer scientists and engineers, it gets dumber. Maybe it’s more human than I previously thought.

[–] paper_clip@kbin.social 1 points 2 years ago* (last edited 2 years ago)

> it gets dumber

In six months, ChatGPT will be talking up Brawndo, because it's got the electrolytes that plants crave.

[–] WeDoTheWeirdStuff@kbin.social 0 points 2 years ago (1 children)

You think that’s bad? My calculator can’t even finish a simple sentence.

[–] somniumx@kbin.social 0 points 2 years ago (1 children)
[–] effingjoe@kbin.social 0 points 2 years ago (1 children)

It's not, but Bob bobs is.

[–] palordrolap@kbin.social 0 points 2 years ago (1 children)

A single word can be a full sentence, unless answers to either/or questions are not sentences.

Or is this one of those logic things where a train is only a train when the railway engine is connected to something?

[–] effingjoe@kbin.social 0 points 2 years ago (1 children)

A sentence needs a subject and a verb, if I remember grade school. Fun fact: "I'm." is a sentence. There can be an implied "You" in there. Like "[You] Stop!" or "[You] Go!"

[–] palordrolap@kbin.social 1 points 2 years ago* (last edited 2 years ago)

The verb can be implied too. "Would you like mashed potatoes or fries?"

"[I would like] Fries."

There's also the joke sentence(?): "This sentence(,) no verb."