submitted 2 weeks ago* (last edited 2 weeks ago) by Timely_Jellyfish_2077@programming.dev to c/chatgpt@lemmy.world

Small rant: basically, the title. If it said "I don't know" instead of producing an answer to every question, it would be trustworthy.

[-] kromem@lemmy.world 1 point 2 weeks ago

The problem is that they are just as prone to making up reasons why they are correct.

There are various techniques to try to identify and correct hallucinations, but they all increase the cost and none is a silver bullet.
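One such technique is self-consistency: sample the model several times and only keep an answer when enough samples agree, otherwise abstain. A minimal sketch below, where `ask_model` is a hypothetical stand-in for a real LLM API call (the canned answers just simulate sampled outputs):

```python
from collections import Counter

def ask_model(question: str, seed: int) -> str:
    # Hypothetical stand-in for an actual LLM API call that returns
    # one sampled answer per call; real usage would hit a model endpoint.
    canned = ["Paris", "Paris", "Paris", "Lyon", "Paris"]
    return canned[seed % len(canned)]

def self_consistent_answer(question: str, n_samples: int = 5,
                           threshold: float = 0.6):
    # Sample the model n_samples times and keep the majority answer
    # only if enough samples agree; otherwise abstain (return None),
    # i.e. the model effectively says "I don't know".
    answers = [ask_model(question, seed=i) for i in range(n_samples)]
    best, count = Counter(answers).most_common(1)[0]
    if count / n_samples >= threshold:
        return best
    return None

print(self_consistent_answer("What is the capital of France?"))
```

Note the cost trade-off: every user query now costs `n_samples` model calls, which is exactly why none of these mitigations comes for free.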

But the rate at which hallucinations occur decreased with the last jump in pretrained models, and will likely decrease further with the next jump too.

this post was submitted on 29 Jun 2024
130 points (91.1% liked)
