we are safe
(discuss.tchncs.de)
Post funny things about programming here! (Or just rant about your favourite programming language.)
Did we really regress back from that?
I mean, giving a confidence score for recognizing a certain object in a picture is relatively straightforward.
But LLMs put words together by how likely they are to belong together given your input (terribly oversimplified). The confidence behind that has no direct relation to how likely the statements are to be true. I remember an example where someone made ChatGPT say that 2+2 equals 5 because his wife said so. So ChatGPT was confident that something is right when the wife says it, simply because it thinks those words belong together.
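To make the distinction concrete: a language model's "confidence" in a continuation is just a softmax over scores for candidate next tokens. A minimal sketch, with made-up logits for a hypothetical "2 + 2 =" prompt (the numbers and candidate tokens are invented for illustration, not from any real model):

```python
import math

def softmax(logits):
    # Turn raw scores into a probability distribution
    # (subtract the max first for numerical stability).
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token logits after the prompt "2 + 2 =".
candidates = ["4", "5", "fish"]
logits = [6.0, 2.0, -1.0]

for token, p in zip(candidates, softmax(logits)):
    print(f"{token!r}: {p:.3f}")
```

The distribution measures how plausible each continuation looks given the training data and the prompt, not whether the resulting statement is true; a prompt insisting "my wife says it's 5" simply shifts which continuation looks plausible.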
ChatGPT, what is the Gödel number for the proof of 2+2=5?
Of course I don't know enough about the actual proof for it to be anything but a joke, but there are infinitely many numbers, so there should be infinitely many proofs.
There are also meme proofs out there that I assume could be given a Gödel number easily enough.
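For what it's worth, assigning a Gödel number to a string really is mechanical. A toy sketch of the classic prime-exponent scheme (the symbol codes here are an arbitrary choice for illustration, not Gödel's original numbering, and this encodes the *formula* "2+2=5", not a proof of it — the joke stands, since no valid proof exists to encode):

```python
def primes(n):
    # First n primes by trial division (fine for short formulas).
    found = []
    candidate = 2
    while len(found) < n:
        if all(candidate % p for p in found):
            found.append(candidate)
        candidate += 1
    return found

# Toy symbol codes — any fixed injective assignment works.
CODES = {"2": 2, "+": 3, "=": 4, "5": 5}

def godel_number(formula):
    # Encode the i-th symbol (with code c) as the i-th prime raised to c,
    # then multiply; unique factorization recovers the formula exactly.
    n = 1
    for p, sym in zip(primes(len(formula)), formula):
        n *= p ** CODES[sym]
    return n

print(godel_number("2+2=5"))
```

Meme proofs are just finite symbol strings too, so under a scheme like this each one gets a number — the number just doesn't certify anything.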