You can't get GI through spicy autocorrect? 😱
Lotta you meatbags are awful confident in your own complexity.
Apparently not, given the content of this article.
Even if the model stops here - did you imagine it'd get this far?
Humans do all their civilization brouhaha on three pounds of wet meat powered by corn flakes. Most of which evolved for marginal improvements on "grab branch and pull" or "do not pet tiger." It's a cosmic accident that's given us language and music and dubstep. And this stupid trick with a pile of video cards can fake a lot of that, to the point we're worried the average human won't be able to spot the fakes.
Point being: the miraculous birth of a computer intellect may well arise from "the fact blender." Or "fancy Wikipedia." Or "twenty questions, hard mode." Or any other stupid gimmick that some grad students can cobble together after a 4 AM what-if. Calling this hot mess "spicy autocorrect" is accurate, and in some sense damning, but we had no fucking idea where it'd stop. Emergent properties are chaos. Approximate knowledge of conditions cannot predict approximate outcomes.
LLMs are still liable to figure out math. That's a process gigabytes of linear algebra can obviously do, and it would massively improve their ability to guess the next letter in a word problem. It won't be the kind of AI you can explain calculus to and expect to remember it next time - but getting any portion of the way there is deeply spooky.
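For anyone who wants the jab made literal, here's a minimal sketch of what "autocorrect" means in this context - next-word prediction from counts - using a toy invented corpus. An actual LLM swaps the counting for billions of learned parameters and a much longer context window, but the job it's trained on is the same: guess the next token.

```python
# Toy "spicy autocorrect": predict the next word from bigram counts.
# An LLM does (very loosely) the same job at vastly larger scale,
# with learned weights instead of raw counts. Corpus is invented.
import random
from collections import Counter, defaultdict

corpus = ("grab branch and pull . do not pet tiger . "
          "do not grab tiger .").split()

# Count how often each word follows each preceding word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(prev):
    """Sample a next word in proportion to how often it followed prev."""
    counts = follows[prev]
    return random.choices(list(counts), weights=list(counts.values()))[0]

# "Generate": start from a seed word and keep guessing the next one.
word, out = "do", ["do"]
for _ in range(6):
    word = next_word(word)
    out.append(word)
print(" ".join(out))  # e.g. "do not grab tiger . do not"
```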
Dude you're a poet
You can get adjusted gross income