[-] Nalivai@lemmy.world 1 points 5 hours ago

Yeah, the scary thing about LLMs is that by their very nature they sound convincing, and it's very easy to fall into a trap: we as humans are hardwired to mistake the ability to talk smoothly for intelligence, so when computers started speaking in complete sentences and holding the immediate context of a conversation, we immediately decided we had a thinking machine and started believing it.
The worst thing is, there are legit uses for all the machine learning stuff, and LLMs in particular, so we can't just throw it all out the window. We will have to collectively adapt to this very convincing randomness machine that is now just here all the time.

[-] Nalivai@lemmy.world 2 points 11 hours ago* (last edited 11 hours ago)

As someone with degrees and decades of experience, I urge you not to use it for that. It's a cleverly disguised randomness machine; it will give you incorrect information that is indistinguishable from truth, because "truth" is never a criterion it can use, but "being convincing" is. It will seed those untruths in you, and unlearning the bad practices you picked up at the beginning might take years and cost you a career. And since you're just starting, you have no way to tell bullshit from truth as long as the final result seems to work, and that's the worst way for the bullshit to hide from you.
The field is already very accessible to everyone who wants to learn it. The number of guides, examples, teaching courses, and very useful YouTube videos with thick Indian accents is already enormous, and most of them at least try to self-correct, while an LLM actively doesn't; in fact, it tries to do the opposite.
Best case scenario, you're learning inefficiently; worst case scenario, you aren't learning at all.

[-] Nalivai@lemmy.world 5 points 15 hours ago

Yeah, because they need to convince people in the middle to vote for them, and people in the middle are stupid and racist.

[-] Nalivai@lemmy.world 11 points 15 hours ago

Don't overestimate LLMs: they can't code and never will be able to. They can create templates convincingly enough and do boilerplate parts that are nonsense only sometimes, but those aren't the fun parts of the coding process anyway. In my experience, an LLM doesn't help at all, and I spend more time fixing its nonsense than I would if I didn't use it at all, so I don't.

[-] Nalivai@lemmy.world 1 points 23 hours ago

Oh yeah, he cared about democracy very much, just not in the way we would have thought.

[-] Nalivai@lemmy.world 1 points 23 hours ago

The idea of work under the Soviet regime so beloved by tankies, however, is "do useless and inefficient work that you didn't choose and have no say in, or be thrown in jail," which hits way different.

[-] Nalivai@lemmy.world 2 points 1 day ago

And for the crime of being sent to their deaths, they should be punished by not having the newest drivers. That'll show them.

[-] Nalivai@lemmy.world 14 points 1 day ago

But social media don't have to burn tar. They choose to because that way they can get more money, but it's not an inherent part of the system; it's an exploitation of it for profit, and it can be separated out.

[-] Nalivai@lemmy.world 5 points 2 days ago

I bike and rock climb, I take long walks, and overall I'm in good shape: not great, not terrible. When doctors see my BMI without other metrics, they immediately tell me to lose weight and don't take anything else seriously. I missed a very serious illness because of that; every symptom I had was thrown onto a pile of "your BMI is bad, lose weight," until one doctor was smart enough to actually check on me.
BMI is incredibly oversimplified and gives lazy or overworked doctors an easy way out of doing their jobs, which kills people.

[-] Nalivai@lemmy.world 7 points 2 days ago

But by now he has enough money to fail upwards with zero skills

[-] Nalivai@lemmy.world 13 points 2 days ago

corn
order corn
google corn search
