this post was submitted on 30 Oct 2023
546 points (94.8% liked)
Technology
Some days it looks to be a three-way race between AI, climate change, and nuclear weapons proliferation to see who wipes out humanity first.
But on closer inspection, you see that humans are playing all three sides, and still we are losing.
One of those is not like the others. Nuclear weapons can wipe out humanity at any minute right now. Climate change has been starting the job of wiping out humanity for a while now. When and how is AI going to wipe out humanity?
This is not a criticism directed at you, by the way. It's just a frustration that I keep hearing about AI being a threat to humanity, and it sounds like a far-fetched idea. It almost seems like it's being used as a way to distract from much more critically pressing issues, like the myriad environmental issues we are already deep into, not just climate change. I wonder who would want to distract from those? Oil companies would definitely be number 1 on the list of suspects.
Agreed. This kind of debate is about as pointless as declaring self-driving cars are coming out in 5 years. The tech is way too far behind right now, and it's not useful to even talk about it until 50 years from now.
For fuck's sake, just because a chatbot can pretend it's sentient doesn't mean it actually is sentient.
Here. Here's the real lead. Google has been scared of open-source AI because they can't profit off of freely available tools. Now they want to change the narrative so that the government steps in and regulates their competition. Of course, their highly paid lobbyists will be right there to write plenty of loopholes and exceptions to make sure only the closed-source corpos come out on top.
Fear. Uncertainty. Doubt. Oldest fucking trick in the book.
With nuclear weapons and climate change.
Uh nice a crossover episode for the series finale.
Is this a crossover episode??!
The two things experts said shouldn't be done with AI, giving them open internet access and teaching them to code, have already been blithely ignored. It's just a matter of time.
I don't think the oil companies are behind these articles. That is very much a wheels within wheels type thinking that corporations don't generally invest in. It is easier to just deny climate change instead of getting everyone distracted by something else.
You're probably right, but I just wonder where all this AI panic is coming from. There was a story in the Washington Post a few weeks back saying that millions are being invested in university groups that are studying the risks of AI. It just seems that something is afoot that doesn't look like a natural reaction or overreaction. Perhaps this story itself explains it: the Big Tech companies are trying to tamp down competition from startups.
It is coming from a ratings- and click-based economy. Panic sells, so they sell panic. No one is going to click an article titled "everything mostly fine".
Because of dumb fucks using ChatGPT to do unethical and illegal shit, like fraudulently creating works that mimic a writer and claiming they're that writer's work to sell for cash, blatant copyright infringement, theft, cheating on homework and tests, all sorts of dumbassery.