[-] Virkkunen@fedia.io 178 points 5 months ago

Don't worry folks, if we all stop using plastic straws and take 30 second showers, we'll be able to offset 5% of the carbon emissions this AI has!

[-] daniskarma@lemmy.dbzer0.com 33 points 5 months ago* (last edited 5 months ago)

Google's GHG emissions in 2023 were 14.3 million metric tons, which is a tiny fraction of global emissions.

Commercial aviation emits roughly 935 million metric tons per year.

So IDK about plastic straws or Google. But if people really stopped flying around so much, that would actually make a dent in global emissions.

Don't get me wrong, Google is a piece of shit. But they are not the ones causing climate change, and neither is AI technology. Planes, cars, the meat industry, offshore production... those are some of the truly big culprits.
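
To put those numbers side by side (the global total below is my own ballpark, roughly 37,000 Mt of energy-related CO2 per year, not a figure from this thread):

```python
# Rough scale comparison. Google's and aviation's figures are from the
# comment above; the global total is an approximate recent estimate.
google_mt = 14.3       # Google's reported 2023 GHG emissions, Mt CO2e
aviation_mt = 935.0    # commercial aviation CO2, Mt per year
global_mt = 37_000.0   # approx. global energy-related CO2, Mt/year (assumption)

google_share = google_mt / global_mt * 100
aviation_share = aviation_mt / global_mt * 100

print(f"Google:   {google_share:.3f}% of global emissions")
print(f"Aviation: {aviation_share:.1f}% of global emissions")
```

Aviation comes out at roughly 65x Google's footprint, which is the commenter's point.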

[-] masquenox@lemmy.world 33 points 5 months ago

But they are not the ones causing climate change

The owners of google are capitalists. They are as responsible for climate change as any other capitalist.

[-] umbrella@lemmy.ml 7 points 5 months ago

I can't afford to ride airplanes. You're welcome.

[-] mctoasterson@reddthat.com 79 points 5 months ago

The annoying part is how many mainstream tech companies have ham-fisted AI into every crevice of every product. It isn't necessary and I'm not convinced it results in a "better search result" for 90% of the crap people throw into Google. Basic indexed searches are fine for most use cases.

[-] AlecSadler@sh.itjust.works 16 points 5 months ago

As a buzzword or whatever, this is leagues worse than "agile", which I already loathed the overuse and forced integration of.

[-] xthexder@l.sw0.com 7 points 5 months ago

Before AI it was IoT. Nobody asked for an Internet connected toaster or fridge...

[-] Raxiel@lemmy.world 72 points 5 months ago

If only Google had a working search engine before AI

[-] Ragnarok314159@sopuli.xyz 50 points 5 months ago

Yes, but now we can get much worse results and three pages of ads for ten times the energy cost. Capitalism at its finest.

[-] set_secret@lemmy.world 59 points 5 months ago

And yet it's still garbage... like their search.

[-] stebo02@lemmy.dbzer0.com 14 points 5 months ago

With adblock enabled, I feel like their results are often better than, for example, DuckDuckGo's. I recently switched to DDG as my default search engine, but I regularly find myself going back to Google to get the results I'm looking for.

[-] Ledivin@lemmy.world 7 points 5 months ago

Interesting, I'm actually the exact opposite. I always start with Google, because it's usually good enough, but whenever it takes 2-3 tries to get something relevant, I switch to ddg and get it first try.

[-] stebo02@lemmy.dbzer0.com 9 points 5 months ago* (last edited 5 months ago)

My issue is mostly with image search results. DDG's images tend to be less relevant than Google's. DDG also lacks "smart" results (idk the official term).

For example when you search "rng 25" on Google, it will immediately present you with a random number between 1 and 25. On DDG you have to click on one of the search results and then use some website to generate the number.

Or when searching for the results of a soccer game, Google will immediately present all the stats to you, while on DDG you will only find some articles about it.

Of course it really depends on the kind of search and I'm sure DDG will regularly have better results than Google too.
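
Those "smart results" (DuckDuckGo calls theirs Instant Answers; Google's are rich results) are essentially query-pattern handlers that short-circuit the normal index lookup. A toy sketch of the idea, with a made-up handler for the "rng N" pattern described above:

```python
import random
import re

# Toy "instant answer" dispatcher: pattern handlers run before the
# regular index search. The "rng N" handler mirrors the example above.
def instant_answer(query: str):
    m = re.fullmatch(r"rng (\d+)", query.strip().lower())
    if m:
        n = int(m.group(1))
        return f"Random number 1-{n}: {random.randint(1, n)}"
    return None  # no handler matched -> fall through to regular search

print(instant_answer("rng 25"))
print(instant_answer("eiffel tower height"))  # None -> regular search
```

The real engines have hundreds of these handlers (calculators, sports scores, unit conversion), which is exactly the gap the comment describes.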

[-] lone_faerie@lemmy.blahaj.zone 44 points 5 months ago

AI is just what crypto bros moved onto after people realized that was a scam. It's immature technology that uses absurd amounts of energy for a solution in search of a problem, being pushed as the future, all for the prospect of making more money. Except this time it's being backed by major corporations because it means fewer employees they have to pay.

[-] pycorax@lemmy.world 10 points 5 months ago

There are legitimate uses of AI in certain fields like medical research and 3D reconstruction that aren't just a scam. However, most of these are not consumer facing and the average person won't really hear about them.

It's unfortunate that what you said is very true on the consumer side of things...

[-] darkevilmac@lemmy.zip 43 points 5 months ago

I skimmed the article, but it seems to be assuming that Google's LLM is using the same architecture as everyone else. I'm pretty sure Google uses their TPU chips instead of a regular GPU like everyone else. Those are generally pretty energy efficient.

That and they don't seem to be considering how much data is just being cached for questions that are the same. And a lot of Google searches are going to be identical just because of the search suggestions funneling people into the same form of a question.
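
The caching point is worth dwelling on: if the suggestion box funnels thousands of users into the exact same query string, the expensive generation only has to run once. A toy sketch of that argument (illustrative names, not Google's actual infrastructure):

```python
from functools import lru_cache

# Count how often the "expensive" generation actually runs.
calls = {"expensive": 0}

@lru_cache(maxsize=1024)
def answer(query: str) -> str:
    calls["expensive"] += 1  # only incremented on a cache miss
    return f"generated answer for {query!r}"

# Three users type queries that the suggestion box normalized to one form:
for _ in range(3):
    answer("how tall is the eiffel tower")

print(calls["expensive"])  # the model only ran once
```

The per-query energy cost in the article implicitly assumes every query is a live one, which is the assumption this comment is questioning.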

[-] kromem@lemmy.world 16 points 5 months ago

Exactly. The difference between a cached response and a live one, even for non-AI queries, is an order-of-magnitude difference.

At this point, a lot of people just care about the 'feel' of anti-AI articles even if the substance is BS though.

And then people just feed whatever gets clicks and shares.

[-] AlecSadler@sh.itjust.works 12 points 5 months ago

I hadn't really heard of the TPU chips until a couple weeks ago when my boss told me about how he uses USB versions for at-home ML processing of his closed network camera feeds. At first I thought he was using NVIDIA GPUs in some sort of desktop unit and just burning energy...but I looked the USB things up and they're wildly efficient and he says they work just fine for his applications. I was impressed.

[-] darkevilmac@lemmy.zip 8 points 5 months ago

Yeah they're pretty impressive for some at home stuff and they're not even that costly.

[-] dan@upvote.au 8 points 5 months ago

The Coral is fantastic for use cases that don't need large models. Object recognition for security cameras (using Blue Iris or Frigate) is a common use case, but you can also do things like object tracking (track where individual objects move in a video), pose estimation, keyphrase detection, sound classification, and more.

It runs Tensorflow Lite, so you can also build your own models.

Pretty good for a $25 device!

[-] dan@upvote.au 6 points 5 months ago* (last edited 5 months ago)

I'm pretty sure Google uses their TPU chips

The Coral ones? They don't have nearly enough RAM to handle LLMs - they only have 8MB RAM and only support small Tensorflow Lite models.

Google might have some custom-made non-public chips though - a lot of the big tech companies are working on that.

instead of a regular GPU

I wouldn't call them regular GPUs... AI use cases often use products like the Nvidia H100, which are specifically designed for AI. They don't have any video output ports.

[-] jj4211@lemmy.world 33 points 5 months ago

The confounding part is that when I do get offered an "AI result", it's basically identical to the excerpt in the top "traditional search" result. It wastes a fair amount more time and energy just to repeat what the top of the search said anyway. I've never seen the AI overview be more useful than the top snippet.

[-] Facebones@reddthat.com 32 points 5 months ago* (last edited 5 months ago)

It's not even hidden; people just give zero fucks about how their magical rectangle works and get mad if you try to tell them.

[-] blackwateropeth@lemmy.world 31 points 5 months ago

And it’s only 10x more useless :)

[-] PanArab@lemm.ee 28 points 5 months ago

The results used to be better too. AI just produces junk faster.

[-] ArchRecord@lemm.ee 25 points 5 months ago

If only they did what DuckDuckGo did: make it pop up only in very specific circumstances, draw primarily from current summarized Wikipedia information in addition to its existing context, and let the user turn it off completely with one click of a settings toggle.

I find it useful in DuckDuckGo because it's out of the way, unobtrusive, and only pops up when necessary. I've tried using Google with its search AI enabled, and it was the most unusable search engine I've used in years.

[-] jfx@discuss.tchncs.de 11 points 5 months ago

DDG has also gotten much worse since the introduction of AI features.

[-] KillingTimeItself@lemmy.dbzer0.com 23 points 5 months ago

What's up with these shit-ass titles? It's not even REMOTELY hidden; it takes two fucking seconds of googling to figure this shit out.

The entire AI industry is dependent on GPU hardware manufacturers, and Nvidia is STILL back-ordered (to my knowledge).

This is like saying that crypto has a hidden energy cost.

[-] Halcyon@discuss.tchncs.de 11 points 5 months ago* (last edited 5 months ago)

It's hidden in the sense that the normal user doesn't see the true cost on their energy bill. You perform a search and get the result in milliseconds, which makes it easy to get the false impression that it's just a minor operation. It's not like driving a car, watching the fuel gauge, and seeing the consumption.

Of course one can research how much energy Google consumes and find out the background – IF you're interested. But most people just use tech and don't question or even understand it.

[-] just_another_person@lemmy.world 18 points 5 months ago* (last edited 5 months ago)

To be fair, it was never "hidden", since all of the top 5 decided that GPUs were the way to go to monetize this.

Guess who is waiting on the other side of this idiocy with a solution? AMD, with cheap FPGAs that will do all this work at 10x the speed and a similar energy reduction, at a fraction of the cost and hassle for cloud providers.

[-] repungnant_canary@lemmy.world 14 points 5 months ago

I'm genuinely curious where their penny-pinching went. All these tech companies shove ads down our throats and steal our privacy, justifying it by saying they operate at a loss and need to increase income. But suddenly they can afford to spend huge amounts on some shit that won't bring them any more income. How do they justify that?

[-] conciselyverbose@sh.itjust.works 7 points 5 months ago* (last edited 5 months ago)

It's another untapped market they can monopolize. (Or just run at a loss because investors are happy with another imaginary pot of gold at the end of another rainbow.)

[-] homesweethomeMrL@lemmy.world 7 points 5 months ago

Wow AI is just so amazing

[-] afraid_of_zombies@lemmy.world 7 points 5 months ago

This is terrible. Why don't we build nuclear power plants, roll out a carbon tax, and create incentives for companies to generate their own energy via renewables?

You know the shit that we should have been doing before I was born.

[-] JohnDClay@sh.itjust.works 6 points 5 months ago

I'm surprised it's only 10x. Running a prompt through an LLM takes quite a bit of energy, so I guess even regular searches take more energy than I thought.
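
For reference, the commonly cited ballpark figures behind the 10x claim are roughly 0.3 Wh for a traditional search and about 3 Wh for an LLM query. These are estimates, not measurements from the article:

```python
# Back-of-envelope check of the 10x claim. Both per-query figures are
# widely cited ballpark estimates, not measurements from the article.
search_wh = 0.3   # estimated energy per traditional search, Wh
llm_wh = 3.0      # estimated energy per LLM-assisted query, Wh

ratio = llm_wh / search_wh
print(f"LLM query uses ~{ratio:.0f}x the energy of a plain search")

# What that means over a year of heavy personal use:
queries_per_day = 50
extra_kwh_per_year = (llm_wh - search_wh) * queries_per_day * 365 / 1000
print(f"~{extra_kwh_per_year:.1f} kWh/year extra at {queries_per_day} queries/day")
```

Per user that's small next to, say, a fridge; the article's concern is the aggregate across billions of queries.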

this post was submitted on 06 Jul 2024
1024 points (97.3% liked)
