[-] Virkkunen@fedia.io 179 points 1 year ago

Don't worry folks, if we all stop using plastic straws and take 30-second showers, we'll be able to offset 5% of this AI's carbon emissions!

[-] daniskarma@lemmy.dbzer0.com 33 points 1 year ago* (last edited 1 year ago)

Google's GHG emissions in 2023 were 14.3 million metric tons, which is a negligibly small share of global emissions.

Commercial aviation emits roughly 935 million metric tons per year.

So IDK about plastic straws or Google. But if people really stopped flying around so much, that would actually make a dent in global emissions.

Don't get me wrong, google is a piece of shit. But they are not the ones causing climate change, neither is AI technology. Planes, cars, meat industry, offshore production... Those are some of the truly big culprits.
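For scale, here's the arithmetic implied by the figures quoted above (14.3 Mt for Google in 2023, ~935 Mt/year for commercial aviation; both are taken from this comment, not independently verified):

```python
# Rough comparison of annual greenhouse-gas emissions, in million metric tons.
# Both figures are the ones quoted above and should be treated as approximate.
google_mt = 14.3      # Google's reported 2023 GHG emissions
aviation_mt = 935.0   # commercial aviation, per year

share = google_mt / aviation_mt * 100
print(f"Google's emissions are about {share:.1f}% of commercial aviation's")
```

That works out to roughly 1.5%, which is the commenter's point about relative scale.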

[-] masquenox@lemmy.world 33 points 1 year ago

But they are not the ones causing climate change

The owners of google are capitalists. They are as responsible for climate change as any other capitalist.

[-] mctoasterson@reddthat.com 79 points 1 year ago

The annoying part is how many mainstream tech companies have ham-fistedly crammed AI into every crevice of every product. It isn't necessary, and I'm not convinced it results in a "better search result" for 90% of the crap people throw into Google. Basic indexed searches are fine for most use cases.

[-] AlecSadler@sh.itjust.works 16 points 1 year ago

As a buzzword or whatever, this is leagues worse than "agile", whose overuse/integration I already loathed.

[-] xthexder@l.sw0.com 7 points 1 year ago

Before AI it was IoT. Nobody asked for an Internet connected toaster or fridge...

[-] Raxiel@lemmy.world 72 points 1 year ago

If only Google had a working search engine before AI

[-] Ragnarok314159@sopuli.xyz 50 points 1 year ago

Yes, but now we can get much worse results and three pages of ads for ten times the energy cost. Capitalism at its finest.

[-] set_secret@lemmy.world 59 points 1 year ago

And yet it's still garbage....like their search

[-] lone_faerie@lemmy.blahaj.zone 44 points 1 year ago

AI is just what crypto bros moved onto after people realized that was a scam. It's immature technology that uses absurd amounts of energy for a solution in search of a problem, being pushed as the future, all for the prospect of making more money. Except this time it's being backed by major corporations because it means fewer employees they have to pay.

[-] pycorax@lemmy.world 10 points 1 year ago

There are legitimate uses of AI in certain fields like medical research and 3D reconstruction that aren't just a scam. However, most of these are not consumer facing and the average person won't really hear about them.

It's unfortunate that what you said is very true on the consumer side of things...

[-] darkevilmac@lemmy.zip 43 points 1 year ago

I skimmed the article, but it seems to be assuming that Google's LLM is using the same architecture as everyone else. I'm pretty sure Google uses their TPU chips instead of a regular GPU like everyone else. Those are generally pretty energy efficient.

That and they don't seem to be considering how much data is just being cached for questions that are the same. And a lot of Google searches are going to be identical just because of the search suggestions funneling people into the same form of a question.
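The caching point can be made concrete: if search suggestions funnel most people into the same phrasing, the expensive generation only has to happen once per unique question. A minimal sketch of the idea (hypothetical normalization and cache, not Google's actual pipeline):

```python
# Sketch: serve AI answers from a cache keyed by a normalized query,
# so identical questions skip the expensive generation step.
_cache: dict[str, str] = {}

def normalize(query: str) -> str:
    # Collapse case and whitespace so trivially different phrasings collide.
    return " ".join(query.lower().split())

def answer(query: str, generate) -> str:
    key = normalize(query)
    if key not in _cache:
        _cache[key] = generate(query)  # the costly model call happens here
    return _cache[key]

calls = 0
def fake_generate(q):
    global calls
    calls += 1
    return f"answer to: {q}"

answer("How tall is Everest?", fake_generate)
answer("how tall  is everest?", fake_generate)  # cache hit, no new call
print("generation calls:", calls)  # 1
```

Real systems would use smarter semantic matching, but even this trivial version collapses the duplicate queries the comment describes.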

[-] kromem@lemmy.world 16 points 1 year ago

Exactly. The difference between a cached response and a live one, even for non-AI queries, is an order of magnitude.

At this point, a lot of people just care about the 'feel' of anti-AI articles even if the substance is BS though.

And then people just feed whatever gets clicks and shares.

[-] AlecSadler@sh.itjust.works 12 points 1 year ago

I hadn't really heard of the TPU chips until a couple weeks ago when my boss told me about how he uses USB versions for at-home ML processing of his closed network camera feeds. At first I thought he was using NVIDIA GPUs in some sort of desktop unit and just burning energy...but I looked the USB things up and they're wildly efficient and he says they work just fine for his applications. I was impressed.

[-] dan@upvote.au 8 points 1 year ago

The Coral is fantastic for use cases that don't need large models. Object recognition for security cameras (using Blue Iris or Frigate) is a common use case, but you can also do things like object tracking (track where individual objects move in a video), pose estimation, keyphrase detection, sound classification, and more.

It runs Tensorflow Lite, so you can also build your own models.

Pretty good for a $25 device!

[-] darkevilmac@lemmy.zip 8 points 1 year ago

Yeah they're pretty impressive for some at home stuff and they're not even that costly.

[-] dan@upvote.au 6 points 1 year ago* (last edited 1 year ago)

I'm pretty sure Google uses their TPU chips

The Coral ones? They don't have nearly enough RAM to handle LLMs – they only have 8 MB of RAM and only support small Tensorflow Lite models.

Google might have some custom-made non-public chips though - a lot of the big tech companies are working on that.

instead of a regular GPU

I wouldn't call them regular GPUs... AI use cases often use products like the Nvidia H100, which are specifically designed for AI. They don't have any video output ports.
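The RAM gap is easy to quantify. A back-of-envelope estimate of weight storage alone, assuming an illustrative 7B-parameter model in 16-bit precision (numbers chosen for scale, not any specific model):

```python
# Why an 8 MB edge accelerator can't hold an LLM: the weights alone
# are thousands of times larger than the available memory.
params = 7e9          # a "small" 7-billion-parameter model
bytes_per_param = 2   # fp16/bf16 weights

weights_gb = params * bytes_per_param / 1024**3
coral_mb = 8
ratio = weights_gb * 1024 / coral_mb

print(f"7B model weights: ~{weights_gb:.0f} GB")
print(f"Coral RAM: {coral_mb} MB, roughly {ratio:.0f}x too small")
```

And that's before activations, KV cache, or the runtime itself, so the real gap is even wider.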

[-] jj4211@lemmy.world 33 points 1 year ago

The confounding part is that when I do get offered an "AI result", it's basically identical to the excerpt in the top "traditional search" result. It wastes considerably more time and energy just to repeat what the top of the search said anyway. I've never seen the AI overview be more useful than the top snippet.

[-] Facebones@reddthat.com 32 points 1 year ago* (last edited 1 year ago)

It's not even hidden; people just give zero fucks about how their magical rectangle works and get mad if you try to tell them.

[-] blackwateropeth@lemmy.world 31 points 1 year ago

And it’s only 10x more useless :)

[-] PanArab@lemm.ee 28 points 1 year ago

The results used to be better too. AI just produces junk faster.

[-] ArchRecord@lemm.ee 25 points 1 year ago

If only they did what DuckDuckGo did: make it pop up only in very specific circumstances, draw primarily from current summarized Wikipedia information in addition to its existing context, and let the user turn it off completely with a single settings toggle.

I find it useful in DuckDuckGo because it's out of the way, unobtrusive, and only pops up when necessary. I've tried using Google with its search AI enabled, and it was the most unusable search engine I've used in years.

[-] jfx@discuss.tchncs.de 11 points 1 year ago

DDG has also gotten much worse since the introduction of AI features.


What's up with these shit-ass titles? It's not even REMOTELY hidden; it takes two fucking seconds of googling to figure this out.

The entire AI industry was dependent on GPU hardware manufacturers, and Nvidia is STILL back-ordered (to my knowledge).

This is like saying that crypto has a hidden energy cost.

[-] Halcyon@discuss.tchncs.de 11 points 1 year ago* (last edited 1 year ago)

It's hidden in the sense that the normal user does not see the true cost on their energy bill. You perform a search and get the result in milliseconds. That makes it easy to get the false impression that it's just a minor operation. It's not like driving a car, watching the fuel gauge, and seeing the consumption.

Of course one can research how much energy Google consumes and find out the background – IF you're interested. But most people just use tech and do not question or even understand.
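To put rough numbers on how hidden it is: commonly cited ballpark estimates put a traditional search around 0.3 Wh and an LLM-assisted query around 3 Wh, with Google handling on the order of 8.5 billion searches a day. None of these figures are exact; they're for illustration only:

```python
# Back-of-envelope: the per-query cost is invisible, the aggregate is not.
search_wh = 0.3           # rough estimate: traditional search
llm_wh = 3.0              # rough estimate: LLM-assisted query
queries_per_day = 8.5e9   # order-of-magnitude daily search volume

extra_kwh_per_day = (llm_wh - search_wh) * queries_per_day / 1000
print(f"Extra energy if every query used an LLM: ~{extra_kwh_per_day / 1e6:.0f} GWh/day")
```

Tens of gigawatt-hours per day is the kind of number a user watching a millisecond response would never guess at.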

[-] Ibuthyr@discuss.tchncs.de 6 points 1 year ago

For you it might be clear. For the overwhelming majority of people, this is news. People don't know shit about tech. Most would assume the AI thingy does its thing on the local computer.

[-] just_another_person@lemmy.world 18 points 1 year ago* (last edited 1 year ago)

To be fair, it was never "hidden", since all of the top five decided GPUs were the way to monetize this.

Guess who is waiting on the other side of this idiocy with a solution? AMD, with cheap FPGAs that will do all this work at 10x the speed and similar energy savings, at a small fraction of the cost and hassle for cloud providers.

[-] repungnant_canary@lemmy.world 14 points 1 year ago

I'm genuinely curious where their penny-pinching went. All of these tech companies shove ads down our throats and steal our privacy, justifying it by saying they operate at a loss and need to increase income. But suddenly they can afford to spend huge amounts on some shit that won't bring them any more income. How do they justify that?

[-] conciselyverbose@sh.itjust.works 7 points 1 year ago* (last edited 1 year ago)

It's another untapped market they can monopolize. (Or just run at a loss because investors are happy with another imaginary pot of gold at the end of another rainbow.)

[-] afraid_of_zombies@lemmy.world 7 points 1 year ago

This is terrible. Why don't we build nuclear power plants, roll out a carbon tax, and create incentives for companies to make their own energy via renewables?

You know the shit that we should have been doing before I was born.

[-] homesweethomeMrL@lemmy.world 7 points 1 year ago

Wow AI is just so amazing

[-] DarkCloud@lemmy.world 6 points 1 year ago

If these guys gave a shit, they'd focus on light-based (photonic) chips, which are in very early stages but will save a lot of power.

this post was submitted on 06 Jul 2024
1024 points (97.3% liked)

Technology
