488
submitted 2 months ago by cm0002@lemmy.world to c/technology@lemmy.world
[-] aramis87@fedia.io 33 points 2 months ago

The biggest problem with AI is that they're illegally harvesting everything they can possibly get their hands on to feed it, they're forcing it into places where people have explicitly said they don't want it, and they're sucking up massive amounts of energy and water to create it, undoing everyone else's progress in reducing energy use and raising prices for everyone else at the same time.

Oh, and it also hallucinates.

[-] wewbull@feddit.uk 9 points 2 months ago

Oh, and it also hallucinates.

Oh, and people believe the hallucinations.

[-] pennomi@lemmy.world 7 points 2 months ago

Eh I’m fine with the illegal harvesting of data. It forces the courts to revisit the question of what copyright really is and hopefully erodes the stranglehold that copyright has on modern society.

Let the companies fight each other over whether it’s okay to pirate every video on YouTube. I’m waiting.

[-] catloaf@lemm.ee 29 points 2 months ago

So far, the result seems to be "it's okay when they do it"

[-] Electricblush@lemmy.world 10 points 2 months ago* (last edited 2 months ago)

I would agree with you if the same companies challenging copyright (which protects the intellectual and creative work of "normies") were not also aggressively wielding copyright against the very people they are stealing from.

With the amount of corporate power tightly integrated with governmental bodies in the US (and now with DOGE dismantling oversight), I fear that whatever comes out of this is that humans own nothing and corporations own everything. The death of free, independent thought and creativity.

Everything you do, say and create is instantly marketable and sellable by the major corporations, and you get nothing in return.

The world needs something a lot more drastic than copyright reform at this point.

[-] naught@sh.itjust.works 4 points 2 months ago

AI scrapers illegally harvesting data are destroying smaller and open source projects. Copyright law is not the only victim

https://thelibre.news/foss-infrastructure-is-under-attack-by-ai-companies/

[-] index@sh.itjust.works 4 points 2 months ago

We spend energy on the most useless shit, so why are people suddenly using it as an argument against AI? Have you ever seen someone complaining about Pixar wasting energy rendering their movies? Or 3D studios rendering TV ads?

[-] Sturgist@lemmy.ca 4 points 2 months ago

Oh, and it also hallucinates.

This is arguably a feature depending on how you use it. I'm absolutely not an AI acolyte. It's highly problematic in every step. Resource usage. Training using illegally obtained information. This wouldn't necessarily be an issue if people who aren't tech broligarchs weren't routinely getting their lives destroyed for this, and if the people creating the material being used for training also weren't being fucked....just capitalism things I guess. Attempts by capitalists to cut workers out of the cost/profit equation.

If you're using AI to make music, images or video... you're depending on those hallucinations.
I run a Stable Diffusion model on my laptop. It's kinda neat. I don't make things for a profit, and now that I've played with it a bit I'll likely delete it soon. I think there's room for people to locally host their own models, preferably trained with legally acquired data, to be used as a tool to assist with the creative process. The current monetisation model for AI is fuckin criminal....

[-] Aceticon@lemmy.dbzer0.com 3 points 2 months ago* (last edited 2 months ago)

It varies massively depending on the ML.

For example, things like voice generation or object recognition can absolutely be done with entirely legit training datasets - literally pay a bunch of people to read some texts and you can train a voice generation engine with it, and the work in object recognition is mainly tagging what's in the images, on top of a ton of easily made images of things - a researcher can literally go around taking photos to make their dataset.

Image generation, on the other hand, not so much - you can only go so far with plain photos a researcher can go around and take on the street, and those models tend to rely heavily on the artistic work of people who never authorized the use of their work to train them. And LLMs clearly cannot be done without scraping billions of pieces of actual work from billions of people.

Of course, what we tend to talk about here when we say "AI" is LLMs, which are IMHO the worst of the bunch.

[-] kibiz0r@midwest.social 2 points 2 months ago

Well, the harvesting isn’t illegal (yet), and I think it probably shouldn’t be.

It’s scraping, and it’s hard to make that part illegal without collateral damage.

But that doesn’t mean we should do nothing about these AI fuckers.

In the words of Cory Doctorow:

Web-scraping is good, actually.

Scraping against the wishes of the scraped is good, actually.

Scraping when the scrapee suffers as a result of your scraping is good, actually.

Scraping to train machine-learning models is good, actually.

Scraping to violate the public’s privacy is bad, actually.

Scraping to alienate creative workers’ labor is bad, actually.

We absolutely can have the benefits of scraping without letting AI companies destroy our jobs and our privacy. We just have to stop letting them define the debate.

[-] Sl00k@programming.dev 2 points 2 months ago

I see the claim that "AI is using up massive amounts of water" proclaimed everywhere lately, but I do not understand it. Do you have a source?

My understanding is this probably stems from people misunderstanding data center cooling systems. Most of these systems are closed loop so everything will be reused. It makes no sense to "burn off" water for cooling.

[-] lime@feddit.nu 9 points 2 months ago* (last edited 2 months ago)

data centers are mainly air-cooled, and two innovations contribute to the water waste.

the first one was "free cooling", where instead of using a heat exchanger loop you just blow (filtered) outside air directly over the servers and out again, meaning you don't have to "get rid" of waste heat, you just blow it right out.

the second one was increasing the moisture content of the air on the way in with what is basically giant carburettors in the air stream. the wetter the air, the more heat it can take from the servers.

so basically we now have data centers designed like cloud machines.

Edit: Also, apparently the water they use becomes contaminated and they use mainly potable water. here's a paper on it
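The scale of evaporative cooling is easy to sanity-check with a back-of-the-envelope calculation: every litre of water evaporated carries away its latent heat. The facility size below is a hypothetical assumption for illustration, not a figure from the paper:

```python
# Back-of-the-envelope: water evaporated by a data center that rejects
# all of its heat evaporatively ("swamp cooling" over the intake air).
# Assumptions (illustrative only):
LATENT_HEAT_J_PER_KG = 2.4e6   # latent heat of vaporization of water near 25 C
FACILITY_POWER_W = 100e6       # hypothetical 100 MW facility

kg_per_second = FACILITY_POWER_W / LATENT_HEAT_J_PER_KG
litres_per_day = kg_per_second * 86_400  # 1 kg of water is ~1 litre

print(f"{kg_per_second:.1f} kg/s evaporated")
print(f"{litres_per_day / 1e6:.1f} million litres/day")
```

Real facilities reject only part of their heat this way, but the point stands: evaporated water leaves the site as vapour, so "closed loop" reasoning doesn't apply to it.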

[-] Aceticon@lemmy.dbzer0.com 2 points 2 months ago* (last edited 2 months ago)

Also, the energy for those datacenters has to come from somewhere, and non-renewable options (gas, oil, nuclear) also use a lot of water, both as part of the generation process itself (they all rely on burning or consuming fuel to make the steam that drives the turbines which generate the electricity) and for cooling.

[-] lime@feddit.nu 3 points 2 months ago

steam that runs turbines tends to be recirculated. that's already in the paper.

[-] futatorius@lemm.ee 17 points 2 months ago

Two intrinsic problems with the current implementations of AI are that they are insanely resource-intensive and require huge training sets. Neither of those is directly a problem of ownership or control, though both favor larger players with more money.

[-] finitebanjo@lemmy.world 10 points 2 months ago* (last edited 2 months ago)

And a third intrinsic problem is that the current models, even with unlimited training data, have been shown to never approach human language capability - per papers by OpenAI in 2020 and DeepMind in 2022, and also a Stanford paper which proposes that AI has no emergent behavior, only convergent behavior.

So yeah. Lots of problems.

[-] AbsoluteChicagoDog@lemm.ee 16 points 2 months ago

Same as always. There is no technology capitalism can't corrupt

[-] max_dryzen@mander.xyz 11 points 2 months ago

The government likes concentrated ownership because then it has only a few phonecalls to make if it wants its bidding done (be it censorship, manipulation, partisan political chicanery, etc)

[-] futatorius@lemm.ee 2 points 2 months ago

And it's easier to manage and track a dozen bribe checks rather than several thousand.

[-] umbraroze@lemmy.world 11 points 2 months ago

The AI business is owned by a tiny group of technobros who have no concern for what they have to do to get the results they want ("fuck the copyright, and especially fuck the natural resources"), who want to be personally seen as the saviours of humanity (despite not being the ones who invented and implemented the actual tech), and who, like all big wig biz boys, want all the money.

I don't have problems with AI tech in principle, but I hate the current business direction and what the AI business encourages people to do and use the tech for.

[-] captain_aggravated@sh.itjust.works 11 points 2 months ago

For some reason the megacorps have got LLMs on the brain, and they're the worst "AI" I've seen. There are other types of AI that are actually impressive, but the "writes a thing that looks like it might be the answer" machine is way less useful than they think it is.

[-] ameancow@lemmy.world 3 points 2 months ago* (last edited 2 months ago)

Most LLMs for chat, pictures and clips are magical and amazing - for about 4 to 8 hours of fiddling. Then they lose all entertainment value.

As for practical use, the things can't do math, so they're useless at work. I write better emails on my own, so I can't imagine being so lazy and socially inept that I need help writing an email asking for tech support or outlining an audit report. Sometimes the web summaries save me from clicking a result, but I usually click anyway because the things are so prone to very convincing hallucinations. So yeah, utterly useless in their current state.

I usually get some angsty reply when I say this from some techbro-AI-cultist-singularity-head who starts whinging about how it's reshaped their entire life, but in some deep niche way that is completely irrelevant to the average working adult.

I have also talked to way too many delusional maniacs who are literally planning for the day an Artificial Super Intelligence is created and the whole world becomes like Star Trek and they personally will become wealthy and have all their needs met. They think this is going to happen within the next 5 years.

[-] frezik@midwest.social 5 points 2 months ago

The delusional maniacs are going to be surprised when they ask the Super AI "how do we solve global warming?" and the answer is "build lots of solar, wind, and storage, and change infrastructure in cities to support walking, biking, and public transportation".

[-] kibiz0r@midwest.social 11 points 2 months ago

Idk if it’s the biggest problem, but it’s probably top three.

Other problems could include:

  • Power usage
  • Adding noise to our communication channels
  • AGI fears if you buy that (I don’t personally)
[-] pennomi@lemmy.world 5 points 2 months ago

Dead Internet theory has never been a bigger threat. I believe that’s the number one danger - endless quantities of advertising and spam shoved down our throats from every possible direction.

[-] Guns0rWeD13@lemmy.world 8 points 2 months ago

brian eno is cooler than most of you can ever hope to be.

[-] RememberTheApollo_@lemmy.world 7 points 2 months ago

And those people want to use AI to extract money and to lay off people in order to make more money.

That’s “guns don’t kill people” logic.

Yeah, the AI absolutely is a problem - for those reasons, along with it being wrong a lot of the time and the ridiculous energy consumption.

[-] magic_smoke@lemmy.blahaj.zone 12 points 2 months ago

The real issues are capitalism and the lack of green energy.

If the arts were well funded, if people were given healthcare and UBI, if we had, at the very least, switched to nuclear like we should've decades ago, we wouldn't be here.

The issue isn't a piece of software.

[-] Grimy@lemmy.world 4 points 2 months ago

AI has a vibrant open source scene and is definitely not owned by a few people.

A lot of the data to train it is only owned by a few people though. It is record companies and publishing houses winning their lawsuits that will lead to dystopia. It's a shame to see so many actually cheering them on.

[-] cyd@lemmy.world 3 points 2 months ago

So long as there are big players releasing open weights models, which is true for the foreseeable future, I don't think this is a big problem. Once those weights are released, they're free forever, and anyone can fine-tune based on them, or use them to bootstrap new models by distillation or synthetic RL data generation.

[-] finitebanjo@lemmy.world 3 points 2 months ago

I don't really agree that this is the biggest issue, for me the biggest issue is power consumption.

[-] CitricBase@lemmy.world 5 points 2 months ago

That is a big issue, but excessive power consumption isn't intrinsic to AI. You can run a reasonably good AI on your home computer.

The AI companies don't seem concerned about the diminishing returns, though, and will happily spend 1000% more power to gain that last 10% better intelligence. In a competitive market, why wouldn't they, when power is so cheap?
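That trade-off can be sketched with a toy power-law scaling model, where loss falls as a small negative power of compute. The exponent below is illustrative, roughly in the range reported for LLM compute scaling, not a measured value:

```python
# Toy diminishing-returns model: loss(C) = k * C ** (-alpha).
# alpha is an illustrative assumption, not a measured constant.
alpha = 0.05

# Compute multiplier needed to cut loss by 10% (loss ratio 0.9):
# (C2 / C1) ** (-alpha) = 0.9  =>  C2 / C1 = 0.9 ** (-1 / alpha)
compute_multiplier = 0.9 ** (-1 / alpha)
print(f"~{compute_multiplier:.0f}x more compute for a 10% lower loss")
```

Under this assumption, each fixed fractional improvement costs a multiplicative increase in compute (and hence power), which is exactly the "1000% more power for the last 10%" dynamic.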

[-] KingThrillgore@lemmy.ml 3 points 2 months ago

He's not wrong.

[-] MyOpinion@lemm.ee 3 points 2 months ago

The problem with AI is that it pirates everyone's work, repackages it as its own, and enriches people who did not create the copyrighted work.

[-] lobut@lemmy.ca 6 points 2 months ago

I mean, it's our work; the results should belong to the people.

[-] piecat@lemmy.world 5 points 2 months ago

This is where "universal basic income" comes into play

[-] Aceticon@lemmy.dbzer0.com 3 points 2 months ago* (last edited 2 months ago)

More broadly, I would expect UBI to trigger a golden age of invention and artistic creation, because a lot of people would love to spend their time just creating new stuff without the need to monetise it but can't under the current system. And even if a lot of that would be shit or crazily niche, the more people doing it and the freer they are to do it, the more really special and amazing stuff will be created.

[-] Blackmist@feddit.uk 2 points 2 months ago

Unfortunately one will not lead to the other.

It will lead to the plot of Elysium.

[-] DarkCloud@lemmy.world 3 points 2 months ago* (last edited 2 months ago)

Like Sam Altman who invests in Prospera, a private "Start-up City" in Honduras where the board of directors pick and choose which laws apply to them!

The switch to Techno-Feudalism is progressing far too much for my liking.

this post was submitted on 23 Mar 2025
488 points (98.0% liked)
