
It’s pretty terrifying when you think about the possibilities for deception, and about how disposable content is going to become. We are going to generate content at a volume orders of magnitude larger than our current, already excessive volume, and finding the stuff that has real meaning and a real message is going to be even harder.

Also, artists whose work and styles fed this will be put out of business without ever being paid for the work that was used to train these models. 🫤

[-] lolcatnip@reddthat.com 32 points 8 months ago

I dream of a world where nobody has a job they have to do for money.

load more comments (24 replies)
[-] wrekone@lemmyf.uk 22 points 8 months ago

When I was a kid, I had seen, or at least heard of, nearly every TV show from my parents' generation, going back probably 40 years. Like, I've probably seen every Looney Tunes, every episode of M.A.S.H., and most episodes of The Munsters, because some days there wasn't anything else to watch. My kids look at me like I'm crazy if I haven't heard of the latest flash-in-the-pan influencer, but if I bring up a 10-year-old movie or TV show, they have no idea what I'm talking about.

[-] evranch@lemmy.ca 5 points 8 months ago

I miss the shared culture that broadcast TV and radio gave us. Is the selection today better, with more and higher-quality content? Definitely.

But all of us Millennials can quote The Simpsons at each other all day even if we've never met. South Park, Futurama, King of the Hill, James Bond and other corny action movies. We all saw them so many times, because that's what was on.

That shared culture is worth more than the content actually being good, IMO. Half the time now someone will ask if you've seen a show and you haven't ever heard of it.

[-] aniki@lemm.ee 17 points 8 months ago

You raise a crazy good point - the amount of data YouTube generates is staggering, and that's with a high barrier to entry. If Sora lets anyone just cut shit and upload it, we're going to outpace the rate at which empty storage hardware is manufactured.

[-] devfuuu@lemmy.world 8 points 8 months ago

And we will be stuck in a loop where art and culture become an ouroboros feeding on itself, with no new styles or genuinely new art being fed in, because artists who were never recognized or paid won't want to give more content to the machine. The dark ages are upon us and we are all singing their praises.

[-] AbouBenAdhem@lemmy.world 3 points 8 months ago* (last edited 8 months ago)

We are going to generate content at a volume orders of magnitude larger than our current, already excessive volume, and finding the stuff that has real meaning and a real message is going to be even harder.

It could go both ways: similar software could “compress” video (especially AI-generated video) into text prompts that could then re-create it without needing to store it. (Currently, of course, the processing cost would be higher than the storage cost for the raw video—but the scenario in which we’re cranking out excessive amounts of AI-generated content implies that the high processing costs have been eliminated.) That would also have the side effect of making it easier to find and organize videos based on their “meaning”.

load more comments (1 replies)
load more comments (26 replies)
[-] Varyk@sh.itjust.works 66 points 8 months ago

Okay, so who has the updated Will Smith spaghetti video?

[-] KingThrillgore@lemmy.ml 46 points 8 months ago

I hate this.

[-] danielfgom@lemmy.world 40 points 8 months ago

Instead of using robots to replace menial jobs and help humans who have physical labour jobs, they've invented a tool that will get rid of all white-collar jobs, forcing us all into manual, low-paid labour jobs.

Taxes will fall off a cliff and life will get really bad because the state won't have money to maintain the country. Companies making AI content won't be able to sell it because no one has money to buy it. In general, all product sales will fall off a cliff, except for food, and many companies will close, resulting in mass unemployment and eventually the collapse of society...

Great job, morons!

[-] realharo@lemm.ee 12 points 8 months ago* (last edited 8 months ago)

If AI gets really good, manual labor automation won't be far behind, as the AI itself will be applied to robotics and AI research.

The only thing of value left will be natural resources.

[-] danielfgom@lemmy.world 7 points 8 months ago

Sounds like good motivation for the machines to kill us off and keep the resources for themselves.

[-] TwilightVulpine@lemmy.world 12 points 8 months ago

More like, a motivation for the wealthy who control the machines to kill us off.

AI sentience is still science fiction but AI-powered corporate exploitation is very real, right now.

[-] realharo@lemm.ee 5 points 8 months ago* (last edited 8 months ago)

That's assuming they have that goal. The goal of survival and reproduction exists because of natural selection (those that don't have that goal simply don't make it into the next generation, when competing against those that do).

But that doesn't necessarily apply to AI systems, at least not while humans have a say in which systems survive and get developed further, and which ones get scrapped. When humans control the resources, the best way to get a sizable allocation of them is to be useful to humans (or at least to make them believe you are).

[-] butterflyattack@lemmy.world 4 points 8 months ago

Happily my job is so shit and poorly paid that I don't anticipate it ever being worth automating. Sometimes humans are just cheaper.

[-] gapbetweenus@feddit.de 12 points 8 months ago

forcing us all into manual, low-paid labour jobs.

Maybe we should have shown some solidarity with people in those jobs and fought for them to get paid better?

[-] willington@lemmy.dbzer0.com 9 points 8 months ago

There's always money/wealth in the economy. If the workers don't have it, someone else does. Find where the money is, and tax it. Then redistribute.

It's not a hard concept. It's a question of the political will. We know what to do, but will we do it?

We already do know where the wealth is and we aren’t taxing it. I think we know the answer to that question. Systems are only still functioning because there’s a dribble of tax revenue that still comes in. But we are already seeing schools lose funding and roads crumble as tax revenue hasn’t grown as fast as costs or populations. I don’t think it’s going to get better, because you have to be rich or have rich allies to get elected, so I don’t know how we could create different tax laws.

load more comments (1 replies)
[-] GiddyGap@lemm.ee 7 points 8 months ago

Came here to doom-scroll. I was not disappointed.

[-] Toneswirly@lemmy.world 6 points 8 months ago

Don't worry bro, they're gonna replace the low-paid jobs too.

[-] Potatos_are_not_friends@lemmy.world 25 points 8 months ago

Can we get a TLDR? Can't watch the video.

[-] abhibeckert@lemmy.world 24 points 8 months ago* (last edited 8 months ago)

TLDR: a year ago AI video was garbage. Today it's almost as good as a video that would cost a few hundred thousand dollars for a human production team to make (according to someone whose professional work is creating those videos).

It's not quite there - hands glitch out occasionally, and sometimes the animation doesn't quite line up right (e.g. walking might skip a step) - but it's 99% there, and the improvements over the last 12 months are astounding. That last 1% surely won't take long to close.

There was a landscape shot, like drone or helicopter footage, that looked absolutely real.

Note this is not publicly available yet - OpenAI said they are still working on safety features to reduce the risk of it being used to create content that they want no part in.

[-] squirrel@discuss.tchncs.de 23 points 8 months ago

I've asked Gemini for a summary and it's pretty spot on:

This video is about AI generated videos and how they have become very realistic.

The speaker, Marques Brownlee, discusses a new AI model called Sora that can generate videos from text input. He shows examples of videos generated by Sora, including one of a woman walking down a Tokyo street, a car driving up a mountain road, and a litter of puppies playing in the snow. He points out that these videos are still not perfect, but they are much better than what was possible just a year ago.

He discusses the implications of this technology, both good and bad. On the one hand, it could be used to create fake videos that could be used to deceive people. On the other hand, it could be used to create stock footage that is more affordable and accessible than ever before. Brownlee concludes by saying that this technology is still in its early stages, but it has the potential to change the world in many ways.

[-] demonsword@lemmy.world 7 points 8 months ago

I’ve asked Gemini for a summary

Man, you posted the video and couldn't even summarize it yourself? Talk about laziness, huh.

[-] CaffeinatedMoth@lemm.ee 19 points 8 months ago

Let's see. Spend several minutes composing a few paragraphs, followed by revising to fix errors in composition, spelling, or grammar... or simply spend a few seconds with AI. Work smarter, not harder.

load more comments (2 replies)
load more comments (1 replies)

That's amazing. I didn't know AI could do that. Going to start doing that from now on!

[-] aniki@lemm.ee 15 points 8 months ago

My jaw is on the floor. It makes typing very difficult.

[-] Reverendender@sh.itjust.works 11 points 8 months ago

When do we get to use this? I don't know what a "Red Team Member" is, but I pay a monthly membership.

[-] kinsnik@lemmy.world 33 points 8 months ago* (last edited 8 months ago)

Red Team is a hacking term that refers to people who try to sabotage or misuse a system to create harmful content, as a way to test it and discover problems before it becomes available to external users.

https://en.wikipedia.org/wiki/Red_team

[-] PlexSheep@feddit.de 8 points 8 months ago

That's inaccurate. Red Team is the guys who test your security from an attacker's viewpoint. Red Teams are often contractors hired by companies. The companies are the ones paying to be "hacked", so they can fix whatever gaping security holes the red team finds.

At least, that's usually the definition. If just talking about AI stuff, I'd call those people testers.

load more comments (2 replies)
[-] werefreeatlast@lemmy.world 10 points 8 months ago

Now I can be in The Simpsons! Everyone on my front-yard security camera can be in The Simpsons 😀!

[-] KairuByte@lemmy.dbzer0.com 4 points 8 months ago* (last edited 8 months ago)

“Sir, I understand you're trying to be helpful, but I assure you the background characters from The Simpsons did not rob you.”

load more comments