
Okay, well, if everyone had access to an AGI, anyone could design and distribute a pathogen that could wipe out a significant portion of the population. Then again, you'd have the collective force of everyone else's AI countering that plot.

I think that putting that kind of power into the hands of everyone shouldn't be done lightly.

[-] Hanabie@sh.itjust.works 7 points 1 year ago

There are papers online on how to design viruses. Now to get funding for a lab and staff, because this is nothing like Breaking Bad.

[-] Rayspekt@kbin.social 6 points 1 year ago

You still can't manufacture it. Your comparison with nukes is actually a good example: the basic knowledge of how a nuke works is out there, yet most people would struggle to refine weapons-grade plutonium.

Knowledge is only one part of doing something.

[-] lily33@lemm.ee 5 points 1 year ago

I would say the risk of having AI be limited to the ruling elite is worse, though - because there wouldn't be everyone else's AI to counter them.

And if AI is limited to a few, those few WILL become the new ruling elite.

[-] Touching_Grass@lemmy.world 6 points 1 year ago

And people would be less likely to identify what AI can and can't do if we convince ourselves to limit our access to it.

[-] subignition@kbin.social 1 points 1 year ago

People are already incompetent enough at this when there's a disclaimer right in front of their faces warning about GPT.

We're seeing responses even in this thread conflating AGI with LLMs. People at large are too fucking stupid to be trusted with this kind of thing.

[-] Touching_Grass@lemmy.world 3 points 1 year ago

Are we back to freaking out about The Anarchist Cookbook?

[-] Honytawk@lemmy.zip 3 points 1 year ago

Since when does AI translate to being able to create bacteria and stuff?

If having the information on how to do so was enough to create pathogens, we should already have been wiped out because of books and libraries.

[-] photonic_sorcerer@lemmy.dbzer0.com 0 points 1 year ago* (last edited 1 year ago)

You can't type "How do I make a pathogen to wipe out a city" into a book. A sufficiently advanced and aligned AI will, however, answer that question with a detailed list of production steps, resource requirements and timeline.

[-] Touching_Grass@lemmy.world 5 points 1 year ago

Oog, what if by making this fire, it burns down the forest?

[-] LibertyLizard@slrpnk.net 2 points 1 year ago

Well, that did happen, to be fair.

[-] Kichae@kbin.social 4 points 1 year ago

Right. So, the actual danger here is... Search engines?

[-] snooggums@kbin.social 0 points 1 year ago

Have you heard about this thing called the internet?

[-] HubertManne@kbin.social 0 points 1 year ago

This requires special materials like enzymes and such, and it would be much easier to restrict access to those. Now, true, this godlike AI could go back and show you how to make all the base stuff, but you'd need equipment for that, like centrifuges, and you'd need special media. It's like the AI telling you how to make a nuke, really. Yeah, it could start you off with Bronze Age metalsmithing and you could work your way up to the modern materials you'd need, but realistically you won't be able to do it (assuming, again, you restrict certain materials).

[-] serratur@lemmy.wtf 2 points 1 year ago

You're just gonna print the pathogens with the pathogen printer? You understand that getting the information doesn't mean you're able to produce it.

[-] Touching_Grass@lemmy.world 4 points 1 year ago

I need an article on how a 3D printer can be used to print an underground chemistry lab to produce these weapons-grade pathogens.

That's the thing though: a sufficiently advanced intelligence will know how. You don't have to.

[-] testfactor@lemmy.world 2 points 1 year ago

I know how to build a barn. Doesn't mean I can do it by myself with no tools or materials.

Turns out that building and operating a lab that can churn out bespoke pathogens is actually even more difficult and expensive than that.

[-] bioemerl@kbin.social 1 points 1 year ago

Your brain is an (NA)GI

[-] Kichae@kbin.social 0 points 1 year ago

Let's assume your hypothetical here isn't bonkers: how, exactly, do you propose limiting people's access to linear algebra?

this post was submitted on 21 Sep 2023
214 points (92.5% liked)

Technology