Bloody hell, it's amazing how desperate this marketing attempt is. I've got an AI that'll blow your faces off with its output, but I can't show you because, well, your face will resemble Gus Fring after several weeks in an acid bath. I can show you if you pay me a Huffmanian sum for API access, though, but only if you sign an NDA and promise not to say mean things about us.
I can't believe convincing everyone your new tech may destroy civilization is how you build hype nowadays.
Well, I guess it's a good way to attract sociopathic investors.
Translation: Zuckerbot wants to generate some noise, and there's nothing better for that than AI doom and gloom.
Or have it regulated so much that free alternatives cannot exist.
Zuck invested billions in the wrong tech tree and he's desperate to stay relevant.
Ah, but allow private access for wannabe authoritarians who will use it to create fake ad campaigns. No public access = no scrutiny from researchers or watchdogs.
@sub_ @realcaseyrollins the tech out right now is pretty insane.
Luckily the memes have been pretty good
https://www.youtube.com/watch?v=59vYIzlxmF4
One thing I am hoping for with "AI" tech is better language-teaching software. It would be crazy to have an AI teacher correct you mid-conversation, or adapt to whatever you are struggling with.
Not really AI, but if you want to learn lojban you can get most of the way there: the language has a formal grammar, so it's possible to write bog-standard software that doesn't care when you say "You must have patience, my young Padawan" instead of "Patience you must have, my young Padawan". It will tell you which one it expected, but it won't beat you over the head with it.
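Not actual lojban tooling, just a toy Python sketch of the idea (the word list and `check()` function are invented for illustration): because the grammar is formal, a checker can accept any valid ordering and merely mention which one it expected.

```python
# Toy illustration only: a checker that tolerates word-order variants
# and gently reports the order it expected, instead of rejecting input.
# CANONICAL and check() are made-up names, not real lojban software.

CANONICAL = ["patience", "you", "must", "have"]  # the expected ("Yoda") order

def check(sentence: str) -> str:
    words = sentence.lower().rstrip(".!?").split()
    if sorted(words) != sorted(CANONICAL):
        return "Couldn't parse that."
    if words == CANONICAL:
        return "Parsed; that's the order I expected."
    return f"Parsed fine. For what it's worth, I expected: {' '.join(CANONICAL)}"

print(check("Patience you must have"))  # matches the expected order
print(check("You must have patience"))  # accepted, with a gentle note
```

A real version would run the input through the formal grammar's parser rather than comparing word lists, but the feedback style would be the same: accept, note the expected form, move on.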
That's a really cool idea!
“Our technology is soooo jaw-droppingly powerful, we must warn the public!” It just seems a little self-serving. ‘Critihype’ (Motherboard?) is a term I heard recently that sums this up nicely.
Translation... "This 'AI' is hot garbage that would make us look bad if we released it in its current state, but we want to hang out with the cool AI kids... So instead, we'll say that it's SO advanced and so good that it's DANGEROUS for the public. That way we look cool, ethical, and mysterious and definitely NOT that we blew our wad on the metaverse last year and are reaching for anything to climb out of the hole we dug..."
You can be unethical and still be legal; that's the way I live my life.
— Mark Zuckerberg
Mark won't hesitate to publish an unethical/dangerous model to the world, he'll actually thrive on it.
So sad that, because of a number of bad people with bad intentions, such good pieces of tech are never given to people in their entirety.
If Meta ever releases this, I'm afraid it might just be a more "broken-down" version of a local Google Translate voice, unfortunately...
I doubt it's as good as they say. This is probably just hype to make their model seem really good in the news.
If they actually had it, I'm pretty sure they would release a demonstration video at least - that doesn't require sharing the data or the code, so I can't imagine why they wouldn't.
Buuuut nah. It's just marketing.
You can already make deepfake videos from a single target face image, and now you can make AI generated voice from two seconds of sample? The future is going to be interesting to say the least.
Hm, if it gives Zuck the creeps, it must be good.
It’s only a matter of time. If they don’t release it, someone else will release something comparable.