
BDSM, LGBTQ+, and sugar dating apps have been found exposing users' private images, with some of them even leaking photos shared in private messages.

[-] balder1991@lemmy.world 21 points 1 week ago

Brace yourselves, because this is only going to get worse with the current “vibe coding” trend.

[-] CheeseToastie@lazysoci.al 4 points 1 week ago
[-] Vendetta9076@sh.itjust.works 16 points 1 week ago

Vibe coding is the current trend of having an LLM build your codebase for you, then shipping it without attempting to understand it.

Most developers use LLMs to some extent to speed up their coding, since tools like Cursor and Claude are really good at removing toil. But vibe coders have the LLM build the entire thing and don't even know how it works.

[-] ElectroVagrant@lemmy.world 16 points 1 week ago

In other words, vibe coders are today's technologically accelerated script kiddie.

That's arguably worse, as the produced scripts may largely work while demanding even less understanding than a script kiddie's cobbling together of code would have.

[-] Vendetta9076@sh.itjust.works 1 points 1 week ago

100% accurate.

[-] CheeseToastie@lazysoci.al 2 points 1 week ago
[-] TeddE@lemmy.world 8 points 1 week ago

Large language models (LLM) are the product of neural networks, a relatively recent innovation in the field of computer intelligence.

Since these systems are surprisingly adept at producing natural-sounding language, and are good at generating answers that sound correct (and sometimes actually are), marketers have seized on this innovation, called it AI (a term with a complicated history), and started slapping it onto every product.

[-] CheeseToastie@lazysoci.al 2 points 1 week ago

Ahhhhhh... that's a really simple explanation thanks

[-] qaz@lemmy.world 2 points 1 week ago

...neural networks, a relatively recent innovation in the field of computer intelligence.

Neural networks have been around for quite some time. Artificial neurons date back to the 1940s, and the statistical curve-fitting ideas underneath them (least squares) go back to around 1795.

Basically, think ChatGPT

[-] qyron@sopuli.xyz 1 points 1 week ago* (last edited 1 week ago)

Large Language Model

To the extent of my understanding, it's a somewhat more sophisticated bot, i.e. an automated response algorithm, trained on a set of data so that it "understands" the mechanics that make that set cohesive to us humans.

With that background, it's supposed to produce similar new outputs when given new raw data to run through the mechanics it acquired during training.

[-] qaz@lemmy.world 1 points 1 week ago* (last edited 1 week ago)

A machine learning model that can generate text.

It works by converting pieces of text to "tokens" which are mapped to numbers in a way that reflects their association with other pieces of text. The model is fed input tokens and predicts tokens based on that, which are then converted to text.
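A toy sketch of that pipeline in Python (the vocabulary here is hypothetical and hard-coded; real tokenizers such as BPE learn these mappings from data, and a real model predicts the next token instead of echoing them back):

```python
# Map pieces of text to integer token ids and back again.
# The vocabulary below is invented purely for illustration.
vocab = {"Hello": 0, ",": 1, " world": 2, "!": 3}
inverse = {i: piece for piece, i in vocab.items()}

def encode(pieces: list[str]) -> list[int]:
    """Turn text pieces into the token ids a model actually consumes."""
    return [vocab[p] for p in pieces]

def decode(ids: list[int]) -> str:
    """Turn predicted token ids back into readable text."""
    return "".join(inverse[i] for i in ids)
```

The model itself only ever sees and emits the integer ids; `encode`/`decode` are the glue between text and numbers.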

What is toil in this context?

[-] spooky2092@lemmy.blahaj.zone 2 points 1 week ago

Boring/repetitive work. For example, I regularly use an AI coding assistant to block out our basic loop templates with variables filled in, or have it quickly finish multiple case statements or assign values to an object with a bunch of properties.

For little things like that, it's great. But once you get past a medium-sized function, it goes off the rails. I've had it make up parameters for stock library functions based on what I asked it for.

[-] msage@programming.dev 0 points 1 week ago

So we are moving away from >1GB node_modules finally? Or is it too soon?

[-] FourWaveforms@lemm.ee 1 points 1 week ago

I love feeding my bloated node_modules

[-] Little8Lost@lemmy.world 0 points 1 week ago

It's going to be 1GB of node_modules handled by garbage AI code.
AI is only good at smaller scripts; it loses context and understanding in larger codebases. Combine that with people who can't program well (not just coding, but debugging etc. as well), also called vibe programmers, and it's going to be a mess.

If a product claims it was vibe-coded: find an alternative!

[-] msage@programming.dev 4 points 1 week ago

I'm losing my will to live lately at an alarming rate.

I used to love IT, way back at the start of 00s.

Soon after the '10s started, I noticed bullshit trends replacing one another... crypto, cloud, SaaS... but now with AI I just feel alienated. Like we're all going to hell, and I hate having a front-row seat.

[-] balder1991@lemmy.world 2 points 1 week ago

At this point, I think it's necessary to maintain a sort of alternate identity online and keep anything private, like photos of yourself and other personal information, offline entirely. Except for government stuff, which requires your real identity.

[-] msage@programming.dev 2 points 1 week ago

I mean, yeah, I self-host everything, but I hate that I have to learn and support the most useless shit ever just to earn a living.

It used to be fun being a dev, now I'm just repeating the same warning phrases about technologies.

[-] MissGutsy@lemmy.blahaj.zone 11 points 1 week ago* (last edited 1 week ago)

Cybernews researchers have found that BDSM People, CHICA, TRANSLOVE, PINK, and BRISH apps had publicly accessible secrets published together with the apps’ code.

All of the affected apps are developed by M.A.D Mobile Apps Developers Limited. Their identical architecture explains why the same type of sensitive data was exposed.

What secrets were leaked?

  • API Key
  • Client ID
  • Google App ID
  • Project ID
  • Reversed Client ID
  • Storage Bucket
  • GAD Application Identifier
  • Database URL

[...] threat actors can easily abuse them to gain access to systems. In this case, the most dangerous of leaked secrets granted access to user photos located in Google Cloud Storage buckets, which had no passwords set up.

In total, nearly 1.5 million user-uploaded images, including profile photos, public posts, profile verification images, photos removed for rule violations, and private photos sent through direct messages, were left publicly accessible to anyone.

So the devs were inexperienced with secure architectures and put a bunch of stuff in the client that should have lived on the server side. That left their API open for anyone to access every picture on their servers. They then built multiple dating apps on this faulty infrastructure by copy-pasting it everywhere.

I hope they are registered in a country with strong data privacy laws, so they have to face the consequences of their mismanagement.

[-] MonkderVierte@lemmy.ml 5 points 1 week ago

Inexperienced? This is not-giving-a-fuck level.

No, it's lack of experience. When I was a junior dev, I had a hard enough time understanding how things worked, much less understanding how they could be compromised by an attacker.

Junior devs need senior devs to learn that kind of stuff.

[-] PumaStoleMyBluff@lemmy.world 2 points 1 week ago

It does help if services that generate or store secrets and keys display a large warning that they must be kept secret, every time they're viewed, no matter the viewer's experience level. But yeah, understanding why and how isn't something that should be assumed for new devs.
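In the same spirit, a toy sketch of the check a pre-commit secret scanner performs (the two patterns are illustrative only; real tools like gitleaks or trufflehog ship hundreds of credential formats):

```python
import re

# Illustrative patterns; real scanners cover far more credential shapes.
PATTERNS = {
    "google_api_key": re.compile(r"AIza[0-9A-Za-z_\-]{35}"),
    "generic_secret": re.compile(
        r"(?i)\b(api[_-]?key|secret)\b\s*[:=]\s*['\"][^'\"]{16,}['\"]"
    ),
}

def scan(text: str) -> list[str]:
    """Return the names of all patterns that match somewhere in `text`."""
    return [name for name, pat in PATTERNS.items() if pat.search(text)]
```

Running something like this over a repo before every push would have flagged exactly the kind of published secrets described in the article.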

[-] azalty@jlai.lu 4 points 1 week ago

The illusion of choice

A lot of "normal" dating apps are also owned by the same companies

[-] taiyang@lemmy.world 3 points 1 week ago

I've met the type who run businesses like that, and they likely do deserve punishment for it. My own experience involved someone running gray-legality betting apps; the owner was a cheapskate who got unpaid interns and outsourced Filipino workers to build his app. The guy didn't even pay them sometimes.

Granted, you could also hire inexperienced people if you're a good person with no financial investor, but that I've mostly seen with education apps and other low profit endeavors. Sex stuff definitely is someone trying to score cash.

[-] Flax_vert@feddit.uk 0 points 1 week ago

Do you reckon this app could have been vibe-coded or a product of AI, or of massive AI use in development? I'd have known not to do this as a teenager first tinkering with making apps, never mind as an actual business.

[-] taladar@sh.itjust.works 2 points 1 week ago

I know for a fact that a lot of applications made these mistakes before AI was around, so while AI is a possibility, it's absolutely not necessary to explain this.

[-] yoshman@lemmy.world 1 points 1 week ago

I had a test engineer demand an admin password be admin/admin in production. I said absolutely not and had one of my team members change it to a 64-character password generated in a password manager. Dumbass immediately logs in and changes it to admin again. We found out when part of the pipeline broke.

So, we generated another new one, and he immediately changed it back to admin again. We were waiting for it the second time and immediately called him out on the next stand-up. He said he needs it to be admin so he doesn't have to change his scripts. picard_facepalm.jpg

[-] Serinus@lemmy.world 2 points 1 week ago

How is he not fired? Incompetence and ignorance are one thing, but when you combine them with what's effectively insubordination... well, you'd better be right. And he is not.

[-] yoshman@lemmy.world 1 points 1 week ago

He was a subcontractor, so technically, he's not our employee.

I bubbled it up the chain on our side, and it hasn't happened since.

[-] PumaStoleMyBluff@lemmy.world 10 points 1 week ago

Anyone who uses Grindr, please be aware that any photos you send are cached and stored unencrypted in plain old folders on the receiver's phone, regardless of whether they were expiring or in an album that you later revoked. It's nearly trivial to grab any photo someone sends you, with no watermark or screenshot notification.

[-] CheeseToastie@lazysoci.al 7 points 1 week ago

This is devastating. The LGBT community are often hiding their true selves because of family, colleagues, culture etc. People will be destroyed.

[-] azalty@jlai.lu 3 points 1 week ago* (last edited 1 week ago)

Use Signal or SimpleX for more private stuff like this 👀

[-] CluckN@lemmy.world 3 points 1 week ago

I wonder how many conservative politicians they’ll find.

[-] thatradomguy@lemmy.world -3 points 1 week ago

Just don't send nudes... why do people think other people won't figure out how to screenshot or just keep photos forever? Even if you trust the person, they could get hacked... the Have I Been Pwned guy got pwned, for Jehovah's sake. Just stop sending that ~~shit~~.

this post was submitted on 30 Mar 2025
59 points (100.0% liked)

Technology
