this post was submitted on 27 Jul 2025
44 points (100.0% liked)
The difference is that Google scans your private correspondence and can report you to authorities for any reason, legit or not.
That's a fair argument, although I personally wouldn't put too much emphasis on "can report you to authorities for any reason." That's true of any third party; your local mini-mart can report you to the authorities for any reason, legit or not.
I am referring more to the Lumo LLM initiative. It's a standard LLM pitch with some privacy copy added on.
While I haven't tried Lumo, I do have experience with smaller cloud LLMs (e.g. Mistral, trying not to use American services), and they tend to be subpar for my work use cases.
I don't see how Lumo will compete with ChatGPT or Gemini (I haven't tried Grok, for obvious reasons).
They literally sent police after some poor dude based on his correspondence with a doctor.
https://www.eff.org/deeplinks/2022/08/googles-scans-private-photos-led-false-accusations-child-abuse
Google does not have the authority to "send the police." They reported content that looked like CSAM, and the police did what police do and assumed the guy was a criminal.
The problem is not that they reported it, the problem is that they had it in the first place.