
Google’s AI model will potentially listen in on all your phone calls — or at least ones it suspects are coming from a fraudster.

To protect the user’s privacy, the company says Gemini Nano operates locally, without connecting to the internet. “This protection all happens on-device, so your conversation stays private to you. We’ll share more about this opt-in feature later this year,” the company says.

“This is incredibly dangerous,” says Meredith Whittaker, president of the foundation behind the end-to-end encrypted messaging app Signal.

Whittaker, a former Google employee, argues that the entire premise of the anti-scam call feature poses a potential threat. That’s because Google could program the same technology to scan for other keywords, like requests for access to abortion services.

“It lays the path for centralized, device-level client-side scanning,” she said in a post on Twitter/X. “From detecting 'scams' it's a short step to ‘detecting patterns commonly associated w/ seeking reproductive care’ or ‘commonly associated w/ providing LGBTQ resources' or ‘commonly associated with tech worker whistleblowing.’”

[-] dukethorion@lemmy.world 51 points 6 months ago

"...locally on device without connecting to the internet"

How would it then report such behavior to Google, without internet?

If it notifies the end user, what good does that do? My phone is at my ear; I don't stop a conversation when another app sends a notification while I'm on a call.

This will 100% report things in the background to Google.

[-] TheHobbyist@lemmy.zip 22 points 6 months ago

You're putting a very large amount of trust in something that may only require the flip of a switch to start sending the flagged information back to Google, along with all the heavy telemetry already flowing that way...

[-] Rai@lemmy.dbzer0.com 7 points 6 months ago

Mega hot take on this site: I have no trust in Google

[-] GenderNeutralBro@lemmy.sdf.org 9 points 6 months ago

There are a few ways this could work, but it hardly seems worth the effort if it's not phoning home.

They could have an on-device database of red flags and use on-device voice recognition against that database. But then what? Pop up a "scam likely" screen while you're already mid-call? Maybe include an option to report scams back to Google with a transcript? I guess that could be useful.
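Something like the sketch below is all the on-device matching would need to be. This is purely illustrative, assuming a hypothetical local transcriber and a made-up red-flag list; none of these names come from Google's announcement:

```kotlin
// Hypothetical sketch: everything runs on-device, nothing leaves the phone.
// The transcript is assumed to come from whatever local speech-to-text the phone provides.

data class ScamVerdict(val isSuspicious: Boolean, val matchedPhrases: List<String>)

// Local "database" of red flags; a real list would be far larger and model-driven.
val redFlags = listOf(
    "gift card",
    "wire transfer",
    "your account has been compromised",
    "do not tell your bank",
)

fun checkTranscript(transcript: String): ScamVerdict {
    val lower = transcript.lowercase()
    val hits = redFlags.filter { it in lower }
    return ScamVerdict(isSuspicious = hits.isNotEmpty(), matchedPhrases = hits)
}

fun onCallAudioChunk(transcriptSoFar: String, showWarning: (List<String>) -> Unit) {
    val verdict = checkTranscript(transcriptSoFar)
    if (verdict.isSuspicious) {
        // The only output is a local UI warning; no network call anywhere.
        showWarning(verdict.matchedPhrases)
    }
}
```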

Anything more than that would be a privacy nightmare. I don't want Google's AI deciding which of my conversations are private and which get sent back to Google. Any non-zero false positive rate would simply be unacceptable.

Maybe this is the first look at a new cat and mouse game: AI to detect AI-generated voices? AI-generated voice scams are already out there in the wild and will only become more common as time goes on.

[-] fruitycoder@sh.itjust.works 1 points 6 months ago

My bet is it will work like their federated text prediction in Gboard.
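For reference, Gboard's federated approach keeps the raw text on the device and only ships small model updates back for averaging. A rough sketch of that shape (purely illustrative, not Google's actual pipeline):

```kotlin
// Illustrative federated-learning shape: the raw data never leaves the device,
// only a small weight delta does, and the server just averages the deltas.

// On-device: a few gradient steps on a simple squared-error model, trained only
// on data that stays local. Returns just the weight delta.
fun localUpdate(
    globalWeights: DoubleArray,
    localSamples: List<Pair<DoubleArray, Double>>, // (features, label)
    learningRate: Double = 0.01,
): DoubleArray {
    val delta = DoubleArray(globalWeights.size)
    for ((x, y) in localSamples) {
        val prediction = globalWeights.indices.sumOf { globalWeights[it] * x[it] }
        val error = prediction - y
        for (i in delta.indices) delta[i] -= learningRate * error * x[i]
    }
    return delta
}

// Server-side: average the deltas from many phones; it never sees localSamples.
fun aggregate(globalWeights: DoubleArray, deltas: List<DoubleArray>): DoubleArray =
    DoubleArray(globalWeights.size) { i ->
        globalWeights[i] + deltas.sumOf { it[i] } / deltas.size
    }
```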

[-] smeg@feddit.uk 1 points 6 months ago

I assume it means the "AI" bit is running locally (for cost/efficiency reasons and so your actual voice isn't uploaded), and the results are then uploaded wherever (which is theoretically better but still hugely open to abuse).

[-] FMT99@lemmy.world 32 points 6 months ago

Blasting is all fine and good but they should've slammed them really.

[-] dukethorion@lemmy.world 22 points 6 months ago

Let's talk about wiretapping laws and states where two-party consent is required to record a call.

Where I live, I must notify the other party that I am recording. If not, it's illegal. Also, any audio recorded without consent is not admissible in court.

[-] tmyakal@lemm.ee 8 points 6 months ago

"Not admissible because it was illegally captured" didn't give me the warm-and-fuzzies this comment sounds like it should've.

[-] veniasilente@lemm.ee 4 points 6 months ago

Complete tangent but what is two-party consent even for? I can imagine it gets in the way of getting a lot of evidence in cases of domestic abuse or organized crime.

Caller: [threatens me for 45 seconds]
Me: "Could you call me again and repeat all that, for the recording?"
Caller: [hangs up]

[-] dukethorion@lemmy.world 2 points 6 months ago

Means both parties have to agree to be recorded (usually at the onset of a call).

"Be advised, this call is recorded for quality assurance purposes" at which point you could hang up. The 4th Amendment still applies in America, regardless of what local cops and prosecutors believe. You have a right to privacy in two-party conversations.

[-] veniasilente@lemm.ee 1 points 6 months ago

Oh, that makes more sense!

[-] phoenixz@lemmy.ca 1 points 6 months ago

> not admissible in court

I get what you're saying there, but that sounds very much like the kind of detail that a possible future Trump administration wouldn't much care about.

[-] dukethorion@lemmy.world 1 points 6 months ago

Does not matter who is president of the US.

[-] fubarx@lemmy.ml 15 points 6 months ago

One of the things they skated around is whether a lot of this on-device stuff needs a special processor with dedicated AI and security hardware to work.

The Pixel phones (especially newer ones) made by Google have them, but the vast majority of Android phones don't.

So either these features only work on the latest Google phones (which will piss off licensees and partners), or they're using plain old CPUs/GPUs to do this sort of detection, in which case it will be sniffable by malicious third parties.

And let's not forget that if the phone can listen to your conversation to detect malicious intent, any country can legally compel Google to provide them with the data by claiming it is part of a law-enforcement investigation.

Things are going to get spicy in Android-land.

[-] areyouevenreal@lemm.ee 1 points 6 months ago

It's not just Google that has AI stuff built into their phones. All the recent SoCs I have seen have had NPUs for the last couple of generations. A lot of older or cheap phones won't have one, but new devices will.

I don't see the problem with using the phone's normal GPU. This shouldn't be any less secure than making a call currently is. I'm pretty sure Android phones don't have a secure enclave just for making calls, since you can give different apps access to calling features, and most calls I make are through third-party apps anyway, not via POTS. That being said, Android is pretty secure anyway provided you don't give permissions to the wrong app. It's more secure than your average Linux system, as each app has its own user and is only allowed to access things it has explicit permission to access. Secure enclaves aren't all that in my opinion.
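For what it's worth, that per-app gate is just the standard Android runtime-permission dance; roughly like this (nothing here is specific to the scam-detection feature, it's the ordinary microphone permission check):

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

class CallActivity : AppCompatActivity() {
    private val micRequestCode = 42

    // An app only gets at the microphone if the user has explicitly granted it.
    private fun ensureMicPermission(onGranted: () -> Unit) {
        val granted = ContextCompat.checkSelfPermission(
            this, Manifest.permission.RECORD_AUDIO
        ) == PackageManager.PERMISSION_GRANTED

        if (granted) {
            onGranted()
        } else {
            // The system shows the prompt; the app cannot bypass it.
            ActivityCompat.requestPermissions(
                this, arrayOf(Manifest.permission.RECORD_AUDIO), micRequestCode
            )
        }
    }
}
```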

> And let's not forget that if the phone can listen to your conversation to detect malicious intent, any country can legally compel Google to provide them with the data by claiming it is part of a law-enforcement investigation.

The point of doing it locally is that the audio never gets sent to Google directly. That being said, they could definitely do some dodgy things by training the ML model to search for words like abortion, drugs, transgender, etc., depending on the laws of the country the phone is being used in.

[-] fruitycoder@sh.itjust.works 11 points 6 months ago

As an open-source app with no need for a centralized server, it would be great. I want that. As spyware configured outside of my control, absolutely not.

[-] GenosseFlosse@lemmy.nz 3 points 6 months ago

I think you do need some central server, to check whether a lot of users report certain number blocks as spam in a short amount of time. No need for AI on this one. Isn't that how most phone spam blockers work? (A toy sketch of that logic is below.)
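Roughly, the server just counts reports per number block and flags it once a threshold is crossed within a window. The class, thresholds, and window here are all made up for illustration:

```kotlin
import java.time.Duration
import java.time.Instant

// Toy server-side logic: a number prefix gets flagged as spam once enough
// reports arrive within a sliding window. Thresholds are invented.
class SpamReportTracker(
    private val window: Duration = Duration.ofHours(24),
    private val threshold: Int = 50,
) {
    private val reports = mutableMapOf<String, MutableList<Instant>>()

    fun report(numberPrefix: String, at: Instant = Instant.now()) {
        reports.getOrPut(numberPrefix) { mutableListOf() }.add(at)
    }

    fun isSpam(numberPrefix: String, now: Instant = Instant.now()): Boolean {
        val recent = reports[numberPrefix].orEmpty().count { it > now.minus(window) }
        return recent >= threshold
    }
}
```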

[-] fruitycoder@sh.itjust.works 2 points 6 months ago

I have that now; it's OK. But spammers cycle through numbers so quickly that a good amount get through.

[-] sherlockholmez@lemmy.ml 1 points 6 months ago

That is how it already works. I don't think most people have as much of a problem with that as complete client side screening.

[-] elrik@lemmy.world 10 points 6 months ago

It might be a good feature for the elderly as long as it's local and optionally enabled (especially if it can be enabled only for unknown callers).

Yes, I understand you would never really know if it's not always enabled. But then again, you currently don't know if anything similar isn't already enabled.

For other users, again potentially useful if it's opt in. However, many people (myself included) simply don't answer the phone anymore unless it's a caller we already know. I use Google's call screening feature for any other caller not in my contact list already, and I would estimate about 1 in 20 or 5% of such calls I receive aren't spam (marketing or fraud). Of those non-spam calls, the majority are appointment reminders I don't need.

So would I turn this feature on? No, I don't have a need. Could it be beneficial for the elderly? Yes, but probably not implemented in a way where it would actually be effective.

[-] theparadox@lemmy.world 3 points 6 months ago

Honestly, if it was transparent enough, I'd be fine with a service you could turn on that would listen to the first x seconds/minutes of a phone call to try and detect an AI generated voice. I have family that would 100% wire thousands of dollars if someone with their kid's voice asked them to.
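The flow itself could be that simple. The hard part is the detector model, which is entirely hypothetical here; this sketch just shows the "first few seconds, locally, warn and nothing else" shape:

```kotlin
// Hypothetical flow: analyze only the opening seconds of a call from an unknown
// number, locally, and surface a warning if the voice looks synthetic.
// The detector is a stand-in; no such API is shipped today.

interface SyntheticVoiceDetector {
    /** Returns a score in [0, 1]; higher means more likely AI-generated. */
    fun score(audioSamples: FloatArray): Float
}

fun screenCallOpening(
    firstSecondsOfAudio: FloatArray,
    detector: SyntheticVoiceDetector,
    warnUser: (Float) -> Unit,
) {
    val score = detector.score(firstSecondsOfAudio)
    if (score > 0.8f) {
        // Purely local warning, e.g. "This voice may be AI-generated."
        warnUser(score)
    }
}
```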

[-] beeng@discuss.tchncs.de 1 points 6 months ago* (last edited 6 months ago)

Think of the ~~children~~ elderly!

[-] elrik@lemmy.world 1 points 6 months ago

I mean, yeah, exactly. Keep in mind scammers are targeting vulnerable people. Granted I don't see how such a feature will work on my grandmother's flip phone.

[-] TWeaK@lemm.ee 1 points 6 months ago

I feel like this would require cooperation from the manufacturer, as Google doesn't actually provide the Phone app (except when they are the manufacturer).

[-] noodlejetski@lemm.ee 4 points 6 months ago* (last edited 6 months ago)

Except that it does. Some OEMs like Samsung ship their own (or at least Samsung used to; not sure if that's still true), but it's definitely available on non-Google-branded phones.

[-] TWeaK@lemm.ee 1 points 6 months ago

It's available on all phones, but they all have their own version, forked from long ago. Even the standard AOSP Phone app has long split from Google (who have ceased open source development of the app).

[-] noodlejetski@lemm.ee 1 points 6 months ago

I'm well aware that it's available to install on all phones, but I'm also fairly sure that other OEMs do use it as the default dialer, too. I saw it preinstalled on one of my mom's phones, either her current midrange Samsung, or her previous Xiaomi.

[-] Ajen@sh.itjust.works 1 points 6 months ago

They might be talking about Google Fi.

[-] smeg@feddit.uk 1 points 6 months ago

They seem to be offering a lot of this sort of thing as a value-add for buying a Pixel
