submitted 1 week ago* (last edited 6 days ago) by Paddy66@lemmy.ml to c/privacy@lemmy.ml

This is an open question on how to get the masses to care...

Unfortunately, if other people don't protect their privacy, it affects those who do, because we're all connected (e.g. other family members, friends). So it presents a problem: how do you get people who don't care to care?

I started the Rebel Tech Alliance nonprofit to try to help with this, but we're still really struggling to convert people who have never thought about this.

(BTW you might need to refresh our website a few times to get it to load - no idea why... It does have an SSL cert!)

So I hope we can have a useful discussion here - privacy is a team sport, how do we get more people to play?

top 50 comments
[-] callmenoodles@lemmy.ml 1 points 2 days ago

I've noticed many people tend to look for alternatives when their mainstream apps are either temporarily down or become greedy.

I remember a few years ago Meta's servers were down, which resulted in my whole family and some friends at least partially moving over to Signal. Now it's important that the alternative has at least the basic features people want. Most people are not ubernerds like us, willing to sacrifice GIFs, emojis, or whatever, and would switch back once they realize it's missing features.

For instance, I've noticed people becoming increasingly frustrated with Windows but won't switch to Linux due to missing program or game support.

So ultimately I think the focus should be for privacy-respecting apps to be feature-complete. It's much easier to convince someone to switch if there's a reason to stay.

This probably means sacrificing on security features, but I don't think the goal should be for everyone to be on Qubes OS and SimpleX. Rather, the goal should be having at least basic online privacy and the ability to remove data on demand.

[-] MangoPenguin@lemmy.blahaj.zone 3 points 4 days ago

I think making it as easy and feature packed as the big commercial apps and services would go a long way.

Right now asking someone to switch to a more private service/app is not only the work of switching over, but also learning an often much more complex system.

[-] tomatolung@sopuli.xyz 6 points 6 days ago

A great cause, and one that gets to the heart of what I see as driving much of the governmental and societal disruption that's happening. It's a complex and nuanced issue that will likely take multiple prongs and a long time to resolve.

Let me start by again generally agreeing with the point. Privacy is necessary for reasons beyond the obvious needs. I'm preaching to the choir here in a privacy community. I think it's worth listing the reasons, as I understand them, why Americans are generally dismissive of the need for privacy protections. I cheated here and used an LLM to help, but I think these points are indicative of things to overcome.

  • Convenience > confidentiality. Nearly half of U.S. adults (47 %) say it’s acceptable for retailers to track every purchase in exchange for loyalty-card discounts, illustrating a widespread “deal first, data later” mindset. Pew Research Center

  • “Nothing to hide.” A popular refrain equates privacy with secrecy; if you’re law-abiding, the thinking goes, surveillance is harmless. The slogan is so common that rights groups still publish rebuttals to it. Amnesty International

  • Resignation and powerlessness. About 73 % feel they have little or no control over what companies do with their data, and 79 % say the same about government use—attitudes that breed fatalism rather than action. Pew Research Center

  • Policy-fatigue & click-through consent. Because privacy policies are dense and technical, 56 % of Americans routinely click “agree” without reading, while 69 % treat the notice as a hurdle to get past, not a safeguard. Pew Research Center

  • The privacy paradox. Behavioral studies keep finding a gap between high stated concern and lax real-world practice, driven by cognitive biases and social desirability effects. SAGE Journals

  • Market ideology & the “free-service” bargain. The U.S. tech economy normalizes “free” platforms funded by targeted ads; many users see data sharing as the implicit cost of innovation and participation. LinkedIn

  • Security framing. Post-9/11 narratives cast surveillance as a safety tool; even today 42 % still approve of bulk data collection for anti-terrorism, muting opposition to broader privacy safeguards. Pew Research Center

  • Harms feel abstract. People worry about privacy in the abstract, yet most haven’t suffered visible damage, so the risk seems remote compared with daily conveniences. IAPP

  • Patchwork laws. With no single federal statute, Americans face a confusing mix of state and sector rules, making privacy protections feel inconsistent and easy to ignore. Practice Guides

  • Generational normalization. Digital natives are more comfortable with surveillance; a 2023 survey found that 29 % of Gen Z would even accept in-home government cameras to curb crime. cato.org

Having listed elements to overcome, it's easy to see why this feels like a Sisyphean task in American society. (It's similar, though different, in other Global North societies. The US desperately needs change, as is evident with the current administration.) Getting to your question, though, I feel the real rational points to convey are not those above, but the ways a lack of privacy impacts individuals.

  • Political micro-targeting & democratic drift
    Platforms mine psychographic data to serve bespoke campaign messages that exploit confirmation bias, social-proof heuristics, and loss-aversion—leaving voters receptive to turnout-suppression or “vote-against-self-interest” nudges. A 2025 study found personality-tailored ads stayed significantly more persuasive than generic ones even when users were warned they were being targeted. Nature

  • Surveillance pricing & impulsive consumption
    Retailers and service-providers now run “surveillance pricing” engines that fine-tune what you see—and what it costs—based on location, device, credit profile, and browsing history. By pairing granular data with scarcity cues and anchoring, these systems push consumers toward higher-priced or unnecessary purchases while dulling price-comparison instincts. Federal Trade Commission

  • Dark-pattern commerce & hidden fees
    Interface tricks (pre-ticked boxes, countdown timers, labyrinthine unsubscribe flows) leverage present-bias and choice overload, trapping users in subscriptions or coaxing them to reveal more data than intended. Federal Trade Commission

  • Youth mental-health spiral
    Algorithmic feeds intensify social-comparison and negativity biases; among U.S. teen girls, 57 % felt “persistently sad or hopeless” and nearly 1 in 3 considered suicide in 2021—a decade-high that public-health experts link in part to round-the-clock, data-driven social media exposure. CDC

  • Chilling effects on knowledge, speech, and creativity
    After the Snowden leaks, measurable drops in searches and Wikipedia visits for sensitive topics illustrated how surveillance primes availability and fear biases, nudging citizens away from inquiry or dissent. Common Dreams

  • Algorithmic discrimination & structural inequity
    Predictive-policing models recycle historically biased crime data (representativeness bias), steering patrols back to the same neighborhoods; credit-scoring and lending algorithms charge Black and Latinx borrowers higher interest (statistical discrimination), entrenching wealth gaps. American Bar Association; Robert F. Kennedy Human Rights

  • Personal-safety threats from data brokerage
    Brokers sell address histories, phone numbers, and real-time location snapshots; abusers can buy dossiers on domestic-violence survivors within minutes, exploiting the “search costs” gap between seeker and subject. EPIC

  • Identity theft & downstream financial harm
    With 1.35 billion breach notices issued in 2024 alone, stolen data fuels phishing, tax-refund fraud, bogus credit-card openings, and years of credit-score damage—costs that disproportionately hit low-information or low-income households. ITRC

  • Public-health manipulation & misinformation loops
    Health conspiracies spread via engagement-optimized feeds that exploit negativity and emotional-salience biases; a 2023 analysis of Facebook found antivaccine content became more politically polarized and visible after the platform’s cleanup efforts, undercutting risk-perception and vaccination decisions. PMC

  • Erosion of autonomy through behavioral “nudging”
    Recommendation engines continuously A/B-test content against your micro-profile, capitalizing on novelty-seeking and variable-reward loops (think endless scroll or autoplay). Over time, the platform—rather than the user—decides how hours and attention are spent, narrowing genuine choice. Nature

  • National-security & geopolitical leverage
    Bulk personal and geolocation data flowing to data-hungry foreign adversaries opens doors to espionage, blackmail, and influence operations—risks so acute that the DOJ’s 2025 Data Security Program now restricts many cross-border “covered data transactions.” Department of Justice

  • Social trust & civic cohesion
    When 77 % of Americans say they lack faith in social-media CEOs to handle data responsibly, the result is widespread mistrust—not just of tech firms but of institutions and one another—fueling polarization and disengagement. Pew Research Center

[-] tomatolung@sopuli.xyz 3 points 6 days ago

And one last point here: these all stem from the way we as humans are built. Although we are capable of rational thought, we often do not make rational decisions. Those decisions are shaped by cognitive biases which we all have and which are affected by context, environment, input, etc. It's possible to overcome this lack of rational judgement through processes and synthesis such as the scientific method. So we as citizens and humans can build institutions that help us account for our individual biases and overcome these biological challenges, while also enjoying the benefits and remaining human.

[-] Jason2357@lemmy.ca 4 points 6 days ago

As a thought experiment: what would have happened if, instead of a public health regulation approach, we dealt with restaurant safety by providing a few safe places and advocating everyone go there if they don't want salmonella or E. coli poisoning? We'd have ignorant people going to the dangerous places, others misinformed or in denial, and a flood of misinformation claiming food poisoning is either "fine" or unavoidable anyway, so best not to worry.

[-] Paddy66@lemmy.ml 3 points 6 days ago

Interesting!

And then Fuckerberg would gaslight us by declaring that "public health is dead"

[-] Jason2357@lemmy.ca 1 points 5 days ago

Yes, all while he’d have a private chef and a staff that keep him safe.

[-] autonomoususer@lemmy.world 2 points 5 days ago* (last edited 5 days ago)

@Paddy66@lemmy.ml

Another wall of text no one will ever read does nothing. Do this: https://lemmy.world/post/21620691 https://lemmy.world/post/20950542

[-] Kobo@sh.itjust.works 1 points 6 days ago

Anyone want to join my privacy team? I'm trying out for the 2026 Olympics.

[-] pandorabox@lemmy.world 2 points 1 day ago

Same brooo🤣🤣

[-] DrunkAnRoot@sh.itjust.works 1 points 6 days ago

For the site, see if you can reissue the cert or try certbot. If you already used certbot, try manually downloading the cert and pointing to it.
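If it helps with debugging, here's a minimal sketch (assuming Python 3 and a placeholder hostname - swap in the real domain) of checking which certificate the server actually presents on port 443, so you can compare it with what the host claims:

```python
import socket
import ssl

# Placeholder hostname -- replace with the real domain you're debugging.
hostname = "example.org"

# The default context verifies the cert chain, so a broken or mismatched cert
# will raise ssl.SSLCertVerificationError here instead of printing details.
context = ssl.create_default_context()

with socket.create_connection((hostname, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()
        # issuer/subject come back as tuples of RDN tuples; flatten for readability.
        print("Issuer: ", dict(rdn[0] for rdn in cert["issuer"]))
        print("Subject:", dict(rdn[0] for rdn in cert["subject"]))
        print("Expires:", cert["notAfter"])
```

If that raises a verification error while plain http works, the problem is likely in the server's TLS setup (cert chain, SNI, or redirect config) rather than anything on the client side.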

[-] Paddy66@lemmy.ml 1 points 6 days ago

The site is hosted by a hosting company - and they assure me that the cert is fine.

If I was self hosting I'd expect these problems, but not with a hosting company.

The only difference with this company is that they do not use any big tech infrastructure - they have their own servers. I wonder if big tech has something they don't.....?

[-] DrunkAnRoot@sh.itjust.works 1 points 4 days ago* (last edited 4 days ago)

Idk, for me it doesn't show an error, just "cannot complete request", and it says https even though the connection's not secure. It's quite odd, and I can use http for it and it works.

[-] Paddy66@lemmy.ml 1 points 4 days ago

Really? It works with just http? That is weird.

It suggests to me that the web hosting company we are using don't know what they're doing. We're going to change.

[-] DrunkAnRoot@sh.itjust.works 1 points 3 days ago* (last edited 3 days ago)

There are a lot of hosts you can find on https://kycnot.me/ if you still need options.

[-] merde@sh.itjust.works 91 points 1 week ago

you should stop calling people "normies", if you want them to care about what you have to say

[-] kionite231@lemmy.ca 14 points 1 week ago

I call them normies not because I look down on them or hate them; I do it because whenever I try to educate them about privacy-oriented services, they mock me, saying "you are crazy", "you aren't the president", "nobody cares about your data", yada yada yada...

It makes me frustrated :(

[-] Courantdair@jlai.lu 33 points 1 week ago

Start by not calling people who don't know or care about privacy "normies", and educate them, I guess.

Also, I'd say start with the "easier" ones; for instance, anti-capitalist people are more open to finding ways to avoid surveillance capitalism. If enough of these people care and educate their respective circles, eventually all people will care.

[-] MoonlightFox@lemmy.world 26 points 1 week ago

I think certain arguments work, and certain ones don't.

I live in a very high trust society, Norway. This has a lot of advantages, but also some downsides.

We trust each other, our neighbours, our government and our media. Which is fantastic, and well deserved. The government deserves the trust.

This makes it hard for me to make people realize how important privacy is, because they trust organizations with their data.

During COVID, Norway made its own app for tracking who met whom, to prevent the spread. Of all the apps in the world, Norway wanted to push just about the least privacy-friendly app in the world. This from a country with the highest rankings for press freedom and democracy. Most people thought it was fine, because why not? We trust our government.

https://www.amnesty.org/en/latest/news/2020/06/norway-covid19-contact-tracing-app-privacy-win/

Luckily someone protested enough, and it got scrapped for something better.

When I try to convince someone I have a couple of angles:

  1. You trust the government and organizations with your data today. But do you trust the government in 30 years? Because data is forever. The US has changed a lot in a very short time, and this can happen here as well.

  2. You have a responsibility for other people's privacy as well. When you use an app that gets access to all your SMSes and contacts, you spy on people on behalf of companies - people who might need protection, such as asylum seekers from other countries.

[-] Maeve@kbin.earth 14 points 1 week ago

I have a feeling a whole bunch of people are about to start caring, when they see normal things being used as excuses to arrest friends, family, colleagues.

[-] Paddy66@lemmy.ml 2 points 6 days ago

I'm in the UK and there's a feeling amongst some that "we're next" if we don't curb the rise of the far right.

The Reform party's victories here this week are another alarm bell.

[-] Maeve@kbin.earth 2 points 6 days ago

I'd say those some are spot on. Governments love the "look what that country is doing!” while doing the same or worse, surreptitiously. Prestidigitation, really.

[-] swordfish@lemmy.world 12 points 1 week ago

Maybe start by not calling them "normies".

[-] themurphy@lemmy.ml 12 points 1 week ago

People want convenience. You'll never get people to do it unless it personally affects them. Realistically, you can convert a few.

But most importantly: it shouldn't be that hard to have privacy. THAT'S the problem. People shouldn't need to do a lot of things to get it.

Do something about the problem (politically and legally, change privacy laws) instead of trying to convert every single person.

But I know that can be near impossible depending on where you live.

[-] hansolo@lemm.ee 10 points 1 week ago

There are several overlapping problems:

First, the problem is complex. It's not just "Microsoft bad." There's a turducken lasagna of layered problems that makes it hard for the average person to wrap their head around the issue.

Next, there's no direct monetary incentive. You can't say "you lose $500 a year because data brokers know your address." Most people have also relied their whole lives on free email, so the average person is already in "debt" in terms of trade-offs.

You're also starting from a point of blaming the victim, in a way. It's the same problem companies have with cybersecurity: blaming everyone except the executive who didn't know the risks of skimping on cyber budgets. Hiding the problem to avoid public shame is the natural human response.

Finally, resolving the problem is fucking hard. I know, we all know, it's a constantly moving target that requires at the very least moderate technical skill. My partner wants more privacy online, but would rather have conveniences in many cases, and has zero patience for keeping up with changes, so I have to be the CISO for our household. So the average person, and the average household, does not have the skillset to care "effectively" even if they wanted to.

[-] FriendBesto@lemmy.ml 10 points 1 week ago

I have learned that the best game is simply not to play. You risk annoying the hell out of people. Let them get curious; maybe mention it, but they have to come to you. Pushing it onto people who do not care is simply not worth it. You are wasting your time; this is real life. Some people will simply not want to care. It is their choice, and sometimes that choice will not match yours.

The people I have so-called converted were people who were actually interested in knowing more. If you push it on people who are not interested, then you risk being that annoying person who comes off as an activist or ideologue.

[-] jagged_circle@feddit.nl 10 points 1 week ago

Steal their identity and doxx them. They'll play along after that experience
