YouTube relaxes moderation rules to allow more controversial content
(www.techspot.com)
I feel like the issue people here take with this is more "great, even more Nazi content" than "I want censorship". YouTube already has a problem with demonetizing content it deems risky for its ad business, like curse words or mentions of violence, while allowing inflammatory content that drives engagement.
I’d imagine that the inflammatory content in question mostly gets demonetized just the same, so I don’t really see what the issue is. It’s not like a specific kind of content is being treated differently, or is it?
It does not, because demonetized content is also no longer pushed by the algorithm. Since the right-wing stuff still gets pushed to the front page and recommendations, it probably did not get demonetized.
I find this hard to believe, since it goes against my decades-long personal experience using YouTube. The moment I click on a "Ben Shapiro destroys" video, sure - I get plenty more in my feed. But they also go away when I stop engaging. In my experience, YouTube does a great job of recommending the kind of content I actually like to watch.
Uh huh. Which means they're being pushed to people they think will engage, which means they're being monetized or are at least considered monetizable if the creator isn't eligible. Like they said.
Well yeah, isn’t that the whole point of the recommendation algorithm? To suggest content people might find engaging. If a “Ben Shapiro destroys” video doesn’t break any rules, then what’s the issue with it being monetized? What I’m doubting here is the claim that this kind of content is somehow disproportionately pushed to people who have no interest in it.
The issue, my friend, is that such videos often do break the rules on hateful content and on misinformation (though those rules may have also been removed after 2020), but are still able to be monetized regardless.
Straw man. You may have had that argument with someone else, but no one on this comment chain ever made that claim but you.
Your personal experience validated their claim, so what are you talking about?
No, it doesn’t. If I watch a 15-second funny video from nine years ago, my feed gets flooded with other short clips like that - that’s just how the algorithm works. My personal experience doesn’t support the claim that right-wing media is being disproportionately pushed to people who aren’t interested in it. If I click on that kind of video, it means I’m interested in it - so of course I get recommended more.
Fair, I spoke too strongly. My bad!
I regularly get ads for airbrushed right-wing talking heads arguing with college kids. That's paid placement, so it's not only the algorithm, and it is annoying!
They didn't ever make that claim, that's just your straw man.
Yes. That’s the algorithm.