YouTube relaxes moderation rules to allow more controversial content
(www.techspot.com)
I find this hard to believe since it goes against my decades-long personal experience using YouTube. The moment I click on a “Ben Shapiro destroys” video, sure - I get plenty more in my feed. But they also go away when I stop engaging. In my experience, YouTube does a great job of recommending the kind of content I actually like to watch.
Uh huh. Which means they're being pushed to people they think will engage, which means they're being monetized, or are at least considered monetizable if the creator isn't eligible. Like they said.
Well yeah, isn’t that the whole point of the recommendation algorithm? To suggest content people might find engaging. If a “Ben Shapiro destroys” video doesn’t break any rules, then what’s the issue with it being monetized? What I’m doubting here is the claim that this kind of content is somehow disproportionately pushed to people who have no interest in it.
The issue, my friend, is that such videos often do break the rules on hateful content and on misinformation (though those rules may have also been removed after 2020), but are still able to be monetized regardless.
Straw man. You may have had that argument with someone else, but no one on this comment chain ever made that claim but you.
Your personal experience validated their claim, so what are you talking about?
No, it doesn’t. If I watch a 15-second funny video from nine years ago, my feed gets flooded with other short clips like that - that’s just how the algorithm works. My personal experience doesn’t support the claim that right-wing media is being disproportionately pushed to people who aren’t interested in it. If I click on that kind of video, it means I’m interested in it - so of course I get recommended more.
Fair, I spoke too strongly. My bad!
I regularly get ads for airbrushed right-wing talking heads arguing with college kids. That’s paid placement, so it’s not just the algorithm - and it’s annoying!
They didn't ever make that claim, that's just your straw man.
Yes. That’s the algorithm.