YouTube’s policies are fairly clear cut, but occasionally someone uploads a video that stirs up a lot of controversy without actually violating them. What do you do in that situation? It’s not an easy problem to fix.
People may be offended by the video, yet YouTube can’t take it down. Well, YouTube has come up with a solution to this problem.
They will isolate such videos in a “limited state”. YouTube says, “We’ll soon be applying tougher treatment to videos that aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism. If we find that these videos don’t violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state”. They add, “The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetized, and won’t have key features including comments, suggested videos, and likes. We’ll begin to roll this new treatment out to videos on desktop versions of YouTube in the coming weeks, and will bring it to mobile experiences soon thereafter.”
So it seems YouTube has found a workable middle ground. These videos will not be discoverable by the general YouTube community, which means that, at least on the surface, YouTube will be shielding its users from such content. The aim is to keep the platform as “safe” as possible. An interesting solution.