Instagram is taking extra steps to make “potentially harmful” content less visible in the app. The company says that the algorithm behind the way posts are ordered in users’ feeds and in Stories will now de-prioritize content that “may contain bullying, hate speech or may incite violence.”
Instagram’s rules already prohibit this content, but the change could affect borderline posts, or content that hasn’t yet reached moderators. The company says, “To understand if something may break our rules, we’ll look at things like if a caption is similar to a caption that previously broke our rules.”
Until now, Instagram has tried to hide potentially objectionable content from more public-facing parts of the app, but it hasn’t changed how that content appears to users who follow the accounts posting it. Going forward, posts that are “similar” to ones previously removed will be less visible even to an account’s followers. A spokesperson for Meta said that “potentially harmful” posts could still eventually be removed if they break any community guidelines.
Instagram made a similar change in 2020, when it began down-ranking accounts that shared misinformation debunked by fact-checkers. In this case, however, Instagram says the latest policy will only affect individual posts and “not accounts overall.” It will be interesting to see how well this works.