Instagram previously announced that it would ban photos and videos related to self-harm and suicide from its platform, and now the company is doing more to keep this type of content off the service.
The company has announced that it will now also remove drawings and cartoons that feature this type of content. Instagram explained:
"This past month, we further expanded our policies to prohibit more types of self-harm and suicide content. We will no longer allow fictional depictions of self-harm or suicide on Instagram, such as drawings or memes or content from films or comics that use graphic imagery. We will also remove other imagery that may not show self-harm or suicide, but does include associated materials or methods.
"Accounts sharing this type of content will also not be recommended in search or in our discovery surfaces, like Explore. And we'll send more people more resources with localized helplines like the Samaritans and PAPYRUS in the UK or the National Suicide Prevention Lifeline and The Trevor Project in the United States."
Expanding the Scope of Content Moderation
Instagram’s decision to broaden its content moderation policies comes in response to growing concerns about the impact of graphic and harmful content on mental health. The platform has recognized that even fictional depictions of self-harm and suicide can have a detrimental effect on users, particularly younger audiences who are more impressionable. By removing such content, Instagram aims to create a safer and more supportive online environment.
The inclusion of drawings, memes, and content from films or comics in the ban is a significant step. These forms of media can often be overlooked in content moderation efforts, yet they can be just as impactful as real-life images and videos. By addressing these areas, Instagram is taking a more comprehensive approach to safeguarding its users.
Support and Resources for Users
In addition to removing harmful content, Instagram is focusing on providing support and resources to users who may be struggling with mental health issues. The platform will direct more people to localized helplines and support organizations. For instance, users in the UK will be connected with resources like the Samaritans and PAPYRUS, while those in the United States will have access to the National Suicide Prevention Lifeline and The Trevor Project.
This proactive approach ensures that users who may be in distress are not only shielded from harmful content but also have access to the help they need. By integrating these resources into the platform, Instagram is making it easier for users to seek assistance without having to leave the app.
Community and Algorithmic Changes
Instagram’s efforts to combat self-harm and suicide content also extend to its community and algorithms. Accounts that share such content will be demoted in search results and will not appear in discovery surfaces like Explore. This reduces the visibility of harmful content and discourages users from sharing it in the first place.
Moreover, Instagram is likely employing advanced algorithms and machine learning techniques to identify and remove harmful content more effectively. These technologies can scan vast amounts of data quickly, ensuring that harmful content is flagged and removed promptly.
Instagram’s expanded policies on self-harm and suicide content represent a significant step forward in creating a safer online environment. By banning not only real-life images and videos but also fictional depictions, the platform is taking a more holistic approach to content moderation. Additionally, by providing users with access to localized support resources and demoting harmful accounts, Instagram is addressing the issue from multiple angles.
You can find more details about the changes Instagram is making to its platform over on its website.