Online bullying is a major problem, and Instagram is now launching new tools in an attempt to combat bullying on its platform.
The company will give users the power to shadow ban people from their account, which means that any comments left by that person are hidden from everyone except the commenter.
Instagram is also launching a tool that recognizes when someone may be about to post an abusive or bullying comment; the app will then ask the person whether they really want to publish it.
Instagram explained the feature as follows: "In the last few days, we started rolling out a new feature powered by AI that notifies people when their comment may be considered offensive before it's posted. This intervention gives people a chance to reflect and undo their comment and prevents the recipient from receiving the harmful comment notification. From early tests of this feature, we have found that it encourages some people to undo their comment and share something less hurtful once they have had a chance to reflect."
Understanding Shadow Banning
Shadow banning is a method that allows users to restrict the visibility of certain individuals’ comments without their knowledge. When a user shadow bans someone, the banned individual’s comments will only be visible to themselves and not to the wider audience. This can be particularly effective in reducing the impact of persistent trolls or bullies who thrive on public reactions. By implementing this feature, Instagram aims to empower users to take control of their own online experience without escalating conflicts.
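Instagram has not published any implementation details for this feature, but the general idea of restricted comment visibility can be illustrated with a short sketch. The Python example below is purely hypothetical: the Post, Comment, and visible_comments names are made up for illustration, and the logic simply hides a restricted user's comments from everyone except that user, as described above.

```python
# Hypothetical sketch of restricted-comment visibility; not Instagram's actual code.
from dataclasses import dataclass, field

@dataclass
class Comment:
    author: str
    text: str

@dataclass
class Post:
    owner: str
    restricted_users: set = field(default_factory=set)  # users the owner has "shadow banned"
    comments: list = field(default_factory=list)

    def add_comment(self, author: str, text: str) -> None:
        self.comments.append(Comment(author, text))

    def visible_comments(self, viewer: str) -> list:
        """A restricted user's comments remain visible to that user,
        but are hidden from the wider audience."""
        return [
            c for c in self.comments
            if c.author not in self.restricted_users or viewer == c.author
        ]

post = Post(owner="alice")
post.restricted_users.add("troll42")
post.add_comment("bob", "Nice photo!")
post.add_comment("troll42", "Something nasty")

print([c.text for c in post.visible_comments("bob")])      # ['Nice photo!']
print([c.text for c in post.visible_comments("troll42")])   # both comments, so the ban is not obvious
```

Because the restricted commenter still sees their own comments as normal, they get no signal that anyone else has stopped seeing them, which is what removes the public reaction that trolls rely on.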
AI-Powered Comment Moderation
The AI-powered comment moderation tool is another significant step in Instagram’s fight against online bullying. This tool uses advanced algorithms to detect potentially harmful or offensive language in comments before they are posted. When the AI identifies a comment that may be considered bullying, it prompts the user with a notification asking if they are sure they want to post it. This moment of reflection can often lead to the user reconsidering their words and choosing to either edit or delete the comment.
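Again, Instagram has not disclosed its model or thresholds, so the sketch below is only a minimal illustration of the "think before you post" flow, assuming a stand-in offensiveness score in place of a trained classifier; the function names and threshold are invented for this example.

```python
# Minimal sketch of a "are you sure you want to post this?" check.
# A simple word list stands in for the real AI classifier, which is not public.

OFFENSIVE_WORDS = {"stupid", "ugly", "loser"}  # placeholder vocabulary

def offensiveness_score(text: str) -> float:
    """Toy score: fraction of words that appear in the offensive-word list."""
    words = text.lower().split()
    if not words:
        return 0.0
    return sum(w.strip(".,!?") in OFFENSIVE_WORDS for w in words) / len(words)

def submit_comment(text: str, confirm: bool = False, threshold: float = 0.2) -> str:
    """Post the comment, or prompt the user to reconsider if it looks offensive."""
    if offensiveness_score(text) >= threshold and not confirm:
        return "Are you sure you want to post this?"  # chance to edit or undo
    return "Comment posted."

print(submit_comment("great shot!"))                 # Comment posted.
print(submit_comment("you are such a loser"))        # Are you sure you want to post this?
print(submit_comment("you are such a loser", True))  # Comment posted.
```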
This feature is not just about preventing harmful comments from being posted; it also serves an educational purpose. By prompting users to think twice about their words, Instagram hopes to foster a more respectful and considerate online community. Early tests of this feature have shown promising results, with many users opting to change their comments to something less hurtful after receiving the notification.
Additional Measures and Future Plans
In addition to shadow banning and AI-powered comment moderation, Instagram is exploring other measures to combat online bullying. These include enhanced reporting tools that make it easier for users to report bullying and harassment, as well as increased support for victims of online abuse. Instagram is also working on educational initiatives to raise awareness about the impact of online bullying and to promote positive online behavior.
Moreover, Instagram is collaborating with mental health organizations and experts to ensure that their anti-bullying tools are effective and supportive. By involving professionals in the development and implementation of these features, Instagram aims to create a safer and more inclusive platform for all users.
You can find more details about the new tools Instagram will be testing on its website at the link below.
Source: Instagram