Instagram updates its community guidelines, warns users before account deletion


Instagram has announced changes to the way it handles account deletion.

In a blog post, the social media giant outlined its updated rules and clarified how it handles account deletions for offences such as hate speech, bullying, nudity, and inappropriate content.

The new changes are designed to “quickly detect and remove accounts that repeatedly violate our policies.”

The biggest change in the company’s policy is that users will now have their accounts deleted if they commit a certain number of violations within a specific timeframe, though the company, unsurprisingly, declined to specify either figure.

This new change is in addition to the company’s existing violations policy.

Speaking of the changes, a spokesperson for Instagram said: “Under our existing policy, we disable accounts that have a certain percentage of violating content.

“We are now rolling out a new policy where, in addition to removing accounts with a certain percentage of violating content, we will also remove accounts with a certain number of violations within a window of time.

“Similarly to how policies are enforced on Facebook, this change will allow us to enforce our policies more consistently and hold people accountable for what they post on Instagram.”

Instagram will now notify users that their account is at risk of being disabled if they continue to violate its policies, and allow them to appeal those violations.

“We are also introducing a new notification process to help people understand if their account is at risk of being disabled,” the company added.

“This notification will also offer the opportunity to appeal content deleted.

“To start, appeals will be available for content deleted for violations of our nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism policies, but we’ll be expanding appeals in the coming months.”

Instagram has been under fire for a number of years after several users died by suicide following posts or bullying on the platform.

Politicians, celebrities, and campaigners have argued the social network has not been doing enough to protect young and vulnerable users.

This is Instagram’s latest step towards tightening its policies and making it harder for bullies and inappropriate content to slip through the net.

Earlier this month, the company began testing a new way to mute or hide comments from specific accounts, along with a tool that warns users when a comment they are about to post could be offensive, encouraging them to think twice.

Whether this is enough to deter hate speech and inappropriate comments, however, remains to be seen.

What are your thoughts on these changes? Let us know on Twitter using @AppleMagazine and remember to follow AppleMagazine on Instagram for our weekly cover story reveals.
