Instagram Bans Self-Harm Images After Teen’s Suicide

Facebook, which owns Instagram, will use its image recognition technology to help the social media app implement its new self-harm policy.

A father says content his daughter viewed on Instagram contributed to her suicide. (Image: iStock.com/TARIK KIZILKAYA)

Instagram has agreed to ban graphic images related to self-harm after a father claimed content on the social media app contributed to his daughter’s suicide.

Instagram chief Adam Mosseri made the announcement Thursday, stating the company will also prevent non-graphic, self-harm-related content from surfacing in its search feature and through hashtags, reports CBS News.

“We need to do more to consider the effect of these images on other people who might see them. This is a difficult but important balance to get right,” he said. “We will get better and we are committed to finding and removing this content at scale, and working with experts and the wider industry to find ways to support people when they’re most in need.”

Mosseri added that the company will not completely ban non-graphic self-harm content because “we don’t want to stigmatize or isolate people who may be in distress and posting self-harm related content as a cry for help.”

Facebook, which owns Instagram, said in a statement that independent experts advised it should “allow people to share admissions of self-harm and suicidal thoughts but should not allow people to share content promoting it.”

Facebook will also use its investment in image recognition technology to help Instagram implement its new policy, reports The Guardian.

“The more people report such images to the platform, the better the algorithm becomes in recognizing such images and becomes quicker in removing them,” said cybersecurity expert Jake Moore. “It is, therefore, a joint effort from both Instagram and its users to remove self-harm images, which will take time.”

The British government backed the call for change after the family of 14-year-old Molly Russell, who took her own life in 2017, found material related to depression and suicide on her Instagram account.

Her father, Ian Russell, said the content his daughter viewed on Instagram played a role in her suicide.

The changes were announced after the company and other tech firms, including Facebook, Snapchat and Twitter, met with British Health Secretary Matt Hancock and representatives from the Samaritans, a mental health charity that works to prevent suicide.

About the Author

Amy Rock is Campus Safety’s senior editor. She graduated from UMass Amherst with a bachelor’s degree in communications and a minor in education.

She has worked in the publishing industry since 2011, in both events and digital marketing.
