Instagram Bans “Suicide-Like” Cartoons and Memes, But Is This Enough?

Published on November 4, 2019

Last week, Instagram announced that it has expanded its content ban to cover a broader range of material, including cartoons and other fictional depictions of self-harm and suicide methods, such as drawings and memes. The ban does not cover posts that discuss recovery or share steps of progress. Information on how the enforcement algorithm will work has not yet been disclosed.

In February, Instagram announced that it would begin prohibiting graphic images of self-harm, such as cutting, and restricting access to non-graphic self-harm content, such as images of healed scars, by not recommending it in searches.

Since February, the social media platform has, unfortunately, had to act on double the amount of self-harm content, Adam Mosseri, head of Instagram, explained in a recent statement. Mosseri, formerly Facebook's vice president of News Feed, gained extensive experience addressing social media abuse in that role.

“We will no longer allow fictional depictions of self-harm or suicide on Instagram, such as drawings or memes or content from films or comics that use graphic imagery,” said Mosseri in his blog post. “We will also remove other imagery that may not show self-harm or suicide, but does include associated materials or methods.”

Learning from the Missteps of Facebook’s “News Feed”

Instagram’s head has shared some of the lessons he learned from developing the News Feed feature, which unfortunately amplified misinformation, propaganda, and the current era of “fake news.”

“In the early days, we were so excited and optimistic about the value that comes from connecting people and, just quite frankly, under-focused and somewhat naïve about the negative consequences of connecting people at scale,” Mosseri shared in an interview with NPR back in September.

In his interview, Mosseri emphasized that it’s more than just investing time into one product or feature over another—rather, it’s about homing in on the “cultural shift, where you have to shift people’s mindsets.”

“When [people] have an idea that they’re excited about, you need to get them to a place where they naturally not only think about all the good that can come from it, but how that idea might be abused…and that’s the major lesson for me.”

The Cases Are Coming In

Instagram’s recent ban expansion comes months after Mosseri met with the U.K.’s health secretary to discuss the platform’s policy for addressing self-harm content.

The U.K. has been particularly harsh toward Instagram following a public outcry over the 2017 death of 14-year-old British schoolgirl Molly Russell, who killed herself after viewing suicide-related content on the platform.

Russell’s family went public with the tragedy and spoke with the BBC, bringing to light a serious loophole in February’s ban; they maintain that the platform played a role in their daughter’s decision to kill herself.

Another incident occurred in May of this year, when a 16-year-old girl in Malaysia killed herself after posting a poll on her Instagram account asking her followers whether or not she should die, with 69% of respondents voting that she should. Seriously, what the fuck.

The young girl’s death prompted an investigation into whether those who voted for her to die could be held responsible for aiding and/or abetting her suicide.

But Is This Enough?

In response to the investigation, Ching Yee Wong, Head of Communications at Instagram APAC, said that the company has a “deep responsibility to make sure people using Instagram feel safe and supported.”

But that’s just it: ask most users whether they feel safe and supported, and they will most likely answer in the negative.

“As part of our own efforts, we urge everyone to use our reporting tools and to contact emergency services if they see any behavior that puts people’s safety at risk.”

Again, users do report content, but doing so comes with little assistance on Instagram’s end. The process is automated, with very little human interaction. I mean shit, there isn’t even a person you can contact at Instagram about a report; eventually you just get a bullshit automated email saying your request has been looked into and no violation or issue could be found.

Indeed, back in February Mosseri admitted that the company “[does] not yet find enough of these images before they’re seen by other people,” relying heavily on its community to report the content so it can be removed. He effectively conceded that Instagram has shifted the burden of policing and monitoring content onto the users themselves.

As I stated already, this continues to be the case: users have very little bargaining power when it comes to reaching out to Instagram Support, as most responses to users are automated.

In the three months after Instagram enacted its ban, the platform doubled the amount of self-harm content it acted on, “remov[ing], reduc[ing] the visibility of, or add[ing] sensitivity screens to more than 834,000 pieces of content.”

While the new policy is currently in effect, it’s not clear how long full enforcement will take, Mosseri told BBC News. “It will take time to fully implement,” he stated, before adding that “it’s not going to be the last step” the company takes.

I think the solution to most of Instagram’s problems surrounding user content is to actually define, for the general public, “content guidelines” for what Instagram believes to be inappropriate and dangerous, in whatever capacity that may be. Of course this will open the floodgates to scrutiny and to individuals who don’t think it’s necessary, but I would beg to differ. Facebook and Instagram support should not be automated any longer; it is an easy escape for a company that doesn’t want to address the very reason it’s been successful: its global user base.

“But getting our approach right requires more than a single change to our policies or a one-time update to our technology,” Mosseri concluded in his blog post. “Our work here is never done. Our policies and technology have to evolve as new trends emerge and behaviors change.”

And I would agree.

Andrew "Drew" Rossow is a former contract editor at Grit Daily.
