Instagram has implemented a program that can identify misleading posts and report them to Facebook's army of fact-checkers, according to Poynter. The company has been executing a strategy to identify and diminish misleading content by demoting its reach within the app's Explore section. When the algorithm identifies potentially misleading content, it flags the post. The flagged post is then sent to Facebook's fact-checking employees, who verify whether the post is real or fake. Fake posts won't be removed, but they may not show up in the Explore pages or in hashtags.
How Misleading Content Is Identified
On Instagram, misleading content may be harder to identify through a simple search algorithm because so much of the content is photo-based. This requires an algorithm that can look through photo content to spot potentially misleading information. That content may take the form of a photoshopped celebrity photo, or an altered image that suggests something is true when it isn't. Flagged content is then sent to Facebook's team of fact-checking partners, a network of 52 organizations in over 30 countries. These organizations specialize in making sure that misleading content isn't spread, whether it's political propaganda or a celebrity rumor.
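Instagram hasn't published how its detection model works, but the flag-and-route flow described above can be sketched in a few lines. Everything here is a hypothetical stand-in: the real system analyzes images with machine learning, whereas this sketch uses a simple caption heuristic purely to illustrate the pipeline of flagging posts and routing them to a human review queue.

```python
# Hypothetical stand-in for Instagram's unpublished classifier:
# a naive caption heuristic, used only to illustrate the pipeline.
SUSPECT_TERMS = ("miracle cure", "shocking truth", "100% guaranteed")

def is_suspect(caption):
    """Flag a post whose caption matches a suspicious phrase."""
    text = caption.lower()
    return any(term in text for term in SUSPECT_TERMS)

def review_queue(posts):
    """Route flagged posts to the (human) fact-checking queue."""
    return [p for p in posts if is_suspect(p["caption"])]

posts = [
    {"id": "1", "caption": "Sunset at the beach"},
    {"id": "2", "caption": "The SHOCKING TRUTH about this miracle cure"},
]
print([p["id"] for p in review_queue(posts)])  # ['2']
```

The key design point the article describes is the split of labor: cheap automated flagging at scale, with the expensive verification step reserved for the human fact-checking partners.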
Facebook announced an initiative to combat fake news back in 2016. The U.S. election brought the concept of fake news to the attention of the mainstream media, and social networks like Facebook and Instagram were central to how that news spread. A company as big as Facebook couldn't risk losing consumer trust over an issue like fake news, so it tackled the problem head-on by employing third-party partners around the world to combat local issues within their respective countries. In Myanmar, the social media platform has acknowledged its role in the Rohingya genocide that's still happening to this day. To combat the problem, the company is strategizing how to identify and combat propaganda, with the help of trusted government officials and experts on Burmese relations.
What Happens To It?
Instagram and Facebook may not be removing misleading content, but they're taking a page out of YouTube's book regarding what to do with it. Instead of flagging it as inappropriate and taking it down, as happens with posts containing things like nudity, the company said it will simply stop suggesting the posts, demoting their reach within the algorithm. Many users are already struggling to get their posts seen, so this may make it that much harder. However, Instagram is confident that the strategy will help diminish the spread of false information. It may also discourage users from posting it in the first place.
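The demote-don't-delete approach amounts to a filter on recommendation surfaces rather than on the platform itself. A minimal sketch, assuming a hypothetical data shape where a fact checker's verdict is stored alongside each post:

```python
# Hypothetical schema: "rating" holds a fact checker's verdict;
# "false" means the post was debunked. None means unreviewed.
def eligible_for_explore(post):
    """Demote rather than delete: debunked posts lose distribution only."""
    return post.get("rating") != "false"

def build_explore(posts):
    return [p["id"] for p in posts if eligible_for_explore(p)]

catalog = [
    {"id": "a", "rating": None},     # unreviewed, still eligible
    {"id": "b", "rating": "false"},  # debunked: stays up, but not surfaced
    {"id": "c", "rating": "true"},
]
print(build_explore(catalog))  # ['a', 'c']
```

Note that post "b" remains in the catalog, mirroring the article's point: the content isn't removed, it just stops being recommended in Explore or hashtag pages.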
The company says it has been using this strategy on content that makes its way onto Facebook from Instagram since the midterm election. Now that it has gotten a handle on that, the company wants to extend the strategy: Instagram will apply the same approach to all of its content as soon as next week.