

Facebook Says It’s Working To Limit Misinformation On Its Platform


Over the years, Facebook has grappled with the spread of misinformation involving politics, violence, hate speech and other controversial content. The social network has tried various means of reducing material that violates its policies; however, its artificial intelligence has fallen short, as some of that material still manages to slip past its standards.

For instance, the company acknowledged that its AI systems failed to catch the live video of the shooting that left about 50 people dead. It also said its AI was unable to remove edited copies of that video. The same problem arises with hate speech containing misspelled words and unfamiliar phrases, which the AI cannot identify.

The company announced on Wednesday more than a dozen updates describing how it is addressing the spread of misinformation and hate speech. The announcement came at a four-hour event at its Menlo Park headquarters, where about twenty reporters asked questions about Facebook's products. Facebook says it cannot simply take down information on its platform, lest its actions appear to be an attack on freedom of speech.

Instead of removing content, Facebook will rely on third-party fact-checkers and limit the distribution of flagged material by pushing it lower in the News Feed. Facebook's VP of integrity Guy Rosen said at the event:

"We don't remove information from Facebook just because it's false. We believe we have to strike a balance. When it comes to false information by real people, we aim to reduce distribution and provide context."

Another recent update is a badge for celebrities intended to end impersonation. In the past, some people have posed as public figures to trick unsuspecting users on the platform. Facebook says a verified badge will now appear in chats on its Messenger platform, helping users spot questionable profiles that lack the badge and report scammers who impersonate public figures.

The forward indicator is another tool built to tackle the spread of misinformation. The feature already exists in WhatsApp: when an individual forwards a message to another user, an indicator appears on the recipient's end, so they know the message was forwarded rather than written directly by the sender. The same feature is now coming to Messenger. Both WhatsApp and Messenger have had problems with viral hoaxes.

Forrester's Murphy thinks Facebook should concentrate more on bigger problems, such as live-streamed violence and harmful material that encourages suicide. Last month, a gunman live-streamed a shooting, and the video went viral before the social network acted. Last year, a teenager committed suicide after viewing a stream of content promoting self-harm on Instagram; her father blamed Facebook founder Mark Zuckerberg.
