To: Richard Lappin, Head of EMEA Organic Content Policy @ Meta

Clean Up Content Moderation

Social media corporations are still keeping us in the dark about how they select the content we see online, how they operate, and how they treat their workers.

Why is this important?

People working as content moderators keep our public spaces safe online, but they are often treated terribly by their employers. Content moderators spend hours each day looking at the worst videos and messages on the internet: torture, child abuse, animal cruelty, and murder. They look at this content so the rest of us don't have to.

When online platforms rely on machines to do the work of real people, those systems make mistakes and infringe on our right to free expression online. That's why we're calling for better treatment of these workers, so that all of us benefit from safe online spaces.

We’re calling on Facebook to:
- Bring content moderation in-house and treat content moderators like regular Facebook staff
- Pay content moderators fairly, so the platform is less reliant on automated content moderation
- Provide them with a decent level of care, including access to psychological support
- Publish its content moderation policies transparently, including the results of any audits assessing whether its moderation practices and policies are in line with human rights standards
- Stop silencing content moderation whistleblowers who come forward to detail their experiences and the abuses they have witnessed

Notes:
Meta platforms like Facebook and Instagram continually fail to detect misinformation in election ads: https://www.globalwitness.org/en/campaigns/digital-threats/facebook-fails-tackle-election-disinformation-ads-ahead-tense-brazilian-election/
There is a clear double standard in how content moderation policies are applied, e.g. under-moderation of Hebrew-language content and over-moderation of Arabic-language content: https://www.thenationalnews.com/world/uk-news/2022/05/18/facebook-examines-moderation-policies-after-pressure-over-palestine-content/
Content moderators are made to work in terrible conditions, often with little or no meaningful psychological support for the trauma they witness on the job: https://www.wired.co.uk/article/facebook-content-moderators-ireland
Content moderators who speak up about these conditions may be silenced: https://www.breakingnews.ie/ireland/former-facebook-content-moderator-expresses-concern-for-safety-1268117.html

Updates

2022-09-11 07:18:38 +0100

1,000 signatures reached

2022-08-18 07:54:43 +0100

500 signatures reached

2022-08-17 17:10:12 +0100

100 signatures reached

2022-08-17 16:54:10 +0100

50 signatures reached

2022-08-17 16:47:18 +0100

25 signatures reached

2022-08-17 16:41:39 +0100

10 signatures reached