Online content moderation
I saw a panel discussion on this at MozFest 2019, and also watched The Cleaners documentary.
It's pretty grim stuff - both the distressing content that is created and uploaded around the world, and the way in which the people contracted to moderate that content are treated.
1 Overview
- the job of moderating content on big online platforms
- it is usually outsourced to other companies
- low-paid employees do the actual moderation
2 Who is setting the policies?
Through their policies on what content is or isn't acceptable, the big tech firms are in some way determining what is acceptable in society.
- basically three/four private companies in the US
- e.g. terrorist groups as designated by US Homeland Security are used as guidelines for content moderation
- and they set these policies based only on profit motives - e.g. reacting to bad publicity
3 Conditions for workers
3.1 Conditions
- a high volume of items to review per day
- Chris Gray, who worked as a content moderator in Ireland, suggested around 600 items a day, I think
- 90% of it might be mundane; around 10% of it will be traumatic
- in The Cleaners documentary, I think they said in the Philippines it's a target of around 20-25,000 every day??
- monitoring
- workers are monitored to see if they are making 'correct' judgements
- they have to meet a quality target, otherwise their employment is in jeopardy
- post-traumatic stress
- stress of seeing disturbing things, stress of precarious labour, stress of having to determine what is good, what is bad
3.2 Pay
- at the MozFest panel, the directors of The Cleaners said pay in the Philippines is $1-$3 a day
- Chris Gray said it was around €12 a day in Ireland, I think?
- contrast both of these with the salary of a Facebook engineer…
- question: does the type of content moderated differ between locations?
3.3 Support
- content moderators are under NDAs
- they can't talk about it with anyone, including friends/family
- yet talking about it would help them process the trauma
4 Use AI instead?
Why not use ML/AI to moderate this content?
- AI can't handle the level of complexity involved in some of the decisions
- dilemma: it would put people out of work - but it is unpleasant work
- even for the unpleasant stuff, human input would be required to train any machine learning process anyway
5 Legal action
- a legal action is currently being taken against Facebook by Chris Gray and others
6 References
- The Cleaners (documentary film, 2018)
- MozFest 2019 panel (probably will be online at some point)
7 Misc
The central problem is that Facebook has been charged with resolving philosophical conundrums despite being temperamentally ill-qualified and structurally unmotivated to do so.
If nudity can be artistic, exploitative, smutty, and empowering, then the depiction of violence can be about hate and accountability. A video of a shooting can be an expression of deadly bigotry, but it can also expose police wrongdoing. Distinguishing between them requires human decision-making, and resolving a range of contested ideas. At present, private institutions bear significant responsibility for defining the boundaries of acceptability, and they are not very good at it.