How is Facebook specifically hurting women?
In an open letter to Facebook, various women’s groups and feminist sites are urging people to contact advertisers whose ads appear next to content that targets women for violence, and to ask these companies to withdraw their advertising from Facebook until Facebook bans gender-based hate speech on its site.
We reported on the letter last week, but thought we would set out here exactly how Facebook is hurting women.
There are, says Women, Action & the Media (WAM), four primary ways.
For a start, Facebook’s own rules prohibit hate speech, so the objection here is to Facebook’s inconsistency in deciding what constitutes hate speech, and who is treated as a valid target of hate.
The company’s moderators deal with content that is violently homophobic, Islamophobic, and anti-Semitic every day.
So, for example, “I hate Muslims” or “I hate Jews” is not allowed. Which is, of course, excellent.
However, pages that express similar sentiments, but are targeted at girls and women – often through misogynistic and sexist language – are not considered hate speech.
Secondly, Facebook employs a grossly sexist double standard when it comes to representations of women’s bodies.
And I mean ‘gross’ly.
Photographs that are pornographic – also not allowed, according to Facebook’s terms – and display women’s bodies as sex objects, remain on the site, even when they have been reported.
And these include pictures of women fully exposed, women tied up, women in extreme pain, dead women and abused women.
But content posted by groups that represent women’s bodies for other purposes – whether for health and education, in art, or as part of a political protest – is rejected and removed.
For example, breast-feeding mothers, or pictures of placentas or medical illustrations of women’s sexual and reproductive organs are regularly banned.
As MotherWise found out recently, for example.
The point being that Facebook moderators remove content created by women for women – but routinely allow photographs of girls and women to be used without their consent for the purposes of harassment, bullying, slut-shaming and sexual ranking, review and commentary.
Third point: Facebook excludes women’s speech as ‘political’ while arguing that it is dedicated to enabling free speech in the service of social justice.
A common refrain in Facebook’s defence of its commitment to allow certain content, including content containing graphic violence – also in violation of its own terms – is that the company wants to reveal the real world and help catalyse social change.
But Facebook routinely penalises activist feminists on the site by removing their content, suspending their accounts, and disabling their links, as in the case of the Uprising of Women in the Arab World.
Or, as another example, in the way Hildur Lilliendahl, an Icelandic feminist, was treated.
She established a page to protest images like a picture of a woman in her underwear, tied up with ropes, gagged with an apple like a suckling pig on a spit, suspended from a long metal pole carried in procession by a gang of men.
The caption to the picture read: “Feminist found in town this morning – captured and put on the grill.”
When Lilliendahl reposted a threat made against her, her page was suspended, and her account was blocked at least four times.
The threats stayed up.
Facebook eventually apologised, but made no statement about how its policies or processes would change to ensure this does not happen again.
Other examples involving similar circumstances include: Rapebook, The Uprising of Arab Women, Women on Waves, The Girls Guide to Taking Over the World, Rabid Feminist, Thorlaug Agustsdottir, Mama to Mama, Feminists at Sea.
In many instances, after the fact, Facebook apologised and restored the account or content.
However, there does appear to be a bias in favour of censoring the political speech of women, while allowing hate speech against women.
And fourthly: the way in which Facebook’s moderation process is structured.
It treats each incident and report as isolated and unrelated, on a case-by-case basis, and in doing so fails to address the overall environment this failure creates – one of harm and hostility towards women users, in which we are disproportionately silenced.
Women’s groups – and this campaign – focus on ‘girls and women’ and not on ‘violence in general’ because women are universally marginalised and subjected to high levels of violence.
And Facebook’s approach, WAM points out, exacerbates this reality.
In order to enforce its own guidelines effectively and fairly, Facebook cannot ignore the effect of speech on the social status of women as a group.
Click here to find out how you can join us and help us persuade Facebook to address the representation of rape and violence on its site properly – with comprehensive and effective action.
Facebook is now claiming that because it took down all the images in WAM’s original sample set, there is no longer a problem.
This is not the case.
Until Facebook recognises that the problem lies in its policies and procedures, not just individual pages, WAM will keep posting fresh examples that are still live each day.
To see some examples of gender-based hate speech on Facebook click here. Trigger warning: they are vile and very disturbing.
Many of these have already been reported and allowed by Facebook moderation.
They may come down now when we shame Facebook with them, but that will not solve the problem.
Only new Facebook policies and procedures designed to ban gender-based hate speech will solve the problem.
To help, join in.