Palestine Action ban coupled with Online Safety Act ‘a threat to public debate’

The Online Safety Act together with the proscription of Palestine Action could result in platforms censoring Palestine-related content, human rights organisations have warned.

Open Rights Group, Index on Censorship and others have written to Ofcom calling on it to provide clear guidance to platforms on distinguishing lawful expression from content deemed to be in support of terrorism.

They say failure to act by the regulator risks misidentification – including through algorithms – of support for Palestine as support for Palestine Action, which on 5 July became the first direct action protest group to be banned under UK anti-terrorism laws.

It also runs the risk of misidentifying objections to Palestine Action’s proscription as unlawful support for the group, the signatories claim.

Sara Chitseko, a pre-crime programme manager at Open Rights Group, said: “Crucial public debate about Gaza is being threatened by vague, overly broad laws that could lead to content about Palestine being removed or hidden online. There’s also a real danger that people will start self-censoring, worried they might be breaking the law just by sharing or liking posts related to Palestine and non-violent direct action.

“This is a serious attack on freedom of expression and the right to protest in the UK. We need to ensure that people can share content about Palestine online without being afraid that they will be characterised as supportive of terrorism.”

The organisations’ concerns are exacerbated by Ofcom’s advice that platforms can avoid worrying about their duties under the Online Safety Act (OSA) if they ensure they are more censorious than the act requires. “This approach risks encouraging automated moderation that disproportionately affects political speech, particularly from marginalised communities, including Palestinian voices,” the letter says.

Unlike in the EU, there is no independent mechanism for people in the UK to challenge content they feel has been wrongly taken down. The signatories want platforms – the letter has also been sent to Meta, Alphabet, X and ByteDance – to commit to an independent dispute mechanism if evidence emerges of lawful speech being suppressed.

The letter, also signed by the Electronic Frontier Foundation in the US and organisations from eight European countries, as well as experts and academics, says: “We are concerned that the proscription of Palestine Action may result in an escalation of platforms removing content, using algorithms to hide Palestine solidarity posts and leave individuals and those reporting on events vulnerable to surveillance or even criminalisation for simply sharing or liking content that references non-violent direct action.

“We are also concerned about what platforms understand by their legal duties regarding expressions of ‘support’ for Palestine Action.”

The letter comes a week after the OSA’s age-gating for “adult” material came into effect, prompting fears about access to Palestine-related content. For example, Reddit users in the UK have to verify their age to access the Reddit sub r/israelexposed.

Ella Jakubowska, the head of policy at EDRi in Brussels, said there would inevitably be suppression of “critical voices, journalism and social movements around the world. The problem is worsened by automated content moderation systems, well known for over-removing content from Palestinian creators, in support of Black Lives Matter, about LGBTQI+ issues and more.

“It is very likely that in trying to comply with these requirements, platforms would unjustly remove content from people in the EU and other regions.”

She said that would contravene laws such as the EU Digital Services Act, designed to strike a balance between keeping people safe online and freedom of expression.

An Ofcom spokesperson said: “We have provided detailed guidance to platforms about how to identify the particular types of illegal and harmful material prohibited or restricted by the act, including how to determine whether content may have been posted by a proscribed organisation.

“There is no requirement on companies to restrict legal content for adult users. In fact, they must carefully consider how they protect users’ rights to freedom of expression while keeping people safe.”

Meta, Alphabet, X and ByteDance were all approached for comment.
