Executive Summary

YouTube’s Violation of Palestinian Digital Rights: What Needs to be Done

by Amal Nazzal on December 27, 2020

Palestinians have been reporting increasing digital rights violations by social media platforms. In 2019, Sada Social, a Palestinian digital rights organization, documented as many as 1,000 violations, including the removal of public pages, accounts, posts, and publications, as well as the restriction of access. This policy brief examines YouTube’s problematic community guidelines, its content moderation practices, and its violation of Palestinian digital rights through hyper-surveillance.

It draws on research conducted by 7amleh - The Arab Center for the Advancement of Social Media, and on interviews with Palestinian journalists, human rights defenders, and international human rights organizations, to explore YouTube’s controversial censorship policies. Specifically, it examines the vague and problematic definitions of certain terms in YouTube’s community guidelines, which are employed to remove Palestinian content, as well as the platform’s discriminatory practices, such as language and locative discrimination. It offers recommendations for remedying this situation.

In the Middle East, YouTube is considered one of the most important platforms for digital content distribution. Indeed, the YouTube user rate in the region increased by 160% between 2017 and 2019, with over one million subscribers. Yet little is known about how YouTube implements its community guidelines, including the Artificial Intelligence (AI) technology it uses to target certain content.

YouTube has four main guidelines and policies for content monitoring: spam and deceptive practices, sensitive topics, violent or dangerous content, and regulated goods. However, many users have reported that their content has been removed without falling under any of these categories. This suggests that YouTube is not held accountable for the clarity and equity of its guidelines, and that it can invoke them interchangeably to justify content removal.

The international human rights organization Article 19 confirms that YouTube’s community guidelines fall below international legal standards on freedom of expression. In its 2018 statement, Article 19 urged YouTube to be transparent about how it applies its guidelines by providing examples and thorough explanations of what it considers to be “violent,” “offensive,” and “abusive” content, including “hate speech” and “malicious” attacks.

A member of another human rights organization, WITNESS, explained how YouTube’s AI technology erroneously flags and removes content that would be essential to human rights investigations because it classifies that content as “violent.” As a case in point, the Syrian journalist and photographer Hadi Al-Khatib collected 1.5 million videos over the years of the Syrian uprising, documenting hundreds of chemical attacks by the Syrian regime. However, Al-Khatib reported that in 2018 over 200,000 of these videos were taken down from YouTube, videos that could have been used to prosecute war criminals.

This discrimination is particularly evident in the case of Palestinian users’ content. What is more, research clearly indicates that YouTube’s AI technology is designed with a bias in favor of Israeli content, regardless of whether it promotes violence. For example, YouTube has allowed Orin Julie, an Israeli gun model, to upload content promoting firearms despite its clear violation of YouTube’s “Firearms Content Policy.”

Palestinian human rights defenders have described YouTube’s discrimination against their content under the pretext that it is “violent.” According to the Palestinian journalist Bilal Tamimi, YouTube violated his rights by removing a video showing Israeli soldiers abusing a twelve-year-old boy in the village of Nabi Saleh on the grounds that it was “violent.” In the end, Tamimi embedded the deleted video into a longer video that passed YouTube’s AI screening, a tactic used to circumvent the platform’s content removals.

More specifically, Palestinian human rights defenders have reported experiencing language and locative discrimination against their content on YouTube. That is, YouTube trains its AI algorithms to target Arabic-language videos disproportionately compared to videos in other languages. In addition, YouTube’s AI surveillance systems are designed to flag content emerging from the West Bank and Gaza. And the more views Palestinian content receives, the more likely it is to be surveilled, blocked, demonetized, and removed.

To counter these discriminatory practices and protect Palestinian activists, journalists, and human rights defenders on YouTube, the following recommendations should be implemented:

  • YouTube’s community guidelines must respect human rights law and standards, and they must be translated into multiple languages, including Arabic.
  • A third-party civil society monitoring group should ensure that AI is not hyper-surveilling Palestinian content and discriminating against it. It should also support users in appealing the removal of their content.
  • YouTube should publish transparency reports on its processes for restricting user content. It should also explain how users can appeal such restrictions.
  • The Palestinian Authority should support Palestinian users’ legal cases against YouTube and other social media platforms.
  • Palestinian civil society organizations should raise awareness about digital rights.
  • Activists, journalists, and human rights defenders should share strategies for evading language, locative, and other forms of discrimination. They should also work to develop technologies to counter YouTube’s biased AI.

Al-Shabaka: The Palestinian Policy Network is an independent, non-partisan, and non-profit organization whose mission is to educate and foster public debate on Palestinian human rights and self-determination within the framework of international law. Al-Shabaka materials may be reproduced and circulated with due attribution to Al-Shabaka: The Palestinian Policy Network. The opinions of individual members of Al-Shabaka’s policy network do not necessarily reflect the views of the organization as a whole.
