YouTube’s Alleged Blocking of Black Creators’ Content is “Overt, Intentional & Systematic Racism,” Per New Suit


June 22, 2020 - By TFL


YouTube “knowingly, intentionally, and systematically” uses algorithms to “restrict access and drive [Black creators] off” its platform. That is what four “African American content creators, viewers, and consumers” assert in the lawsuit that they filed against YouTube and its parent companies Google and Alphabet (the “defendants”) in a federal court in northern California, accusing the tech giants of engaging in the “unlawful conduct of restricting access to the YouTube platform based on the profiling and discriminatory use of a person’s personal identity.” 

According to their June 16 complaint, Kimberly Carleste Newman, Lisa Cabrera, Catherine Jones, and Denotra Nicole Lewis (the “plaintiffs”) assert that YouTube and its owners – who are “members of the largest business enterprise, private or public, in the world” – have carried out a scheme of “overt, intentional, and systematic racial discrimination” to “rig the game [and] use their power to restrict and block the plaintiffs … based on racial identity or viewpoint discrimination for profit.” 

In short, the plaintiffs argue in their 239-page filing that YouTube and its parents – which maintain “complete, absolute, and ‘unfettered’ control over access to approximately 95 percent of all video content that is available to the public” – have “knowingly, intentionally, and systematically employed artificial intelligence, algorithms, computer and machine-based filtering and review tools to ‘target’ the plaintiffs and all other persons similarly situated” based on their race. 

To be specific, the plaintiffs assert that “under the pretext of finding that videos violate some vague, ambiguous, and non-specific video content rule,” YouTube and co. use their algorithms to “censor,” “restrict” or outright remove content from “African American, Black, [or other] members of a protected racial classification … wholly or in part, because [of their]” race or identity. 

“The pattern and practice” of YouTube allegedly “denying users equal access to [its platform] based on their racial, sexual, or other individual identities or viewpoints” has become so “pervasive that many prominent and quality content creators have lost more than 90 percent of their viewers, advertisers, revenue, and other access rights in the last 24 months solely because they are identified as African American, LGBTQ+ or other protected racial classifications under the law,” the plaintiffs go on to assert. 

For instance, one of the plaintiffs, Catherine Jones, argues that she is the creator and owner of a range of series, including ones that “discuss and present information about issues and current events which are important to the African American community,” which she publishes under the name “Carmen Caboom.” Jones claims that at some point prior to the launch of this suit, YouTube took down her original “Carmen Caboom” channel for “purported nudity” when not a single video posted to the channel included any nudity whatsoever.

Meanwhile, plaintiff Carleste Newman declares that despite the fact that her videos similarly “contain no nudity, sexualized scenes or language, graphic depictions of sex or violence, drug abuse, or alcohol consumption,” YouTube has “applied ‘Restricted Mode’ to most of [her] videos, and has allowed only very limited monetization for some videos, without any explanation or rationale for doing so.” Newman asserts that “the sole reason that the defendants have acted in this fashion is that [they] discriminate against the plaintiffs and other persons similarly situated based on race.” In other words, YouTube allegedly penalizes “videos created by an African American; videos related to issues and events of concern to the African American community, and videos viewed by large numbers of members of the African American community.” 

Aside from the bad optics of such alleged discrimination against the content of Black creators, the plaintiffs claim that there is a bigger problem here: YouTube’s acts amount to “racism, overt intentional and systematic.”

With the foregoing in mind, the plaintiffs set forth ten causes of action, asserting that the defendants have run afoul of California’s Unruh Civil Rights Act by denying them “full and equal accommodations or services” with “a motivating reason for [such] conduct” being the plaintiffs’ race. The plaintiffs also claim that YouTube and its parents are violating the First Amendment – which “prohibits a party from engaging in ‘state action’ that violates or harms a person’s right to engage in speech, association, expression, or other activity protected by the Amendment” – by “filtering, restricting, and blocking the plaintiffs’ speech and expressive conduct on YouTube … not based on the platform’s viewpoint neutral rules governing what content is and is not permissible, but on the race, identity or viewpoint of the plaintiffs.” 

Finally, they argue that while they abide by YouTube’s terms of service, “including complying with [its] viewpoint neutral content based access rules,” YouTube is in breach of its own terms by “interfering with the contractual and legal rights of the plaintiffs and all persons similarly situated to access and use YouTube based in any way, part, or degree on their race, identity or viewpoint.” As such, the plaintiffs set forth an array of contract-specific causes of action, including breach of contract and breach of the implied covenant of good faith and fair dealing. 

“Despite a whole lot of ‘telling,’” the plaintiffs claim that YouTube and its parent companies “have made no attempt to ‘show’ that their actions do not discriminate based on the race, identity or viewpoint of the plaintiffs or the hundreds of millions of other users who fall victim to [their] discrimination,” arguing that “if the defendants truly believe that they are engaged in good faith, viewpoint neutral content regulation on YouTube, then [they] should produce the computer code and permit an expert review of that code … [and] explain to the plaintiffs, the Court, and the public why their prior admissions and other evidence of ‘targeting’ African Americans and members of other protected racial classifications under the law, are not true.” 

In the meantime, the four women have asked the court to certify their proposed class action to enable other similarly situated individuals to join in the case. They are also seeking injunctive relief to bar “unlawful racial profiling and use of the user’s race, or other identity or viewpoint to filter, restrict, or block content, or otherwise deny the plaintiffs’ access or use of any services offered by Google/YouTube,” as well as monetary damages to be determined at trial. 

Beyond that, the plaintiffs have asserted that they want a formal declaration from the court stating that, among other things, the plain language of Sections 230(c)(1) and/or (2) of the Communications Decency Act “does not immunize the regulating, restricting or blocking of material based on the racial, or other identity or viewpoint of the user posting or viewing the video;” and that the “application of Section 230(c) in any way to permit and immunize race, sex, or other identity or viewpoint based profiling and regulation of content and access on YouTube is unconstitutional and violates the First Amendment.”  

In a statement made on the heels of the lawsuit’s filing, YouTube said that it enforces its terms of service and guidelines in a “neutral and consistent” manner and that its automated systems are “not designed to identify the race, ethnicity or sexual orientation of creators or viewers.”

Asked about the lawsuit during a Washington Post event last week, YouTube CEO Susan Wojcicki denied the claims, saying, “It is not like our systems understand race of any of those different demographics.” She added that YouTube works to ensure that “our machines have not by accident learned something that is not what we intended.” If YouTube does find that an algorithm is doing something it is not supposed to, the company retrains it to ensure “that whatever that issue was has been removed from the training set of our machines.” 

*The case is Kimberly Carleste Newman, Lisa Cabrera, Catherine Jones, and Denotra Nicole Lewis v. Google LLC, YouTube LLC, and Alphabet Inc., 5:20-cv-04011 (N.D. Cal.).
