Facebook engineers have proposed measures to suppress content that attempts to convince people to shift their viewpoints away from the left, according to photographs of leaked internal documents and messages obtained by Project Veritas. The photos show that several prominent right-leaning Facebook pages were marked with code that appears to have handicapped them.
Facebook has long faced accusations of left-leaning bias, underscored by the overwhelmingly leftist leanings of its staff. While the company has maintained that personal biases do not seep into its content policing, the documents show the engineers repeatedly conflating right-leaning arguments, some perhaps crudely presented, with “hate speech” or “trolling.”
And it is not just people who make political or cultural commentary a full-time job or hobby who are affected. Ordinary users posting their thoughts on everything from abortion to marriage to homeschooling to the building of the “wall” on the southern border may find that Facebook limits how many of their friends see a post, in an effort to keep them from convincing others of their view.
The problem is that Facebook’s definition of “hate speech” has been broadened to the point where Bible verses supporting traditional marriage, or addressing sin, can easily be flagged as “hate.” A page that posts them is then red-flagged and punished by the social media giant: fewer people are allowed to see those posts or comments.
One of the engineers even mused that Facebook has to suppress not only “hate speech,” but also “content they consider on the perimeter of hate speech,” even though the company has acknowledged it still struggles to define what hate speech is.
In practice, that has meant suppressing content that is rude or mean, or simply upsetting to the social media platform’s liberal users.
‘Red-pilling’ or Trolling?
The photos (pdf) were obtained by a Facebook contractor who was fired about a year ago, according to Project Veritas, and include a September 2017 presentation by the company’s Data Science Manager Seiji Yamamoto and Chief Data Scientist Eduardo Arino de la Rubia.
The presentation documents what the authors consider “coordinated trolling” and how to counter it.
Trolling is a broad term that describes intentionally eliciting a negative response from someone, most commonly online.
The authors identified trolls by a variety of behaviors commonly deemed unacceptable online, such as harassment, doxxing (publishing somebody’s personal information), and falsely reporting content violations.
Yet one of the “destructive behaviors” described by the authors was also “red-pilling normies to convert them to their worldview.”
“Red-pilling,” a colloquialism commonly associated with the political right, originated as a reference to the 1999 movie “The Matrix” and describes a confrontation with shocking, hard-to-accept facts that force one to reevaluate one’s beliefs.
“Normie” is a term used on some online discussion platforms like 4chan to describe “an individual who is deemed to be boringly conventional or mainstream by those who identify themselves as nonconformists,” according to KnowYourMeme.com.
As an example of such “red-pilling,” the authors posted a link to the YouTube video “Why Social Justice is CANCER | Identity Politics, Equality & Marxism” by Lauren Chen, also known as “Roaming Millennial.”
In a video response to the document’s release, Chen said she was confused by the authors’ choice to single out her video, which she described as “super, super tame.”
“Essentially, I’m arguing that social justice is toxic because it promotes tribalism over individuality and because it chips away at the concept of equality of opportunity for individuals,” she said.
Chen acknowledged she picked a provocative title, yet she also started the video by explaining she doesn’t literally equate “social justice” with cancer. “It is frustrating that I would even need to explain things like hyperbole and metaphor,” she said.
It doesn’t appear Chen’s viewers have felt “trolled” either—the video had some 11,000 likes versus fewer than 500 dislikes as of Feb. 28.
She further took issue with the authors’ apparent belief that converting people to one’s worldview is objectionable.
“That’s essentially engaging in debate and discussing ideas,” she said. “How is that a troll behavior? Under what logic can you group someone who’s trying to convert someone to their worldview, which anyone does anytime they’re discussing a political issue they care about, how do you group that as being a troll? That is so far removed from what actual trolling is. I mean, are the presidential debates ‘troll behavior’?”
She then went a step further, questioning whether the authors considered her video “destructive” because it was, in fact, persuasive.
Punishments
Yamamoto and de la Rubia suggested establishing a “toxic meme cache” and blocking and suppressing images that match the “cache.”
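The presentation does not spell out how such a cache would work. As a rough illustration of the idea, a minimal matching scheme might look like the following sketch, in which every name is hypothetical and exact hashing stands in for whatever matching Facebook would actually use:

```python
import hashlib

# Hypothetical sketch only: the leaked slides name a "toxic meme cache"
# but do not describe its implementation. Exact SHA-256 matching is used
# here for simplicity; a real system would more likely use perceptual
# hashing so resized or re-compressed copies of an image still match.

toxic_meme_cache: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image file to a fixed-length identifier."""
    return hashlib.sha256(image_bytes).hexdigest()

def add_to_cache(image_bytes: bytes) -> None:
    toxic_meme_cache.add(fingerprint(image_bytes))

def should_suppress(image_bytes: bytes) -> bool:
    """Check an uploaded image against the cache before distribution."""
    return fingerprint(image_bytes) in toxic_meme_cache
```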
They also recommended developing a program that could “predict” whether a user is a troll by, among other things, scouring the user’s language for words like “cuck, zucced, REEE, normie, IRL, lulz, Shadilay, etc.”—some of which are common slang terms used by some online communities.
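Again, the documents name the signal but not the model behind it. A bare-bones version of such keyword scanning might resemble the sketch below; the slang terms come from the leaked slides, while the scoring logic and cutoff are invented for illustration:

```python
import re

# Hypothetical sketch of keyword-based troll "prediction." The slang terms
# appear in the leaked slides; the scoring and threshold are invented.
TROLL_SLANG = {"cuck", "zucced", "reee", "normie", "irl", "lulz", "shadilay"}

def tokenize(text: str) -> set[str]:
    """Lowercase a post and split it into alphabetic words."""
    return set(re.findall(r"[a-z]+", text.lower()))

def troll_score(posts: list[str]) -> float:
    """Fraction of a user's posts containing at least one flagged term."""
    if not posts:
        return 0.0
    hits = sum(1 for post in posts if TROLL_SLANG & tokenize(post))
    return hits / len(posts)

# A user might be flagged once the score passes some arbitrary cutoff:
if troll_score(["REEE, the normies found us", "Nice weather today"]) > 0.3:
    print("account flagged as a likely troll")
```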
They proposed targeting troll accounts with “drastically limited bandwidth for a few hours,” which would slow down Facebook’s functioning for the user, as well as logging the user out or redirecting the user to their Facebook home page every few minutes.
They also proposed to make it so that “comments and posts that [the users] spend time crafting will magically fail to upload.”
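None of the underlying code appears in the leaked material, so any implementation is guesswork. A crude server-side sketch of the three proposed interventions, with invented names, delays, and probabilities, might look like this:

```python
import random
import time
from dataclasses import dataclass

@dataclass
class User:
    name: str
    flagged_as_troll: bool = False

def force_logout(user: User) -> dict:
    return {"status": "logged_out", "user": user.name}

def handle_post(user: User, text: str) -> dict:
    """Hypothetical request handler applying the proposed punishments."""
    if user.flagged_as_troll:
        time.sleep(5)                # artificial delay: "limited bandwidth"
        if random.random() < 0.1:    # occasionally force a logout
            return force_logout(user)
        return {"status": "ok"}      # report success, silently drop the post
    # Normal path: the post would actually be stored and distributed.
    return {"status": "ok", "stored": True, "text": text}
```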
‘Action Deboost’
Other photographs from the insider show that some Facebook pages were marked with the code “SI (Sigma): !ActionDeboostLiveDistribution,” which the insider believed was meant to suppress the distribution of live-stream videos posted by those pages. The code was seen on pages belonging to right-leaning author and filmmaker Mike Cernovich, conservative comedian and commentator Steven Crowder, and right-leaning news site The Daily Caller. The insider said she checked several pages belonging to left-leaning figures and entities, such as The Young Turks and Colin Kaepernick, and found that they didn’t include the code.
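The screenshots show only the flag’s name, not what it does. If it works the way the insider described, a per-page gate on live-video distribution could be as simple as the following sketch, in which the weighting and every function name are assumptions:

```python
# Only the string "!ActionDeboostLiveDistribution" appears in the
# screenshots; the gating logic below is invented for illustration.
DEBOOST_FLAG = "!ActionDeboostLiveDistribution"

def live_video_reach(page_flags: set[str], base_reach: int) -> int:
    """Scale down how many followers a live stream is pushed to."""
    if DEBOOST_FLAG in page_flags:
        return base_reach // 10  # arbitrary reduction factor
    return base_reach

# A flagged page's live stream would surface to far fewer followers:
print(live_video_reach({DEBOOST_FLAG}, 100_000))  # -> 10000
```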
The insider’s photos “seem legitimate,” former senior Facebook engineer Brian Amerige told The Epoch Times via the Facebook Messenger app. He was hesitant to trust Project Veritas, a right-leaning nonprofit, as a source, and said he hadn’t seen the “deboosting” technology with his own eyes. He opined, though, that “‘deboosting’ is probably happening one way or another (for both good and bad reasons).”
Facebook didn’t respond to a request for comment.
Hate Speech
Hate speech, according to Facebook, refers to derogatory statements based on someone’s “protected characteristics—race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, serious disease, or disability.”
“We all agree that hate speech needs to be stopped,” Yamamoto wrote in an internal Jan. 17, 2018, post, “but there’s quite a bit of content near the perimeter of hate speech that we need to address as well.”
However, the company has acknowledged that, regarding hate speech, “there is no universally accepted answer for when something crosses the line.”
“To some, crude humor about a religious leader can be considered both blasphemy and hate speech against all followers of that faith,” it stated in a 2017 blog post. “To others, a battle of gender-based insults may be a mutually enjoyable way of sharing a laugh.”
Ultimately, Facebook acknowledges that its content police force, which has tripled since last year to 30,000 strong, has to make a judgment call in each case.
Amerige described Facebook’s company climate as a political monoculture, in which “Facebook’s community standards are chaotically, almost randomly, enforced, with escalations and occasional reversals happening when the screw-ups are prominent enough to cause media attention.”
He said that during his time at the company, he tried to change the culture from within and even gained the attention of company leadership, but eventually reached an impasse on the issue of hate speech.
“Hate speech can’t be defined consistently and it can’t be implemented reliably, so it ends up being a series of one-off ‘pragmatic’ decisions,” he previously told The Epoch Times. “I think it’s a serious strategic misstep for a company whose product’s primary value is as a tool for free expression.”