
Adam Klein | PACE UNIVERSITY

News & Events


Associated Press featured associate professor of communication studies Adam Klein in "Trump, social media, right-wing news stir up antifa scares"

09/24/2020

Adam Klein, an associate professor of communication studies at Pace University, analyzed social media posts by far-right extremists and antifascist activists leading up to the Charlottesville rally three years ago. He found antifascists have a “pretty loose” communication network. “You don’t get the sense online that there is an organization as much as there are some prominent (social media) accounts associated with antifa,” he said.

Read the full Associated Press article.

The Ridgefield Press featured Dyson Professor Adam G. Klein's piece "Social networks aim to erase hate but miss the target on guns"

07/21/2020

(The Conversation is an independent and nonprofit source of news, analysis and commentary from academic experts.)

Adam G. Klein, Pace University

(THE CONVERSATION) As Facebook faces down a costly boycott campaign demanding the social network do more to combat hate speech, CEO Mark Zuckerberg recently announced plans to ban a “wider category of hateful content in ads.” Twitter, YouTube and Reddit have also taken additional steps to curtail online hate, removing several inflammatory accounts.

But as social networks refine their policies and update algorithms for detecting extremism, they overlook a major source of hateful content: gun talk.

As a researcher of online extremism, I examined the user policies of social networks and found that while each addresses textbook forms of hate speech, they give a pass to the widespread use of gun rhetoric that celebrates or promotes violence.

In fact, the word “gun” appears but once in Facebook’s policy on “Violence and incitement,” to bar the manipulation of images to include a gun to the head. And neither “guns” nor “firearms” are mentioned in Twitter’s policy on “Glorification of violence,” or YouTube’s guidelines on “Violent or graphic content,” or within any of these networks’ rules on hate speech.

Gun talk as a threat

Gun references have become prevalent in social media dialogues involving the nationwide protests over racial injustice, police reform and the Black Lives Matter movement.

On Facebook, a group called White Lives Matter shared a post that reads, “Don’t allow yourself or your property to become a victim of violence. Pick up your weapon and defend yourself.” Another user posted the picture of a handgun beneath the message, “I never carried a weapon, never needed it, but I have changed my mind and will apply for what I deem necessary to handle things my way … Tired of all these BLM idiots looters.”

While nearly every social network works to identify and prohibit violent speech, gun groups have managed to evade censure. One such Facebook community gleefully taunts protesters with the prospect of retaliation by firearm. They share a meme of a stack of bullets surrounded by the caption, “If you defund the police you should know, I don’t own any rubber bullets.”

The Washington Post featured Dyson College of Arts and Sciences Professor Adam Klein in "Scant evidence of antifa shows how sweeping the protests for racial justice have become"

06/15/2020

Antifa no longer exists, at least not in the form in which it has featured on the president’s Twitter feed or propaganda outlets, including the state-backed Russian channel RT, said Adam Klein, a professor of communication studies at Pace University who has analyzed antifa’s confrontation with white supremacists in Charlottesville in 2017.

“I’m sure there are people on the streets today who see themselves as anti-fascist,” Klein said. “But any kind of broader organization has been pretty dormant.”

On the most active antifa accounts on social media, Klein said, there is commentary about recent protests and occasional attempts to unmask far-right actors, but there have been “no calls for destruction or fighting police.”

Read the full Washington Post article.

Wired featured Dyson Professor Adam Klein in "The World Was Primed for Protest Conspiracy Theories"

06/04/2020

“I think the hard truth is that among the pain, anger, and protest, there are some who are taking their justified rage to dangerous places, and there are some who are taking advantage of that pain for their own ends,” says Adam Klein, who studies propaganda and extremism at Pace University. “It is easy, and perhaps politically expedient, for some to find a boogeyman in all this to blame it on.”

Read the full Wired article.

Washington Post featured Dyson Professor Adam Klein in "Country on edge after a weekend of protests against police brutality"

06/01/2020

“The intention is to find the boogeyman,” said Adam Klein, a professor of communication studies at Pace University who has studied the movement’s confrontation with far-right activists in the lead-up to the 2017 “Unite the Right” rally in Charlottesville.

Read the full Washington Post article.

"Outside the Beltway" featured Pace University's Communication Studies professor Adam G. Klein in "Americans Want to Ban Hate Speech But Can't Define It"

03/21/2019

"Outside the Beltway" featured Pace University's Communication Studies professor Adam G. Klein in "Americans Want to Ban Hate Speech But Can't Define It"

Two-thirds want social media platforms to ban harassment and racist, sexist, and other offensive speech. 

Adam G. Klein, a Communication Studies professor at Pace University and the author of Fanaticism, Racism, and Rage Online: Corrupting the Digital Sphere, published an interesting essay two days before the Christchurch massacre titled “Fear, More Than Hate, Feeds Online Bigotry and Real-World Violence.”

His setup:

When a U.S. senator asked Facebook CEO Mark Zuckerberg, “Can you define hate speech?” it was arguably the most important question that social networks face: how to identify extremism inside their communities.

Hate crimes in the 21st century follow a familiar pattern in which an online tirade escalates into violent actions. Before opening fire in the Tree of Life synagogue in Pittsburgh, the accused gunman had vented over far-right social network Gab about Honduran migrants traveling toward the U.S. border, and the alleged Jewish conspiracy behind it all. Then he declared, “I can’t sit by and watch my people get slaughtered. Screw your optics, I’m going in.” The pattern of extremists unloading their intolerance online has been a disturbing feature of some recent hate crimes. But most online hate isn’t that flagrant, or as easy to spot.

As I found in my 2017 study on extremism in social networks and political blogs, rather than overt bigotry, most online hate looks a lot like fear. It’s not expressed in racial slurs or calls for confrontation, but rather in unfounded allegations of Hispanic invaders pouring into the country, black-on-white crime or Sharia law infiltrating American cities. Hysterical narratives such as these have become the preferred vehicle for today’s extremists – and may be more effective at provoking real-world violence than stereotypical hate speech.

So far, so good. Klein observes,

[S]ocial networks have the unique capacity to turn down the volume on intolerance if they determine that a user has in fact breached their terms of service. For instance, in April 2018, Facebook removed two pages associated with white nationalist Richard Spencer. A few months later, Twitter suspended several accounts associated with the far-right group The Proud Boys for violating its policy “prohibiting violent extremist groups.”

Still, some critics argue that the networks are not moving fast enough. There is mounting pressure for these websites to police the extremism that has flourished in their spaces, or else become policed themselves. A recent HuffPost/YouGov survey revealed that two-thirds of Americans wanted social networks to prevent users from posting “hate speech or racist content.”

As I’ve noted many times over the years, most recently in my post-Christchurch essay, “Media, Free Speech, and Violent Extremists,” I both support de-platforming those who foment violence and fear how government and/or the decisionmakers at social media sites might go about doing so.

The public definitely wants social media companies to block more content.

But what any of this entails is debatable, I’d argue.

What’s harassment, for example? Obviously, spewing racist insults against members of minority groups, sexist attacks against women, and the like qualify. But what about the swarming attacks that occur when high-follower-count people quote tweet people with whom they disagree, launching hundreds if not thousands of @’s against someone? Does it matter if that someone is a public figure or otherwise powerful? Is “dead-naming” a transgender individual harassment? Twitter says it is and I tend to agree. But let’s not pretend that this doesn’t stifle debate on a hotly-contested issue that we’re still very much struggling to come to terms with as a society.

It’s hard to make a case for the virtue of “spreading conspiracy theories or false information.” But who gets to say what’s a conspiracy or what’s false? Is it verboten to talk about UFOs and Area 51? How about the alleged Trump pee tape? Allegations that the Trump campaign and/or administration are tools of the Russian government? Apparently, we’ve decided that anti-vax lunacy must be banned. What about global warming denialism?

“Hate speech or racist content” combines the problems of both of the previous categories. There are some groups it’s okay to hate, right? Nazis and other fascists are the most obvious example. Surely, we’re not going to ban people who say they hate Nazis? Similarly, it would be ironic, indeed, to ban denunciation of the Ku Klux Klan or other hate groups as hate speech.

Indeed, the public that is overwhelmingly in favor of banning it doesn’t actually agree on what it is they want to ban.

I’m something of a free speech absolutist and don’t think any words are inherently “hateful” or “offensive.” Context always matters.

For as long as I can remember, black Americans have been trying to reclaim “nigger” and its variants. While he eventually came to regret it, Richard Pryor did it for much of his career and other black comics—most of them, actually—have followed suit. More recently, the gay community has done the same, reclaiming slur words that were frequently hurled against them, most notably “queer.” Surely, it’s neither hateful nor offensive in those contexts.

Statements like “transgender people have a mental disorder” or “homosexuality is a sin” are in a different category. Expressed earnestly, they’re almost inherently hurtful and offensive. Yet both were mainstream beliefs well into my adult lifetime. And, it seems to me, debating these topics openly has moved the needle quite a bit in the right direction.

The notion that “undocumented immigrants should be deported” is offensive, let alone hateful, is bizarre. If it were instead phrased as “the United States should enforce its laws,” would it be? Now, more inflammatory statements, say “most Mexican immigrants are rapists” or “we’re being invaded by people from shithole countries,” could certainly be construed as hateful or offensive. But I’m not sure that exposing those ideas to rational discourse isn’t still the best strategy.

Some of the other examples, such as “all white people are racist” or “America is an evil country” or “police are racist,” strike me as almost dangerous to include on the list. Those are matters of political opinion, all of which are ripe topics for healthy debate. The notion that they should be banned from any platform is highly problematic.

Read the article.
