The Conversation | PACE UNIVERSITY

News & Events

The Conversation - Indonesia (via the Good Men Project) featured Dyson Professor Adam Klein’s column "How To Fight Holocaust Denial in Social Media – With the Evidence of What Really Happened"

01/21/2021

One in four American millennials believe the Holocaust was exaggerated or entirely made up, according to a recent national survey that sought to find out what young adults know about the genocide of nearly 6 million Jews at the hands of Nazis some 80 years ago.

That startling statistic was cited as one of the main reasons that Facebook CEO Mark Zuckerberg decided in October to finally ban Holocaust denial across the social network. Denying the Holocaust ever happened is an enduring form of anti-Semitic propaganda that attempts to deny or minimize the atrocities committed by the Nazis against the Jews during World War II.

Read the full Conversation article.

The Conversation featured Dyson Professor Adam Klein's piece "How to fight Holocaust denial in social media – with the evidence of what really happened"

12/04/2020

One in four American millennials believe the Holocaust was exaggerated or entirely made up, according to a recent national survey that sought to find out what young adults know about the genocide of nearly 6 million Jews at the hands of Nazis some 80 years ago.

That startling statistic was cited as one of the main reasons that Facebook CEO Mark Zuckerberg decided in October to finally ban Holocaust denial across the social network. Denying the Holocaust ever happened is an enduring form of anti-Semitic propaganda that attempts to deny or minimize the atrocities committed by the Nazis against the Jews during World War II.

Following Facebook’s lead, Twitter announced it, too, would remove any posts that denied the history of the Holocaust, though CEO Jack Dorsey appeared to contradict that policy at a Senate hearing weeks later.

Holocaust deniers have continued to emerge in social media, and perhaps predictably, many have migrated to less restrictive sites like Parler, where hashtags like #HolocaustNeverHappened and #HolocaustIsALie are widespread. “If you want Holocaust denial, hey, Parler is going to be great for you,” Bill Gates recently said of the social network.

While some tech companies address the rise in Holocaust revisionism, and others leave the door open, social networks have played an unwitting role in helping to distort the memory of these horrific events. But as a scholar who studies online extremism, I believe those same companies could do more to protect Holocaust remembrance by highlighting the digitized accounts of those who lived through it.

Read the full Conversation article.

The Conversation featured Professor Brenna Hassinger-Das' co-written article "3 Year Olds Find YouTube Better for Learning"

11/30/2020

Young kids believe that YouTube videos are better for learning than TV shows or videos created on a researcher’s smartphone. They also view people in YouTube videos to be more real than those on TV but less real than those featured in a researcher-created smartphone video. These are the major findings from a pre-COVID-19 study conducted in U.S. children’s museums in 2019. Brenna Hassinger-Das is Assistant Professor of Psychology, Pace University.

Read the full Conversation article.

The Ridgefield Press featured Dyson Professor Adam G. Klein's piece "Social networks aim to erase hate but miss the target on guns"

07/21/2020

(The Conversation is an independent and nonprofit source of news, analysis and commentary from academic experts.)

Adam G. Klein, Pace University

As Facebook faces down a costly boycott campaign demanding the social network do more to combat hate speech, CEO Mark Zuckerberg recently announced plans to ban a “wider category of hateful content in ads.” Twitter, YouTube and Reddit have also taken additional steps to curtail online hate, removing several inflammatory accounts.

But as social networks refine their policies and update algorithms for detecting extremism, they overlook a major source of hateful content: gun talk.

As a researcher of online extremism, I examined the user policies of social networks and found that while each addresses textbook forms of hate speech, all of them give a pass to the widespread use of gun rhetoric that celebrates or promotes violence.

In fact, the word “gun” appears only once in Facebook’s policy on “Violence and incitement,” where it bars manipulating images to add a gun to someone’s head. And neither “guns” nor “firearms” are mentioned in Twitter’s policy on “Glorifications of violence,” in YouTube’s guidelines on “Violent or graphic content,” or within any of these networks’ rules on hate speech.

Gun talk as a threat

Gun references have become prevalent in social media dialogues involving the nationwide protests over racial injustice, police reform and the Black Lives Matter movement.

On Facebook, a group called White Lives Matter shared a post that reads, “Don’t allow yourself or your property to become a victim of violence. Pick up your weapon and defend yourself.” Another user posted the picture of a handgun beneath the message, “I never carried a weapon, never needed it, but I have changed my mind and will apply for what I deem necessary to handle things my way … Tired of all these BLM idiots looters.”

While nearly every social network works to identify and prohibit violent speech, gun groups have managed to evade censure. One such Facebook community gleefully taunts protesters with the prospect of retaliation by firearm. They share a meme of a stack of bullets surrounded by the caption, “If you defund the police you should know, I don’t own any rubber bullets.”

"The Conversation" featured Pace University's Professor of Communication Studies Adam G. Klein's piece "Fear, more than hate, feeds online bigotry and real-world violence"

11/20/2018

"The Conversation" featured Pace University's Professor of Communication Studies Adam G. Klein's piece "Fear, more than hate, feeds online bigotry and real-world violence"

When a U.S. senator asked Facebook CEO Mark Zuckerberg, “Can you define hate speech?” it was arguably the most important question that social networks face: how to identify extremism inside their communities.

Hate crimes in the 21st century follow a familiar pattern in which an online tirade escalates into violent actions. Before opening fire in the Tree of Life synagogue in Pittsburgh, the accused gunman had vented over far-right social network Gab about Honduran migrants traveling toward the U.S. border, and the alleged Jewish conspiracy behind it all. Then he declared, “I can’t sit by and watch my people get slaughtered. Screw your optics, I’m going in.” The pattern of extremists unloading their intolerance online has been a disturbing feature of some recent hate crimes. But most online hate isn’t that flagrant, or as easy to spot. 

As I found in my 2017 study on extremism in social networks and political blogs, rather than overt bigotry, most online hate looks a lot like fear. It’s not expressed in racial slurs or calls for confrontation, but rather in unfounded allegations of Hispanic invaders pouring into the country, black-on-white crime or Sharia law infiltrating American cities. Hysterical narratives such as these have become the preferred vehicle for today’s extremists – and may be more effective at provoking real-world violence than stereotypical hate speech.

The ease of spreading fear

On Twitter, a popular meme traveling around recently depicts the “Islamic Terrorist Network” spread across a map of the United States, while a Facebook account called “America Under Attack” shares an article with its 17,000 followers about the “Angry Young Men and Gangbangers” marching toward the border. And on Gab, countless profiles talk of Jewish plans to sabotage American culture, sovereignty and the president. 

While not overtly antagonistic, these notes play well to an audience that has found in social media a place where they can express their intolerance openly, as long as they color within the lines. They can avoid the exposure that traditional hate speech attracts. Whereas the white nationalist gathering in Charlottesville was high-profile and revealing, social networks can be anonymous and discreet, and therefore liberating for the undeclared racist. That presents a stark challenge to platforms like Facebook, Twitter and YouTube.

Fighting hate

Of course this is not just a challenge for social media companies. The public at large is facing the complex question of how to respond to inflammatory and prejudiced narratives that are stoking racial fears and subsequent hostility. However, social networks have the unique capacity to turn down the volume on intolerance if they determine that a user has in fact breached their terms of service. For instance, in April 2018, Facebook removed two pages associated with white nationalist Richard Spencer. A few months later, Twitter suspended several accounts associated with the far-right group The Proud Boys for violating its policy “prohibiting violent extremist groups.” 

Still, some critics argue that the networks are not moving fast enough. There is mounting pressure for these websites to police the extremism that has flourished in their spaces, or else become policed themselves. A recent HuffPost/YouGov survey revealed that two-thirds of Americans wanted social networks to prevent users from posting “hate speech or racist content.”

Read the full article.
