A group of researchers listens to findings on user moderation on Weibo, a popular Chinese social media platform, during a meeting of a group working to reduce online hate speech at Carnegie Mellon University’s Center for Informed Democracy and Social Cyber Security on Monday, Sept. 23, 2019. The group is led by Kathleen Carley, director of the Center for Computational Analysis of Social and Organizational Systems. (Christian Snyder/Post-Gazette)

Can technology turn down the volume of online hate? Researchers at CMU are working to find an answer.


At Carnegie Mellon University, professors and students intent on toning down the online vitriol gained new motivation from the massacre a year ago and a mile from campus.

“I was actually driving through Squirrel Hill after having breakfast with friends,” on the morning of Oct. 27, 2018, said Geoff Kaufman, an assistant professor at CMU’s Human-Computer Interaction Institute. “A friend of mine lived just a block away from Tree of Life. So it was a pretty poignant day.”

He’s one of a cluster of researchers, each backed by talented students, dedicated to finding the right way to improve the tone on social media, which has increasingly been identified as a breeding ground for extremism and violence.

“If anything [Tree of Life] gave me more driving motivation to study issues of bias and inter-group violence,” he said.

At the time of the Tree of Life massacre, Kathleen M. Carley was already deep into research on the methods by which foreign powers and extremist groups exploit social media to influence elections and destabilize populations. They worm their way into existing online communities, then share disinformation and propaganda with others in the group, taking advantage of the credibility afforded them as members, said the director of CMU’s Center for Computational Analysis of Social and Organizational Systems.

“If you think it’s coming from your group, you start believing it,” she said.


People who have come to believe one conspiracy theory often can be convinced of others, she said. Once they’re in, it’s hard to pull them out of conspiracy land.

“Facts rarely work” as a counter to conspiracy theories, she said, because believers “are operating emotionally” and are predisposed to discount anything that runs counter to those feelings. Plus, she said, for some people disinformation “is just plain fun” compared to cold reality.

Sometimes mocking a piece of disinformation is more effective than countering it with facts, she added.

She and other technologists have developed “bot hunters, troll hunters” and “ways of identifying memes,” she said. What they don’t have is a foolproof way to “identify who the bad guys are. … What are they trying to do with their messaging? Who is vulnerable, and how can we help those who are vulnerable?”

With a Knight Foundation grant, she recently launched the Center for Informed Democracy and Social Cyber Security — or IDeaS — which will try to address some of those questions.

In May, a CMU team including Mr. Kaufman published a paper showing that cleverly designed “CAPTCHA” systems — tasks you do to prove you’re not a robot — can subtly alter the mood of online commenters. Ask the would-be commenter to use the mouse to draw a smiley face before weighing in on a politically charged blog post, and you tend to get users “even more engaged with the topic, but expressing their opinions in a more positive way,” he said.


Yian Wang, a graduate student, discusses the findings of her research into user moderation on Weibo, a popular Chinese social media platform, at Carnegie Mellon University on Monday, Sept. 23, 2019. (Christian Snyder/Post-Gazette)

His team is experimenting with other tools that might change the tone on social media and in multi-player computer games, where bias against people who aren’t straight, white males often “runs rampant,” he said.

He is not confident, though, that this approach would work on Gab.com — the accused Tree of Life shooter’s favorite social media site — or on 8chan, where shooters reportedly posted prior to racially driven attacks in Poway, Calif., El Paso, Texas, and Christchurch, New Zealand.

“I’m not sure the approaches that we’ve used would be that effective in those spaces,” he said.

Some of Big Tech’s early attempts may have drawbacks.

Since mid-2018, some social media platforms have set standards for speech and tried a variety of methods — from banning violators to quarantining them in hard-to-find corners — to discourage violent and hateful speech.

“If you crack down, you may drive people away,” said Carolyn P. Rosé, a professor at CMU’s Human-Computer Interaction Institute and its Language Technologies Institute. “It may look like you contained the problem. But you might actually cause a breakdown in communication between the two sides” of the ideological spectrum.



Extremists driven from Twitter to Gab or 8chan might have dramatically less exposure to other viewpoints, and might be immersed in more radical ideas, she said.

She’s using a database of online interactions to study the characteristics of civil conversations between people of different ideological viewpoints. If researchers can figure out what makes people from different parts of the political spectrum click with each other, then the algorithms that recommend new followers could be tweaked in ways that would connect people across ideological divides, rather than strengthening the walls of our echo chambers.

“If we foster exchange between people who align differently in their political views,” she said, “then it at least keeps the communication lines open.

“Companies like Google have this very much on their radar. They realize that there’s actually something to be gained by not just trying to eradicate the unsavory behaviors, but to actually foster positive behavior.”

Rich Lord: rlord@post-gazette.com or 412-263-1542

First Published: October 21, 2019, 11:00 a.m.
