Blackburn’s quest: Researching the darkest corners of the internet
Computer science assistant professor's work analyzes cyberbullying, zoombombing, hate speech
Life in the digital age has added one more certain thing to the old saying about death and taxes: People are going to be jerks on the internet.
Whether it's an anonymous troll questioning your parentage or a propaganda campaign by a foreign power, the signal-to-noise ratio on social media has become much worse in recent years. That's not even mentioning the hate-mongers, conspiracy theorists and outright liars who want their skewed views to become your views.
Assistant Professor Jeremy Blackburn, a faculty member in Watson College's Department of Computer Science, has been researching "bad actors" online for more than 10 years. That journey has taken him to some dark places where outsiders fear to tread, but he hopes that by shining a light there, we can start to figure out how to fix them.
"I don't think the problems are new. They are fundamental human problems," Blackburn says. "What's different is that it's become a socio-technical problem rather than just a social problem. The internet doesn't make people bad; it just enables them to be worse, and it enables them to find other people who are also bad."
FROM GAMING TO SOCIAL MEDIA
Blackburn first became interested in computers while growing up in Florida, connecting with fellow users from around the world through massively multiplayer online role-playing games (MMORPGs) such as "Ultima Online." Players adopted sword-and-sorcery character avatars for quests to conquer kingdoms and battle monsters.
Because Blackburn and his friends were clever with programming code, they sometimes would find ways to cause chaos. One time, his "clan" built a virtual house in front of a key entry point and shot arrows from inside at other players who approached. Another trick, which landed them in the game's "jail," involved killing a character and stealing the blueprints for a new kind of building being beta-tested.
Yeah, they weren鈥檛 exactly angels.
"If you did that kind of stuff in person during a Dungeons & Dragons game, you might get punched in the mouth," Blackburn says with a laugh. "But the fact that it was virtual enabled a whole different level of mischief."
Like many teens who love coding, Blackburn headed to college, in his case the University of South Florida (USF) in Tampa, with the intent to design computer games. His interests later shifted to the underlying technologies that make shared games possible, such as distributed systems that spread various components across multiple computers.
For his doctoral thesis, also at USF, he returned to the idea of bad behavior online by studying cheating in internet gaming, and that drew a direct path to the kind of research he does today.
While earning his degrees, Blackburn worked for more than a decade in private industry, including as principal developer at test-prep company Boson Software and as software architect at his own company, Pallasoft. He also spent three years as an associate researcher at Telefonica Research in Barcelona, Spain.
His time in academia, first at the University of Alabama at Birmingham and now at Binghamton University, has coincided with the proliferation and influence of mainstream platforms such as Facebook and Twitter as well as niche apps like Telegram, Parler, 4chan and Gab.
"Things have evolved away from blogs and similar sites in the past 10 years," he says. "People want interactive social media; they want to be able to engage with each other rather than just scream on a soapbox."
In our polarized society, though, those back-and-forth interactions can get downright nasty.
TRACKING THE TROLLS
Blackburn is the co-founder of the International Data-driven Research for Advanced Modeling and Analysis (iDRAMA) Lab, which includes more than two dozen professors, PhD students and industry researchers from around the world.
In various configurations, iDRAMA members have studied nearly every social media platform, from dominant ones like Twitter to white supremacist havens such as Gab and 4chan. The only one they ignore is Facebook, because data collection from there has become increasingly unreliable.
The iDRAMA Lab has published research on QAnon, the rise in anti-Asian and anti-Semitic sentiments, the use of manipulated news images (also known as "fauxtography"), cyberbullying, misogyny, state-sponsored disinformation campaigns and more.
It's a roundup of the worst that humanity has to offer, and sometimes the haters strike back. A recent 4chan post, for instance, claimed that Blackburn is "a Hamas recruiter," and he's received a few ominous threats over the years. (Luckily, nothing came of them.)
Blackburn fosters an atmosphere of camaraderie among his students and peers, welcoming open conversations so that no one feels overwhelmed by internet hate.
"If you don't look at the content, you can't really do research about it," he says, "but if you look at the content too much or too deeply, if you stare into the abyss a bit too long, you might fall into it. It's hard walking that line, and I've certainly had failures along the way."
Gianluca Stringhini, an assistant professor at Boston University and co-founder of the iDRAMA Lab, praises Blackburn鈥檚 willingness to think outside of the boundaries of traditional computer science methods.
"When Jeremy and I started working together, we realized that studying these emerging sociotechnical problems required techniques that don't really fall under any of the established research methods in our fields," Stringhini says.
"Five years later, we are combining computer networks, security, graph analysis, psychology and other disciplines to paint a comprehensive picture of online weaponized information. Not many researchers would be comfortable doing that, but Jeremy has a unique vision and is not afraid of breaking with research norms."
TURNING OVER THE ROCKS
Earlier this year, Blackburn received a five-year, $517,484 National Science Foundation CAREER Award for his project "Towards a Data-Driven Understanding of Online Sentiment." The CAREER Award supports faculty who have the potential to serve as future academic role models.
At the core of the project is devising a better way to train machine learning 鈥 which does most of the content moderation on social media platforms 鈥 about how to judge the offensiveness of images used in memes.
Currently, artificial intelligence software tries to determine whether a particular image is bad or not, but Blackburn wants to borrow a trick from online gaming: presenting the system with two images and asking which is worse. The process is similar to the "matchmaking" system that puts gamers into groups of similar skill rather than pairing them with people who are "1,000 times better or worse than you."
"Instead of looking at images in isolation and making a judgment on that individual piece of content, it's more like ordering them," he says. "We're not learning if something is racist or not; we're learning which is more racist. Who knows what we'll find, but we're convinced that it will lead to something interesting."
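To make the matchmaking analogy concrete, here is a minimal sketch (not Blackburn's actual system; the image names and judgments are hypothetical) of how pairwise "which is worse?" answers can be turned into an ordering, using an Elo-style rating update of the kind game matchmaking systems popularized:

```python
K = 32  # step size: how strongly a single judgment moves a rating


def expected(r_a, r_b):
    """Predicted probability, under current ratings, that item A is judged worse than B."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400))


def update(ratings, worse, better):
    """Record one judgment: `worse` was picked as more offensive than `better`."""
    e = expected(ratings[worse], ratings[better])
    ratings[worse] += K * (1.0 - e)   # the item picked as worse moves up the scale
    ratings[better] -= K * (1.0 - e)  # the other item moves down

# Hypothetical annotator judgments over four meme images, all starting at 1000.
ratings = {img: 1000.0 for img in ["meme_a", "meme_b", "meme_c", "meme_d"]}
judgments = [("meme_c", "meme_a"), ("meme_c", "meme_b"),
             ("meme_a", "meme_d"), ("meme_b", "meme_d")]
for worse, better in judgments:
    update(ratings, worse, better)

# Sort from most to least offensive under the learned ordering.
ranking = sorted(ratings, key=ratings.get, reverse=True)
print(ranking)
```

The point of the design is that annotators never assign an absolute "offensiveness score"; the relative comparisons alone are enough to recover a ranking, which is exactly the shift Blackburn describes from judging images in isolation to ordering them.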
Blackburn admits that he and his iDRAMA colleagues sometimes discuss whether their research is helping internet jerks to dig in deeper and evade future detection. Maybe if they didn鈥檛 turn over the rocks, the nasty critters underneath would just stay there and never come out.
As a computer scientist, though, Blackburn believes that learning more will be an important step toward curbing what has become a political and social menace. He contends it鈥檚 also a public health crisis: Online hate affects our mental well-being, and misinformation about COVID-19 has led to more deaths and hospitalizations.
"We have this insanely powerful, world-changing technology that's been around for less than a generation," he says. "I hope that we'll provide the knowledge and tools to become more resilient, more robust and less susceptible to this type of behavior, and to start figuring out ways to actively address it."