Interview with Professor Asta Zelenkauskaitė, Author of Creating Chaos Online

October 11, 2024

We caught up with Drexel Professor Asta Zelenkauskaitė to discuss her book Creating Chaos Online: Disinformation and Subverted Post-Publics, as well as the dangers of fake news, and the importance of making sense of information. 

Q: Your recent book, Creating Chaos Online, delves into disinformation in online settings and how it can “instill doubt rather than clarity,” specifically regarding Russian trolling online. How did you get into this area of research and what inspired you to write this book?  

My work focuses on user-generated content, specifically online comments on news portals. When I started to systematically analyze large volumes of comments, I noticed that not all participation online is the same. This observation led me to information warfare perspectives, i.e., how these spaces can be weaponized by various actors, given that online spaces are accessible to everyone, regardless of their intentions. This work, published open access in the journal Social Media + Society (SM+S), presents a typology of online interactions and identifies spaces where weaponization can take place[1].

During my research, I started looking at Russian trolling as a case study in geopolitical contexts that were geographically remote from the U.S., mainly by analyzing Lithuanian news portal comments, as Russian trolling threats were present in the Baltic region.

We realized that Lithuanian news portals had released educational news stories warning that Russian trolling involves users who come to online spaces defined by an ethos of openness—where anyone can post—and where people are encouraged to engage in what’s known as “democratic deliberation.” We were seeing that some users flood these spaces by introducing Soviet nostalgia and weakening trust in institutions like NATO or in political leaders critical of Russia.

At the time, I did not know that my work would precede the U.S. case of Russian trolling. Along with my Ph.D. student, Brandon Niezgoda, I published an article about these practices[2], which initially seemed locally grounded and relevant mainly to people in Lithuania.

I could not have imagined that the Russian trolling phenomenon would expand and become connected to interference in the U.S. presidential elections. Thus, once Russian trolling emerged in the U.S., I decided to empirically analyze, in a comparative way, how it is perceived and how it manifests.

Q: Are we seeing some of the same issues here in the U.S., particularly around the upcoming presidential election?

Political division was one of the findings evident in the U.S. online comments (in both right- and left-leaning media) analyzed in my book, particularly the exploitation of narratives that resonate with voters on the right or the left.

At that time, blaming Hillary Clinton and blaming the institutions that investigate Russian trolling were examples of such political division. While I analyzed online comments as spaces, it is important not to underestimate information propagation across a range of online platforms, especially since their content moderation practices may differ (e.g., X, formerly Twitter).

Q: Do you have any tips to help people make sense of online content and identify the truth from disinformation?

Media literacy, information literacy, and social literacy are three theoretical frameworks that encompass educational approaches to making sense of ever-changing online spaces and the threats associated with them. These are otherwise known as pre-bunking techniques: understanding the phenomena well enough to make sense of them and not fall into traps that might sound reasonable.

And typically, in a disinformation context, it is about instilling doubt rather than providing misinformation. We hear most of the time that it is about getting the sources right. Yet, in the current online media sphere, it is about understanding why a message advocates for one issue or another. For example, why are messages claiming that “Russians are blamed for everything” propagated in U.S. online comments?

Q: Lastly, your book is freely available to read online via open access. Why did you decide to publish your research book open access as well as in print? Can free and open access to research and information help curb the spread of fake news?

Most peer-reviewed research is restricted by paywalls, which makes it hard for the layperson to access and read such literature. Plus, research is often more specialized and detailed, and it follows philosophy-of-science approaches that are more familiar to researchers with training in questions like, “how do we know that we know.”

I believe that open science can be one of the steps to creating conversations with the broader public—providing access to research means everyone can be better equipped with information.

 

About the Author

Asta Zelenkauskaitė, PhD, is a professor of communication at Drexel University and part of Drexel’s Center for Science, Technology & Society. Her research focuses on emergent online practices, bridging multidisciplinary approaches drawn from the social sciences, communication, information science, and linguistics, and on the ways in which online interaction can create new spaces and practices for users. She is interested in the societal challenges of information mistrust and post-truth, and in how such inauthentic information can be uncovered. She studies the changes that social media bring to the mass media landscape through a multi-method approach, combining macro and micro perspectives.

She is the author of Creating Chaos Online (University of Michigan Press, 2022).

 


[1] Zelenkauskaite, A., & Balduccini, M. (2017). “Information warfare” and online news commenting: Analyzing forces of social influence through location-based commenting user typology. Social Media + Society, 3(3), 2056305117718468.

[2] Zelenkauskaite, A., & Niezgoda, B. (2017). “Stop Kremlin trolls:” Ideological trolling as calling out, rebuttal, and reactions on online news portal commenting. First Monday.