
Kate Starbird's Research Group Archive

This page contains an archive of the past five years of Directed Research Groups led by Professor Starbird. View her currently offered DRGs »


Spring 2025

“Called It”: Prediction and Sensemaking in the 2024 Election

Instructors: Nina Lutz and Kate Starbird

During emergent, high-stakes events, social media is a place where people come together to make sense of uncertainty, speculate about what is happening or will happen, and find community. "Called It" is a Spring 2025 DRG in which we will analyze a dataset of cross-platform narratives and instances of people "calling it" during the course of the 2024 Presidential Election Season -- from aggregators of betting markets asking you to bet on who would win certain swing states to astrologers analyzing candidates' natal charts.

In this DRG, students will learn how to qualitatively code media from X and TikTok. Each week, we will meet for arbitration and to discuss the data.

Students who stick with the project and complete additional tasks may have the opportunity for authorship on an academic paper.  

The meeting time for the DRG is TBD and will be determined based on this application form and the accepted students' schedules. Some students may be continuing from Winter, when we developed the codebook to be used in this analysis.

Questions? Email Nina Lutz, PhD Student, at ninalutz@uw.edu 


Winter 2025

“Called It”: Prediction and Sensemaking in the 2024 Election

Instructors: Nina Lutz and Kate Starbird

We will curate and begin analysis of a dataset of cross-platform narratives and instances of people "calling it" during the course of the 2024 Presidential Election Season -- from calling Senate races, to Biden stepping down, to astrologers predicting Harris would win, to tarot cards saying RFK and Trump would be co-presidents. We want to gather and curate this dataset and set the stage for its analysis. We want to understand what role prediction played in sensemaking across different online communities and how predictive trends and tactics varied across different interfaces and communities.
Roles will involve developing queries, collecting and curating data for analysis both computationally and manually, conducting thematic analysis, and beginning some inductive coding of the data.

Students who stick with the project and complete additional tasks will have the opportunity for authorship on an academic paper.   

This DRG may continue into the Spring 2025 quarter as well, pending research progress and needs!

This DRG has some returning students from the election and border crisis work, but we are seeking about two new students!


Autumn 2024

Dehumanizing and Problematic Imagery about Latin American* Migration in the 2024 Presidential Election

Instructors: Nina Lutz and Kate Starbird

This is our third and last iteration of this DRG as part of an ongoing project. 

This DRG aims to develop methods, research questions, literature reviews, and tools to analyze rumors in real time during the 2024 U.S. Presidential Election Cycle. In particular, this DRG will focus on visual media (memes, short- and long-form videos, photos, infographics, etc.) during the 2024 Primary Elections. To limit our problem space, we will center questions on visual media that aims to dehumanize and spread rumors about and within intersectionally marginalized migration populations, particularly Latin American migrants and refugees.

Students will first be acquainted with qualitative coding as a method and with visual research methodologies – how to consider visual media (TikTok videos and images) as data and how to annotate and analyze them systematically. Then, students will contribute to qualitatively coding media and discuss the media weekly.

Our goal is that each research study will be published as a paper at an academic conference. Already, two students from this DRG have published posters at academic conferences based on this project, and more publications are underway.

In fall, we will only run the qualitative version of this DRG, where students will be qualitatively coding media and discussing it weekly. Some students will also assist with related literature reviews and research on current events and policy. 

Spanish-speaking students, including freshmen, are especially encouraged to apply! 

* For this DRG, we will focus on migration and refugee populations, particularly from Latin America (including Haiti and the Caribbean Islands), but we are open to other migrations based on real-world events (e.g., individuals from the Middle East).


Autumn 2024

Public Communication of Election Rumor Research on Social Media

Instructors: Kate Starbird, Nina Lutz (HCDE PhD student), Danielle Tomson (CIP Research Program Manager), Rachel Moran (CIP Senior Research Scientist)

This fall, the Center for an Informed Public has a team of graduate students, undergraduates, professors, and research scientists conducting research into rumors about U.S. elections.

This DRG seeks students with expertise in qualitative analysis and social media content creation for a “research-through-design” project related to the 2024 election. This group will work in collaboration with a larger team to design, implement, and evaluate a content-production approach for communicating about election-related rumors on TikTok. Students in the DRG will (1) work to translate insights from our broader “rapid research” project (which analyzes rumors about election administration) into short-form video content, and (2) evaluate the efficacy of those videos in helping to mitigate the spread and/or impact of false rumors.
 
We are looking for students with strong communication skills, experience with the human-centered design process, an aptitude for qualitative research, and an interest in election processes and voting. Social media power users and students with experience creating short-form videos are especially encouraged to apply.


Spring 2024

Dehumanizing and Problematic Imagery about Latin American* Migration in the 2024 Presidential Election

Run by: Nina Lutz, PhD Student, HCDE
Supervisor: Professor Kate Starbird

This DRG aims to develop methods, research questions, literature reviews, and tools to analyze rumors around harmful rhetoric at the US-Mexico land border. In particular, this DRG will focus on visual media (memes, short- and long-form videos, photos, infographics, etc.) that aims to dehumanize and spread rumors about and within intersectionally marginalized migration populations, particularly Latin American migrants and refugees at the US-Mexico land border.

Students will first work together to create a literature review of related work. Then they will divide into two groups to design and embark upon either a primarily qualitative or primarily quantitative research study. Students will have an opportunity to continue this research in Fall 2024 to complete the study and to be included on subsequent publications from this research.

Capacity: 8 students 

Student Requirements:

  • Strong interest in visual media
  • Strong interest in mis/disinformation
  • Strong interest in social media
  • Basic understanding of, or willingness to learn about, the American presidential election and the structure of American legislative government
  • Power users of social media (TikTok, Instagram, X, etc.) are encouraged to apply
  • Spanish language proficiency is a huge plus but not a requirement

The Quantitative Team will meet for 1 hour per week. The Qualitative Team will have a 2-hour weekly meeting. These meetings will be scheduled based on the students admitted.

* For this DRG, we will focus on migration and refugee populations, particularly from Latin America (including Haiti and the Caribbean Islands), but we are open to other migrations based on real-world events (e.g., individuals from the Middle East).


Winter 2024

Transgender Science Communication DRG

Organizers: Andrew Beers, albeers@uw.edu, Izzi Grasso, grassoi@uw.edu 
Faculty Sponsors: Dr. Kate Starbird, HCDE, Dr. Emma Spiro, iSchool

The last five years have seen an escalating number of legal challenges to transgender people’s, and particularly transgender children’s, right to access healthcare. Many of these challenges have been couched in scientific terms, misleadingly suggesting that gender-affirming care is neither safe nor effective. In this Directed Research Group (DRG), we conduct an extended case study of one of the first and most severe proposed laws to restrict access to children’s healthcare in the United States, and of the subsequent legal challenges to this law. In particular, we focus on the expert testimonies submitted by both the law’s defenders and its detractors, which contain extensive reviews of the supposed evidence for or against providing gender-affirming care for children. Our goal is, through an analysis of the citations offered in these expert testimonies, to understand how disparate scientific, journalistic, and activist information sources are collected and mobilized to define the legal terms of transgender people’s access to healthcare. More broadly, we seek to understand the long-term collaborative work that goes into producing the evidence used in disinformation campaigns, and how that work is mobilized in the legal sphere.

The specific work of this DRG is threefold. The bulk of our energy will be spent reading and systematically annotating the citations of the expert testimonies submitted in this case, and continuing this process into the past to create a genealogy of information sources regarding transgender healthcare over the last decade. The second part of this work will be, over time, to create qualitative memos recording our developing insights as we annotate this literature. The third part will be to conduct a weekly journal club where we read and discuss prior published work relating to transphobic and scientific disinformation.

We’re interested in students with all sorts of backgrounds and experience levels, who are passionate about this topic and interested in studying issues of disinformation from a research-driven perspective.


Winter 2024

Dehumanizing and Problematic Imagery about Latin American* Migration in the 2024 Presidential Election

Run by: Nina Lutz, PhD Student, HCDE
Supervisor: Professor Kate Starbird

This DRG aims to develop methods, research questions, literature reviews, and tools to analyze rumors in real time during the 2024 U.S. Election Cycle. In particular, this DRG will focus on visual media (memes, short- and long-form videos, photos, infographics, etc.) during the 2024 Primary Elections. To limit our problem space, we will center questions on visual media that aims to dehumanize and spread rumors about and within intersectionally marginalized migration populations, particularly Latin American migrants and refugees.

Students will first work together to create a literature review of related work. Then they will divide into two groups to design and embark upon either a primarily qualitative or primarily quantitative research study. Students will have an opportunity to continue this research in Spring and Fall of 2024 to complete the research study. Our goal is that each research study will be published at an academic conference as a paper or poster. 

Capacity: 8 students 

Student Requirements:

  • Strong interest in visual media
  • Strong interest in mis/disinformation
  • Strong interest in social media
  • Basic understanding of, or willingness to learn about, the American presidential election and the structure of American legislative government
  • Power users of social media (TikTok, Instagram, X, etc.) are encouraged to apply
  • Spanish, Hebrew, or Arabic language proficiency is a huge plus but not a requirement

Questions? Email ninalutz@uw.edu 

* For this DRG, we will focus on migration and refugee populations, particularly from Latin America (including Haiti and the Caribbean Islands), but we are open to other migrations based on real-world events (e.g., individuals from the Middle East).


Autumn 2023

Analyzing TikTok user behavior changes

This DRG will be run by HCDE PhD Student Joey Schafer with guidance from Professor Kate Starbird.

In this DRG, we will analyze social media trace data from TikTok users to understand how their use of the platform has changed.

We are looking for 2-4 new students who have experience with social media (especially TikTok), qualitative coding, and/or visual data analysis for this 2-credit DRG. Students from all academic levels are invited to apply and participate in this project. DRG students will meet for approximately 1 hour weekly and are expected to contribute approximately 5 additional hours per week. The beginning of the DRG will focus on qualitative coding of previously collected data; further analysis, as well as other components of the research process such as interviewing, cleaning transcripts, analyzing transcripts, and writing up research findings, will occur in the later portion of the DRG.

If you have any questions about this project, please contact schaferj@uw.edu or kstarbi@uw.edu.


Autumn 2023

Negative Affect Research Group

Led by HCDE PhD Candidate Andrew Beers and advised by Associate Professor Kate Starbird.

Social media metrics privilege “positive” affect. Famously, Facebook, Twitter, TikTok, and other platforms do not have “dislike” buttons, and consequently many social media datasets available to researchers are concerned with positivity: likes, shares, follows, endorsements, etc. And yet public debate often centers around social media’s potential for negative affect. Many are concerned about the extent and severity of targeted social media harassment, which terrorizes individuals and can mute the online expression of entire communities. Group expressions of disgust, contempt, or ridicule seem to typify everyday interaction on social media platforms, causing a moral panic around “cancel culture” for some and a discourse around the beneficial effects of shaming and consequences for others. Some platform users seem to intentionally cultivate negative affect, transforming their controversial statements into social capital via the influencer economy. Even the utopian dream, rarely reached, of the internet as a forum for productive debate presupposes that disagreement would be a common feature of its everyday usage.
 
This research group seeks to understand how negative affect is expressed on popular social media platforms, what the consequences of those expressions are, and popular debates on the prospects of designing for (or against) negative affect. Half of our time will be committed to reading and discussing published research that investigates negative affect on social media from a variety of methods and perspectives. The other half will be committed to annotating a large dataset of quote-tweet interactions on Twitter between popular United States political accounts, including politicians, journalists, activists, influencers, and media outlets. By annotating quote-tweet interactions, frequently noted for their negative affect, we aim to both A) develop a repeatable codebook for, and a deeper understanding of, negativity online, and B) generate a seed dataset that can be used to automatically classify future data and better understand the qualities of negativity at scale.


Spring 2023

Cross-platform influencers research group

We are looking to recruit 3-4 students to participate in a Directed Research Group led by Center for an Informed Public PhD students Kristen Engel and Morgan Wack, and advised by Professors Kate Starbird (HCDE) and Emma Spiro (iSchool). 
 
This project will examine how misinformation about elections spreads across social media platforms. Specifically, we will focus on the accounts of individual users that spread large quantities of false and misleading information during the 2020 and 2022 elections to study how their posts differ depending on the social media platform they are using. Students will work with CIP researchers to manually identify and categorize social media posts and comments via qualitative coding methods. 
 
Students would enroll in HCDE 496 (for undergraduates) or HCDE 596 (for graduate students) for 2 credits during the spring quarter and be expected to spend 6 hours per week on the project. Interested students should complete the application form below. Selected students will be given an add code for the course. You can find a description of the study aims and desired student qualifications below.
 
Project description (2 credits/6 hours per week) 
Despite growing user bases and influence in American politics, limited data currently exists regarding the spread of misinformation outside of mainstream platforms. This project looks to fill this gap by collecting and comparing the actions of “repeat spreaders” of misinformation across four text-based social media platforms: Truth Social, Parler, Gettr, and Twitter. To generate novel insights on both the differences in the actions of these repeat spreaders and their interactions with distinct platform users, this project will leverage new data from repeat spreaders of misinformation on Twitter linked to the 2020 and 2022 U.S. Elections. Using this data, we will collectively be able to answer several interesting questions, including:

  • How does the content and subject matter of prominent spreaders of misinformation differ by platform?
  • How does engagement with misinformation-linked posts differ based on the platform policies and user bases?
  • Do different strategies implemented by prominent spreaders predict the virality of misinformation across platforms?
  • How do users on non-mainstream platforms engage with misinformation compared to mainstream social media? 

Qualifications: Students with prior qualitative coding experience, experience developing qualitative coding schemas, and a general understanding of online misinformation, U.S. politics, or non-mainstream social media platforms will be strong candidates for this project.


Spring 2023

Negative Affect Research Group

Led by HCDE PhD Candidate Andrew Beers and Associate Professor Kate Starbird

Social media metrics privilege “positive” affect. Famously, Facebook, Twitter, TikTok, and other platforms do not have “dislike” buttons, and consequently many social media datasets available to researchers are concerned with positivity: likes, shares, follows, endorsements, etc. And yet public debate often centers around social media’s potential for negative affect. Many are concerned about the extent and severity of targeted social media harassment, which terrorizes individuals and can mute the online expression of entire communities. Group expressions of disgust, contempt, or ridicule seem to typify everyday interaction on social media platforms, causing a moral panic around “cancel culture” for some and a discourse around the beneficial effects of shaming and consequences for others. Some platform users seem to intentionally cultivate negative affect, transforming their controversial statements into social capital via the influencer economy. Even the utopian dream, rarely reached, of the internet as a forum for productive debate presupposes that disagreement would be a common feature of its everyday usage.

This research group seeks to understand how negative affect is expressed on popular social media platforms, what the consequences of those expressions are, and popular debates on the prospects of designing for (or against) negative affect. Half of our time will be committed to reading and discussing published research that investigates negative affect on social media from a variety of methods and perspectives. The other half will be committed to annotating a large dataset of quote-tweet interactions on Twitter between popular United States political accounts, including politicians, journalists, activists, influencers, and media outlets. By annotating quote-tweet interactions, frequently noted for their negative affect, we aim to both A) develop a repeatable codebook for, and a deeper understanding of, negativity online, and B) generate a seed dataset that can be used to automatically classify future data and better understand the qualities of negativity at scale.


Spring 2023

Understanding TikTok User Behavioral Changes After Sudden Bursts of Increased Attention

Led by HCDE PhD Student Joey Schafer with guidance from Professor Kate Starbird

Current social media platforms like TikTok facilitate sudden convergence on particular users or videos, giving them a much larger audience than they were previously accustomed to. This sudden increase in attention can be quite disorienting for users, who are suddenly thrust into a much more visible, public online space. In this DRG, we will be interviewing U.S. TikTok users who have experienced what they self-identified as viral events, in order to understand what this experience was like and the impacts that this had on their use of social media platforms.

We are looking for 2-4 students who have experience with social media (especially TikTok), interviewing, and visual data analysis for this 2-credit DRG. Students from all academic levels are invited to apply and participate in this project. DRG students will meet for approximately 1 hour weekly and are expected to contribute approximately 5 additional hours per week, through activities such as background readings, participant scheduling and interviews, transcribing interview recordings, and memoing on themes found in interviews.


Winter - Spring 2023

Examining the Spread of Election Rumors Online

Led by Sukrit Venkatagiri (Postdoctoral Scholar), Emma Spiro (iSchool), and Kate Starbird (HCDE)

The 2022 midterm elections were the focus of a wide range of rumors and conspiracy theories. In Autumn, in a first stage of research, our team at the Center for an Informed Public identified hundreds of different claims about the election that were false, misleading, and/or unsubstantiated. Now, in a second stage, our team aims to classify these different claims, identify social media posts from a variety of platforms related to these claims, and analyze social media content around these claims to answer a variety of research questions about how rumors spread online during the 2022 election period.

We are looking for students with a range of different skills. First and foremost, all students must have familiarity with social media, an interest in the processes and procedures around elections, and a willingness to engage in qualitative analysis of social media posts. We are also looking for students who, in addition to those interests/skills, have experience with data science (writing code to analyze data), network science, visualization, and statistical/machine learning. We encourage students with journalism and political science backgrounds to apply as well.


Spring 2022

Investigating Content Integrity and Disinformation Risks Across Wikipedia Language Editions

Facilitated by HCDE PhD student Zarine Kharazian, with guidance from faculty advisors Kate Starbird and Benjamin Mako Hill 

This directed research group will conduct a qualitative interview study to better understand “content integrity” risks across Wikipedia language editions, particularly non-English Wikipedia projects. Specifically, we are interested in understanding whether some Wikipedia language editions are more vulnerable to disinformation campaigns and ideologically-motivated editing than others, and why. 

A small team of students will conduct semi-structured interviews with Wikipedia stakeholders and community members, including editors of specific language editions and contributors involved with various cross-wiki monitoring activities. Students will also transcribe and qualitatively code the interviews. While interviews will be conducted in English, students with foreign language skills are strongly encouraged to apply, as there may be opportunities to supplement the interview data with analyses of digital trace data from various Wikipedia language projects.

Students will conduct 1-2 interviews a week with participants over Zoom (scheduling of interviews TBD). Most interviews will last about one hour. Additionally, the DRG will meet in person once a week for 1-2 hours. Outside of meeting times and interviews, the expected time commitment per week is approximately 3 hours — for a total of 6 hours per week. Students should register for 2 credits of HCDE 496/596.

We are looking for students with a range of skills. This DRG would be a great fit for those who have one or more of the following: 

  • Experience with qualitative interviewing
  • Foreign language skills (specifically strong reading ability in a language other than English)
  • Interest in or experience with Wikipedia or other peer production platforms
  • Background in political science, communication, or information studies

Spring 2022

Evaluating Disaster Adaptation through Coding Short Form Internet Videos

This research group is studying social media data from the 2021 Texas Freeze Power Crisis to determine how people adapt to natural disaster events. This DRG is open to both undergraduate and graduate students. Students will qualitatively code short-form Internet videos, such as TikToks and Instagram stories, shared in tweets posted during the 2021 Texas Power Crisis. We are looking for students who have some experience qualitatively coding text or other data (experience coding videos is not required). We will meet a total of 1-2 hours per week. Outside of meeting times, the expected time investment per week is approximately 4 hours. This DRG will be 2 credits.

Please note that the videos students will be coding may potentially include distressing content of people experiencing a crisis event. Please keep this in mind if you decide to apply for this DRG. 


Winter 2022

Evaluating Disaster Adaptation through Coding TikTok Videos

This research group will analyze social media data from the 2021 "Texas Freeze" Power Crisis to determine how people adapt to disaster events. This DRG is open to both undergraduate and graduate students. Students will develop a coding scheme and qualitatively code TikTok videos (shared in tweets) from the 2021 Texas Power Crisis. Some students may also assist with automating the detection of TikTok videos in tweets. We are looking for students who have some experience qualitatively coding text or other data (experience coding videos is not required). Some experience with object recognition in videos would be a plus, but is definitely not required. Outside of meeting times (~2 hours per week), the expected time investment per week is approximately 4 hours. The course will be 2 credits.

Please note that the videos students will be coding may potentially include distressing content (e.g. of people experiencing a crisis event). Please keep this in mind if you decide to apply for this DRG. 

This DRG is closed for autumn and no longer accepting applications. If you have questions, please email Shengzhi Wang at shengzw@uw.edu or Alexa Schlein at alexa412@uw.edu.


Winter 2022

Designing for “Rapid Response” to Electoral Misinformation

This directed research group will design innovative methods and systems to detect and categorize online claims about election integrity. This work is situated within a larger project that aims to rapidly detect, collect, process, and analyze public data from Twitter and other online sources — to uncover misinformation and influence operations related to elections. A primary goal of this DRG is to design processes and tools to support this work going forward. We are looking for students with a range of skills, but most important is a willingness to engage closely with the data (i.e. read and categorize a large number of social media posts) and a comfort working with technology that is still in development. Experience with qualitative coding and human-centered design methods is a plus, but not required. Power users of social media are welcome. Students with an interest in journalism, political science, policy, discourse analysis, online activism, and the design of social media platforms are all encouraged to apply. This DRG is at capacity and no longer accepting applications.


Autumn 2021

Misinformation and the 2020 U.S. Election

This directed research group will study how people mobilized around misinformation during the 2020 U.S. election cycle. Students will work in a small team to qualitatively code and quantitatively analyze public data collected from Twitter. We are looking for students with a range of skills, but most important is a willingness to engage closely with the data (i.e. read and categorize a large number of social media posts). Experience with qualitative coding is a plus, but not required. Students with an interest in journalism, political science, policy, online activism, and the design of social media platforms are all encouraged to apply. The DRG will take a hybrid approach to in-person and online work and learning. We will meet in person at the beginning of the quarter and will have the option to switch to remote work as the quarter progresses.


Winter 2021

Tracking The Rise and Fall of Mask-Related COVID-19 Theories

Led by

  • Andrew Beers, 2nd-year PhD student in HCDE
  • Sarah Nguyen, 1st-year PhD student in the iSchool
  • With guidance from faculty advisers Kate Starbird, HCDE, and Emma Spiro, iSchool.

What:

The Center for an Informed Public at UW has been archiving posts from Twitter users engaging in arguments about whether or not to wear a mask. We’ve spent this last year qualitatively analyzing a subset of ~5,000 posts from these arguments, to understand the many different theories that Twitter users are employing to argue for or against public mask mandates. We found that users have a variety of theories about masks — about their ability to block virus particles, their potential harm to users, when they should be used, and more — and defend these theories aggressively using the language of science and a wide variety of external links, images, and videos.

This Winter quarter, we would like to expand this project to our full dataset, which currently numbers in the tens of millions of posts and is growing every day. Specifically, we’re aiming to use automated techniques to classify this larger tweet dataset of arguments into the theories we identified in the first stage of this project. We want to see how the popularity of certain theories about masks changed over time, and whether they responded to external events, such as the publication of a new scientific paper, a change in the severity of the pandemic, or a comment by a politician.
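
As a rough illustration of the kind of automated classification described above, the sketch below trains a simple text classifier on the hand-coded tweets and applies it to the larger archive. This is a minimal, hypothetical example only: the file names, column names ("text", "theory"), and model choice are assumptions for illustration, not the project's actual pipeline.

    import pandas as pd
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    # Hypothetical files: the ~5,000 hand-coded tweets and the full archive.
    coded = pd.read_csv("coded_tweets.csv")        # columns: "text", "theory"
    archive = pd.read_csv("mask_tweets_full.csv")  # column: "text"

    # A simple baseline: TF-IDF features feeding a linear classifier.
    model = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2), min_df=5),
        LogisticRegression(max_iter=1000),
    )

    # Check how reliably the stage-one codes can be recovered before scaling up.
    scores = cross_val_score(model, coded["text"], coded["theory"], cv=5)
    print(f"5-fold accuracy: {scores.mean():.2f}")

    # Label the full archive so each theory's prevalence can be tracked over time.
    model.fit(coded["text"], coded["theory"])
    archive["predicted_theory"] = model.predict(archive["text"])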

Who:

We’re looking for up to three students with an interest in public health communication, misinformation, and/or natural language processing. We are interested in students familiar with Python and with the Twitter platform, but we strongly encourage students without prior experience in either of these areas to also apply.

What You’ll Be Working On:

We’re looking for a relatively small team, and we can see several tasks for this project over the course of the quarter:

  • Qualitative coding of tweets according to a pre-existing protocol
  • Case-study analysis of events causing changes in the prevalence of different theories
  • Analysis and visualization of data produced during the course of this project

Expectations/Commitment

  • Attend either one 2-hour or two 1-hour meetings each week; time TBD based on registrants' schedules.
  • Work 6 hours per week outside of class meetings.
  • Register for 3 credits of HCDE 496/596.