
Alan Marks

Autumn 2025

Emerging Minds: Consciousness in Humans, Animals, AI

Are AI systems conscious, sentient, self-aware, or intelligent? What defines artificial general intelligence (AGI), and how does AGI differ from superintelligence? What effects will these developments have on society and on human-centered design and engineering?

In this Directed Reading Group (DRG), we examine current theories of consciousness and intelligence, including in non-human species, to inform these questions. Readings will span foundational cognitive science texts, cutting-edge research papers, and select popular-media analyses.

Learning Outcomes

By quarter’s end, you will be able to:

  • Compare major theories of consciousness and their implications for AI.
  • Evaluate arguments for and against the possibility of machine sentience.
  • Articulate potential societal impacts of AGI and superintelligence.
  • Formulate and defend your own position on AI futures.

Course Structure and Expectations

  • Discussion: Weekly, one-hour seminar.
  • Reading Submissions: Two hours of reading per week; submit raw reading notes before each class.
  • Leadership: Starting week 2, each student facilitates one discussion.
  • Final Project: Reflective essay (1,000–1,500 words) synthesizing course materials and class discussions.
  • Preparation: Completion and timely submission of reading notes are mandatory.

Application Note

Space is limited. Applicants should be prepared for an intensive, discussion-driven seminar; please apply only if you are ready to complete the readings and engage actively in discussion.

Enrollment Information

  • Meeting time: Wednesdays, 2:30–3:30 p.m.
  • Credits: 3 hours
  • Who should apply: HCDE students will be prioritized, but students from any department or college are welcome to apply. No prior experience with the topic is required.
  • To apply: Please complete this Google Form by August 31 at midnight.
  • Anticipated notification date: September 8, 2025
  • Questions? Email Alan Marks (skrama@uw.edu)
     
