When: Apr 25 2024 @ 10:30 AM
Where: Hackerman B-17
Categories: Computer Science Seminar Series

Refreshments are available starting at 10:30 a.m. The seminar will begin at 10:45 a.m.

Abstract

We have begun grappling with difficult questions related to the rise of AI, including: What rights do individuals have in the age of AI? When should we regulate AI and when should we abstain? What degree of transparency is needed to monitor AI systems? These questions are all concerned with AI accountability. In this talk, Sarah Cen discusses the two components of her research on AI accountability and illustrates them through a case study on auditing social media.

Within this case study, she will focus on how social media platforms filter (or curate) the content that users see. In particular, Cen will propose a way to implement regulations on social media that is compatible with free speech protections and Section 230. She will then present a way to test whether a content curation algorithm complies with regulations, producing what she calls a “counterfactual audit.” In studying the properties of this approach, she will show that it has strong theoretical guarantees, does not violate user privacy, and uses only black-box access to the algorithm (thereby requiring minimal access to proprietary algorithms and data). She will demonstrate how this audit can be applied in practice using LLMs on a live social media platform.
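The details of the audit are the subject of the talk itself; as a rough illustration of what a black-box, counterfactual test of a curation algorithm might look like, the Python sketch below queries a hypothetical curation function under a user's real profile and under a counterfactual profile, then compares the resulting feeds. Everything here (the curate function, the toy catalog, the total-variation comparison) is an assumption made for illustration, not Cen's actual method.

```python
# Hypothetical sketch of a black-box "counterfactual audit."
# The curate() function and the comparison metric are illustrative
# assumptions; they are NOT the method presented in the talk.
import random
from collections import Counter

def curate(profile, catalog, k=5, seed=None):
    # Stand-in for a proprietary curation algorithm; the auditor can only
    # call it and observe its output (black-box access).
    rng = random.Random(seed)
    weights = [3.0 if item["topic"] in profile["interests"] else 1.0
               for item in catalog]
    return rng.choices(catalog, weights=weights, k=k)

def counterfactual_audit(curate_fn, profile, counterfactual_profile,
                         catalog, trials=200):
    """Query the black box under a real and a counterfactual profile and
    compare the topic distributions of the curated feeds. A total-variation
    distance is used here for simplicity; a real audit would need a
    calibrated statistical test with guarantees."""
    def topic_freqs(p):
        counts = Counter()
        for t in range(trials):
            for item in curate_fn(p, catalog, seed=t):
                counts[item["topic"]] += 1
        total = sum(counts.values())
        return {topic: c / total for topic, c in counts.items()}

    f_real = topic_freqs(profile)
    f_cf = topic_freqs(counterfactual_profile)
    topics = set(f_real) | set(f_cf)
    return 0.5 * sum(abs(f_real.get(t, 0.0) - f_cf.get(t, 0.0))
                     for t in topics)

# Toy example: flip a single attribute of the user profile and measure how
# much the curated feed shifts in response.
catalog = [{"id": i, "topic": t} for i, t in enumerate(
    ["politics", "sports", "politics", "music", "politics", "sports"])]
real = {"interests": {"politics"}}
cf = {"interests": {"sports"}}  # counterfactual: one attribute changed
print(f"TV distance between feeds: "
      f"{counterfactual_audit(curate, real, cf, catalog):.3f}")
```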

Speaker Biography

Sarah Cen is a final-year PhD student in the Massachusetts Institute of Technology’s Department of Electrical Engineering and Computer Science, where she is advised by Professors Aleksander Mądry and Devavrat Shah. Cen uses methods from machine learning, statistical inference, causal inference, and game theory to study responsible computing and AI policy. She has written about social media, trustworthy algorithms, algorithmic fairness, and more. She is currently interested in AI auditing, AI supply chains, and the intellectual property rights of data providers.

Zoom link >>