
Building a case for A.I.


Sandra Ristovska in the Wolf Law courtroom. Ristovska is spending the academic year at Stanford’s Center for Advanced Study in the Behavioral Sciences as she fleshes out her research into visual evidence and the U.S. justice system. Photo by Kimberly Coffin.



The first day of classes at CU Boulder this fall was also the day Sandra Ristovska got the keys to her office—or study, as such spaces are known at Stanford University’s Center for Advanced Study in the Behavioral Sciences.

Unsurprisingly, she sounds much like a new student herself, excited about having so much to look forward to and full of energy and enthusiasm about what awaits her. (Like a new CU student, she’s quick to gush about the views, which in her case include forests, palm trees and dramatic overlooks of Silicon Valley.)

“I am surrounded by people who are at the top of their fields, working in areas like artificial intelligence, democracy and equality, immigration, the environment. It’s incredible,” said Ristovska, associate professor of media studies and director of the college’s Visual Evidence Lab.

Being selected as a fellow at the center is a high honor. Among its alumni, CASBS counts a host of Nobel, Pulitzer and MacArthur winners, along with such luminaries as Ruth Bader Ginsburg and George Shultz, U.S. secretary of state under Ronald Reagan.

Inclusion in such company would be distinction enough, but at the outset of her yearlong residency, Ristovska learned she had been awarded the Leonore Annenberg and Wallis Annenberg Fellowship in Communication at CASBS.

It’s a full-circle moment for Ristovska, who earned her PhD from the Annenberg School for Communication at the University of Pennsylvania; she said it was “very meaningful and very special” to get an endowed fellowship from the family.

CASBS is renowned for providing a home for scholars engaged in pioneering research into complex contemporary problems. The interdisciplinary nature of each class of fellows encourages the kinds of stimulating conversations that help push researchers outside their niches and make broader connections to major societal challenges.

Ristovska is counting on that cross-pollination to help her in drafting her next book, tentatively titled Deepfaking Images, which will offer a legal and social history of the use of technology to manipulate evidentiary media.

New twist on an old problem

Although the use of generative A.I. to distort real images or cook up fake videos is certainly a contemporary challenge (one the Visual Evidence Lab is examining in depth), it is just the latest tool in a problem going back more than a century. Video can be sped up or slowed down to distort its meaning, for instance, while photo manipulation is as old as photography itself.

What interests Ristovska about the use of visual assets in court is what such evidence indicates about access to justice.

“Oftentimes, the best-resourced party has the language and ability to use or challenge this type of evidence when it’s presented against them—or to hire videographers or software experts to present such evidence in the first place,” she said. “In criminal cases, this tends to tilt the scales in the prosecution’s favor.”

Published works of CASBS fellows are permanently stored in the center’s Tyler Collection; when completed, Ristovska’s book will be among them. It’s fitting, since her work is already benefiting from interactions with other fellows.

“We have lunch every day with the other fellows, and of course we all ask each other what it is we do,” she said. “It’s invigorating to tell people about my work, hear their excitement about it and also listen to their ideas for how the different things they focus on might get me to think differently about my book.”


Joe Arney covers research and general news for the college.