Integrity vs. Innovation: How AI is Redefining Documentary Filmmaking

Rachel Elizabeth Seed, director of the new doc A Photographic Memory, dives into the most pressing current topic in non-fiction cinema.

When director Morgan Neville brought Anthony Bourdain’s journals to life with an AI version of Bourdain’s voice in his 2021 documentary, Roadrunner: A Film About Anthony Bourdain, it sparked a scandal and an ethics debate that reverberated throughout the film industry. Three years later, that response seems almost quaint, given the rapid evolution of Generative AI (GAI) tools and their use within our industry. What then seemed like a rare breach of the journalistic standards of truth has become a top concern in the documentary field.

I have been thinking about GAI in documentaries for a few years, since grappling with using AI audio in my own film, A Photographic Memory, in which I get to know my mother, who died when I was a baby, through the archive she left behind as a writer, photographer and filmmaker. The film begins as a more straightforward search for materials and memories about my mom, evolving into an expression of my imagination, as I begin to seemingly converse with her. To achieve this, I either needed to find existing archival sound bites of her asking questions that I could repurpose for these “conversations,” or I would need to create an AI-generated voice. In the case of Roadrunner, Bourdain actually wrote the words that Neville had him say with AI. In my mother’s instance, I would be putting words in her mouth, so to speak. I ultimately decided to use the hours of audio she created for these conversation sequences. However, in the process, I thought deeply about the meaning and importance of truth and authenticity in documentary art, as well as the responsibility to let audiences know when we take liberties to enhance our stories. (Incidentally, I used AI to help generate the title of this article. Thought you should know!)

A shot from Morgan Neville’s Roadrunner: A Film About Anthony Bourdain.

With the rapid, exponential development of GAI, there has been a collective concern about establishing ethical parameters for its use in documentary. In November of last year, the newly formed Archival Producers Alliance (APA), helmed by seasoned archival producers Rachel Antell, Jennifer Petrucelli and Stephanie Jenkins, penned an open letter to sound the alarm on the potential dangers of a lack of transparency when GAI is employed in non-fiction films. According to the letter, within documentaries “is the implicit promise to the audience that what is presented as authentic media, is in fact authentic,” and use of GAI without transparency creates the “danger of forever muddying the historical record.” While their concerns and guidelines are vital in leading the way for current and future practices, whether filmmakers and corporations will adhere to them, and the consequences we face if they do not, remain to be determined.

Award-winning filmmaker and cameraperson Kirsten Johnson (and, full disclosure, executive producer on A Photographic Memory) gave a mind-expanding talk about GAI in image-making and filmmaking at IDA’s Getting Real conference in Los Angeles this April. Following her presentation, I spoke with her and other documentary thought leaders to gather their hopes, concerns and ideas for best practices, while also reviewing the APA’s newly issued Guidelines for GAI Use.

An image from Rachel Elizabeth Seed’s (AI-free) documentary A Photographic Memory.

Johnson, who has spent decades creating images as a documentary cinematographer (including on the Oscar-winning Citizenfour) and more recently as director of the groundbreaking films Cameraperson and Dick Johnson Is Dead, has been contemplating the introduction of AI imagery and what it will mean for filmmakers and for our culture at large: “Never in the scope of human history has the global circulation of images happened at the speed and scale it is now; images that are generated by AI in the absence of both human bodies and minds.” The problem, she says, is that an entity acting without a body or mind, particularly one programmed with existing human prejudices, can be morally vacant. “[AI] doesn’t understand human stakes. It’s free to not care about us,” states Johnson. So what is at stake when an unscrupulous intelligence, possibly motivated by human capitalistic gain, is employed?

Recently, amid a string of resignations from OpenAI’s alignment team, Jan Leike, who co-led the team tasked with ensuring AI systems follow human intent, stated on X: “Building smarter-than-human machines is an inherently dangerous endeavor. OpenAI is shouldering an enormous responsibility on behalf of all of humanity. But over the past years, safety culture and processes have taken a backseat to shiny products.” This tension between technological advancement and ethical responsibility is already impacting the documentary world.

For instance, Netflix was accused of doctoring an image in its true crime film What Jennifer Did, a documentary about Jennifer Pan, who was found guilty of orchestrating a “kill-for-hire” conspiracy against her parents. In the film, we see images of Pan smiling and giving the peace sign; however, the pictures appear to have the usual aberrations of GAI – distorted fingers, elongated teeth and odd background effects. Because the accused subject is a real person, this representation (or potential misrepresentation) of her could affect her legal case as well as an audience’s understanding of her character and of the story. Unlabeled GAI in nonfiction can have unfavorable real-life consequences, and it is breaches like these that prompted the Archival Producers Alliance to develop GAI best practices, as “GAI creating works indistinguishable from true historical material” could easily undermine the authenticity that is foundational to documentary filmmaking.

A marked-up version of an image from What Jennifer Did, showing the tell-tale signs of AI.

Independent filmmakers, already battling for transparency, rights and fair pay, now face the additional challenge of AI. Last year’s Hollywood strikes highlighted concerns about AI threatening livelihoods and intellectual property. Beyond the economic impacts, AI also risks compromising our historical records and audiences’ trust in documentary storytelling.

“As these forces push us to monetize images, the forces that push people with power to use images to control other people’s behavior … that’s going to be weaponized against people,” Johnson speculated in her talk. Political campaigns and commercial advertising have always used imagery (in some cases, propaganda) to sway people to vote and to buy. Now, as we face high-stakes elections, it is all the more daunting to consider the ease with which unethical actors can use AI to create false imagery.

Filmmaker Cecilia Aldarondo, director of You Were My First Boyfriend, currently streaming on Max, worries that capitalism is taking precedence over ethics in art, especially in nonfiction filmmaking. “I’ve been resistant to dabbling in [AI],” she says, “because I’m troubled by how profit motive seems to be trumping the way it is unfolding. Just because something is technologically possible doesn’t mean it’s ethical or even a good idea.”

“The value of art is in the capacity to translate to each other the nature of our interior experience,” Johnson shared. Thus, AI’s role in documentary should be to support, not replace, authentic storytelling.

Kirsten Johnson giving her keynote speech at Getting Real earlier this year.

As a filmmaker who values direct interaction with audiences and handmade set pieces, Jeanie Finlay – director of the 2023 doc hit Your Fat Friend – wonders how AI might affect the originality of human art: “The thing I’m concerned about is more homogeneity. There are already enough restrictions in place if you’re working for streamers.” Here, Finlay refers to a documentary landscape in which only the films deemed most commercial – those focused on celebrity, true crime or sports – get picked up. The concern is that AI could exacerbate this already restrictive creative and commercial landscape.

Johnson echoes Finlay’s perspective: “I think all of us are deeply invested in questions of ethics and representation and how does cinema survive in the 21st century in a way that’s meaningful.” But, she continues, “we’re up against something that is not human. So how do we take care of each other, how do we take care of ourselves, how do we keep finding ways to support each other making meaningful work?”

Perhaps the answer lies in what we independent filmmakers do best: to continue, individually and collectively, to tell the stories we are passionate about, holding ourselves and each other accountable in community, while also advocating for streamers and other distributors to adhere to a new industry standard that continues to be a work-in-progress.

Rachel Elizabeth Seed is an LA-based nonfiction storyteller working in film, photography, and writing. Her debut feature film, A Photographic Memory, premiered at True/False 2024 and was cited as “one of the best docs of the year” by RogerEbert.com and “an ingenious, meta doc” by Variety. Rachel’s work has been supported by the Sundance Institute, Chicken + Egg Pictures, NYFA, Field of Vision, the Jerome Foundation, NYSCA, the Maine Media Workshops, the Roy W. Dean grant, the Jewish Film Institute, Jewish Story Partners, and IFP/Gotham Labs, among many others. Formerly a photo editor at New York Magazine, she has exhibited her photography at the International Center of Photography and was a cameraperson on several award-winning feature documentaries, including Sacred by Academy Award-winning filmmaker Thomas Lennon. Rachel’s writing has been published by No Film School, the Sundance Institute and Talkhouse.