CPH:DOX explores the use of AI in documentary filmmaking
- Filmmakers, policy experts and artists unpacked power, authorship and responsibility in the age of AI, arguing for a shift from passive use of tools to active design of storytelling systems

At CPH:DOX's CPH:CONFERENCE, the panel “Rekindling the Machine: Documentary in the Age of AI” moved beyond familiar utopia-versus-dystopia narratives to examine the structural questions shaping AI’s role in nonfiction storytelling. Rather than focusing on tools alone, the discussion centred on power, governance and authorship, positioning filmmakers not merely as users of technology but as potential architects of alternative systems.
Moderated by Kamal Sinclair, the session brought together artist Anna Engelhardt, public-interest technologist Julia Kloiber, filmmaker Marc Silver, and Denmark’s Tech Ambassador Anne Marie Engtoft Meldgaard. Framing the discussion, Sinclair identified two dominant crises underpinning current debates: truth and ownership. “We are being asked, at the same time, what is real and who owns what we create,” she noted, grounding the conversation in both philosophical and material stakes.
Kloiber pointed to power concentration as a fundamental flaw in today’s AI ecosystem. According to her, decision-making remains in the hands of a small group of tech leaders whose ideological frameworks often shape global infrastructures. “Democracies are treated as hurdles to innovation,” she argued, stressing the need to decentralise both technological development and governance. Meldgaard expanded on this, highlighting a lack of intentionality in how technologies are designed. “We have outsourced our collective imagination to a handful of people with similar backgrounds,” she said, warning that such homogeneity inevitably produces systems misaligned with broader societal values.
For Engelhardt, a key issue lies in how technology is framed as inevitable. She challenged the notion that AI is a fixed reality, arguing instead that it remains a process still open to intervention. Drawing parallels with car-centric urban planning, she noted how long it can take to reverse harmful design choices once they become embedded. “We should be careful not to speak about AI as if it has already happened,” she said, advocating for a more active role in shaping its trajectory.
Silver approached the question through storytelling, focusing on how to make invisible systems perceptible. His latest documentary explores what he described as “algorithmic violence,” examining the human cost of opaque digital infrastructures. “We are dealing with something that knows everything about us, while we know nothing about it,” he said. In response, his work uses visual strategies, including LiDAR scans and spatial reconstruction, to render these abstract systems tangible. For Silver, the issue is not technological failure but efficiency. “These systems often work exactly as designed, and that is where the danger lies,” he added.
The discussion repeatedly returned to accountability. While regulatory efforts such as the EU’s Digital Services Act were acknowledged, panellists agreed they fall short of addressing the scale and complexity of the issue. Meldgaard described current policy responses as shaped partly by political urgency, particularly around young users and social media. However, she warned against simplistic solutions. “This is not just about age limits or bans. It is about redesigning systems with entirely different purposes in mind,” she said, contrasting platforms built for private communication with those optimised for amplification and engagement.
Silver also challenged prevailing narratives around regulation, suggesting that debates often misplace responsibility. “We talk about restricting users, rather than addressing the systems themselves,” he noted, calling for a shift towards treating harmful platforms as defective products requiring structural reform.
Beyond critique, the panel explored emerging alternatives. Kloiber highlighted the importance of public-interest technology initiatives, including open-source tools and publicly funded prototypes that prioritise societal needs over profit. She pointed to experimental models where communities retain control over data and decision-making processes, arguing that such approaches could counterbalance dominant commercial frameworks. “We need to start with the problem, not the technology,” she said, cautioning against solution-driven innovation detached from real-world contexts.
Similarly, Silver described experiments with alternative AI systems that allow creators to control both inputs and outputs, preserving the integrity of source material and avoiding extractive data practices. These models, he suggested, could reframe AI as a collaborative tool rather than a mechanism of appropriation.
Engelhardt, meanwhile, emphasised the importance of visualising hidden infrastructures, from data systems to geopolitical networks. Her work examines how digital and physical architectures intersect, revealing how power operates across both domains. “The boundary between the visible and invisible is constantly shifting,” she said, positioning cinema as a key medium for interrogating that boundary.
The conversation concluded with a call for agency. Kloiber rejected the idea of AI as an inevitable force, describing it instead as a narrative shaped by industry interests. “These technologies are influenced by the stories we tell about them,” she said, suggesting that alternative imaginaries could lead to different outcomes. Meldgaard echoed this sentiment, urging both individuals and institutions to take responsibility in shaping technological futures, whether through policy, investment or everyday choices.
Ultimately, the panel argued that the debate around AI is less about machines than about the structures behind them. As Silver summarised, “This is not a story about technology. It is a story about power.”