
Cannes 2020 – Marché du Film

Industry report: Virtual reality

Adaptive cinema aims to personalise storytelling in the viewer-centric age


CANNES NEXT: LMDP Co is developing the first biofeedback horror film, which adapts its emotional experience in real time based on viewers' brain signals and facial expressions

VFC by Charles R. Roy


The final panel of Cannes NEXT, “Adaptive Cinema: Creating and Delivering Stories for the Viewer-centric Age”, was fully in keeping with the scope of this section of the Marché du Film Online, as adaptive cinema lies precisely at the intersection of creativity and tech. Quebecois filmtech studio LMDP Co is currently developing VFC, written and directed by Charles R Roy, the first horror feature to adapt its storytelling to the viewer’s biological feedback; it is slated for release in 2021-22.


Charles R Roy, who is also a director and producer at LMDP Co and at La maison de prod (Ravenous), opened the presentation and introduced the idea of adaptive cinema through biomechanisms. Explaining his inspiration from films where the score plays a predominant role, Roy showed a clip of VFC and presented the evolution of the user-centric age of entertainment. He started with linear, passive activities (cinema and radio), moved on to interactive experiences (action-based multimedia), and finally looked at the future of adaptive experiences that evolve in real time and are applied through smart/biotech and AI.

He also used some examples to demonstrate how far interaction in games has evolved, from the basic video games of the 1970s to Red Dead Redemption 2’s elaborate adaptive soundtrack and the VR effort Way to Go by Vincent Morisset. Furthermore, neuro music, pioneered in the 1960s by Alvin Lucier, has advanced thanks to computational neuroscience: Sensorband used wearable biotech sensors in the 1990s, and The Heart Chamber Orchestra utilise real-time heartbeats for music and visuals in live performances. Neurocinema, meanwhile, is a fairly recent concept, with examples such as Obsession by Pia Tikka, an installation for five spectators seated in chairs, involving four screens that adapt the stories based on the audience’s heart rates, and The Moment by Richard Ramchurn, which can be watched by eight people, though only one person’s brainwave signals control the edit and the plot.

The idea behind VFC is to scale up these limited, experimental experiences to a wider audience. In this case, biofeedback affects the sound, which is dynamic, while the screening remains linear. Thanks to bioadaptive audio software, the sound changes based on brain activity and facial expressions, as the feature can switch between three audio streams at various stages. Everything is related to sound, so the production had to be adapted to this technology right from the earliest stage, as the script, shoot and post-production needed to be in sync.
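The article only states that the feature can switch between three audio streams in response to brain activity and facial expressions; it does not describe the decision logic. The sketch below is an illustrative assumption, not LMDP Co's actual software: the function name, the normalised `arousal`/`valence` inputs and the thresholds are all invented.

```python
def select_audio_stream(arousal: float, valence: float) -> int:
    """Map normalised biofeedback readings (arousal in 0..1, valence in 0..1)
    to one of three pre-mixed audio streams. Purely illustrative thresholds."""
    if arousal > 0.66:
        return 2  # high-tension mix
    if arousal > 0.33 or valence < 0.4:
        return 1  # intermediate mix
    return 0      # baseline mix
```

In a real-time pipeline, a selector like this would run on each analysis window, presumably with some hysteresis or crossfading so the mix does not flip back and forth on noisy readings.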

On the tech side, Felipe Almeida, senior UX and biofeedback researcher at Immersion, explained that these adaptive technologies are not actually so new, and are used in neuroscience to measure and analyse our reactions by collecting data. The innovative part is the actual use of the data that the biosensors collect, in real time, allowing an experience to be actively adapted as it happens. However, machines still can’t read human emotions without further contextual/physiological data, including brain signals and emotional expressions.
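Almeida's point that no single signal suffices can be illustrated with a toy fusion rule based on Russell's circumplex model of affect, where EEG supplies an arousal estimate and facial-expression analysis supplies valence. This mapping is a stand-in named here for illustration only; it is not part of Immersion's toolchain.

```python
def infer_affect(eeg_arousal: float, face_valence: float) -> str:
    """Toy quadrant mapping on the circumplex model of affect:
    EEG gives arousal (0..1), the camera gives valence (-1..1).
    Neither modality alone determines the label."""
    if eeg_arousal >= 0.5:
        return "excited" if face_valence >= 0.0 else "distressed"
    return "calm" if face_valence >= 0.0 else "gloomy"
```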

Almeida also analysed adaptive storytelling. Starting from the context of the story, viewers should be able both to understand and to feel the degree of the plot’s abstraction, in order to form their own interpretation and grasp the level of interactivity and awareness that guides them towards the story’s message. The change in plot can happen either passively, by letting emotions lead the person to the result, or actively, with viewers intentionally manipulating their thoughts and emotions, and hence the story. The final message and the interpretation are also connected through social exchange after the experience, where viewers can share their emotions or try to understand others’. When the plot is linear, it might be easier to understand, while in more abstract, dreamlike stories, the interpretations and results are far more open.

There have already been two test screenings of VFC. At the most recent one, held this year, each viewer’s emotions were captured using a pair of headphones with integrated electroencephalography (EEG) biosensors, allowing data input and output, together with an IR camera tracking facial micro-expressions. The company analysed the data, both individually and in aggregate, in order to create different bio-profiles. In their feedback, some viewers said they had dream-like experiences, while others felt the character’s heartbeat as if it were their own. For the complete version of VFC, two audio streams will be used, one in a headset and the other in the cinema, while an app will collect the data and create the bio-profiles.
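The article mentions that individual and aggregate data were analysed to create "bio-profiles" but gives no detail on the method. A minimal sketch of per-viewer aggregation, assuming each viewer yields a series of normalised arousal samples (the field names and the `reactive` heuristic are invented for illustration):

```python
from statistics import mean

def build_bio_profile(viewer_id: str, arousal_samples: list[float]) -> dict:
    """Summarise one viewer's session into a simple bio-profile."""
    return {
        "viewer": viewer_id,
        "mean_arousal": mean(arousal_samples),
        "peak_arousal": max(arousal_samples),
        # Flag viewers whose arousal swung widely during the screening
        "reactive": max(arousal_samples) - min(arousal_samples) > 0.5,
    }
```

Profiles of this kind could then be clustered across the audience, which would fit the distribution use case Demers describes below.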

Stephanie Demers, executive director at Fragments Distribution, focused on the value proposition for audiences, distributors and exhibitors. She explained that individual and social learning through the app can alter interpersonal connections and increase self-awareness, especially if users discuss and compare their results with one another. The data collected are also meaningful for distribution: more test screenings had been scheduled for that very reason but had to be cancelled owing to the coronavirus pandemic. Regarding the potential audience, the film can also reach new viewers who are interested in new tech, and hybrid screenings/exhibitions could be organised as special events in cinemas, which would, of course, be priced at a premium.

The presentation ended with Roy, who presented the partners that are already on board, including the Canada Media Fund and SODEC, among others. He also focused on the possible applications of adaptive content – for example, in documentaries, animation, for SVoD platforms, promotional and digital content, tutorials and brain-computer interfaces (both wearables and implants).

