Living laboratory

    Ahead of the 2018 Immersive Network Summit, DocLab and the Massachusetts Institute of Technology (MIT) Open Documentary Lab formally sealed their five-year collaboration on the IDFA DocLab Immersive Network R&D Program, Nick Cunningham reports.

    Together with partners and artists, DocLab and MIT Open Documentary Lab will conduct fundamental research into immersive and interactive media – such as VR and artificial intelligence – using IDFA DocLab as a living laboratory. Each year, new research questions will be formulated with the partners and field leaders involved, based on the latest developments in interactive and immersive media.

    Listen and learn

    “Over the years, one of the conclusions for us has been that somehow artists innovate and experiment faster than big institutions and tech companies – and more creatively – and when you give them a festival deadline and an audience they really knock it up a notch,” said DocLab head Caspar Sonnen, explaining the rationale behind the collaboration. “[But] processing the learnings of all these projects is something that we are not that good at. Researching these things requires collaboration.”

    MIT’s William Uricchio responded: “Our task today is to listen and to learn and to hear from you folks [Summit professionals] what are the questions you want answered. And if we can answer those questions, how can we get the answers to you? Because I think there is too often a mismatch between the academy and the work of makers. We say and do a lot of things, and like a ship in the night it passes by. The makers do amazing thinking with their hands, and it too passes by… How can we connect? How can we build some synergy and how can we help each other raise this to the next level?”

    Experimentation

    Jann de Waal of the Topsector Creative Industry team also addressed the makers in the room: “Experimentation is truly the basis of the design process, so that is why we are very happy to support DocLab. We have all kinds of field labs, but not one in documentary… and personally I think it is very interesting to see how you involve, what kinds of directions you are going [to take], what kind of topics you are going to research. Because we do need a new form of storytelling and I can’t [think of] a better place for this research than DocLab.”

    Algorithmic Perfumery

    One of the DocLab creators working with the new R&D Program will be Dutch maker Frederik Duerinck, whose Algorithmic Perfumery was selected for Humanoid Cookbook and IDFA DocLab Spotlight. Back in 2015 Duerinck presented the notorious Famous Deaths at DocLab, which recreated the deaths of Colonel Gaddafi and JFK through the sense of smell.

    Algorithmic Perfumery poses the question: what if everybody could have their own personal scent? Visitor input trains the creative capabilities of an AI system that adapts and learns from every exchange, and the outcome is a unique scent that is generated and compounded on-site. “For me this was a super interesting thought,” Duerinck said at the DocLab Conference on November 17. “What if you take this idea of a perfume that determines how we all smell and you give it to an algorithm, and what if this algorithm starts finding and discovering new olfactory spaces where a perfumer with a traditional mindset would never go?”
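
    As a very loose illustration of the kind of feedback loop Duerinck describes, the sketch below proposes a toy scent formula in Python and nudges it whenever a visitor prefers a variation. The ingredient names, mutation step and scoring are invented for illustration and bear no relation to the actual installation.

    # Illustrative sketch only: a toy feedback loop in the spirit of Algorithmic
    # Perfumery, where visitor ratings nudge a generator toward new scent formulas.
    # Ingredient names, scoring, and the update rule are invented assumptions.
    import random

    INGREDIENTS = ["bergamot", "iso_e_super", "vetiver", "ambroxan", "jasmine", "cedar"]

    def random_formula(rng):
        """Propose a formula: each ingredient gets a concentration in [0, 1]."""
        return {name: round(rng.random(), 2) for name in INGREDIENTS}

    def mutate(formula, rng, step=0.15):
        """Explore nearby 'olfactory space' by perturbing concentrations slightly."""
        return {name: round(min(1.0, max(0.0, c + rng.uniform(-step, step))), 2)
                for name, c in formula.items()}

    def run_session(visitor_rating, rounds=5, seed=42):
        """Adapt the formula round by round, keeping whichever version the visitor prefers."""
        rng = random.Random(seed)
        best = random_formula(rng)
        best_score = visitor_rating(best)
        for _ in range(rounds):
            candidate = mutate(best, rng)
            score = visitor_rating(candidate)
            if score > best_score:          # learn from every exchange
                best, best_score = candidate, score
        return best

    if __name__ == "__main__":
        # Stand-in for a real visitor: this one prefers woody notes over florals.
        prefers_woody = lambda f: f["vetiver"] + f["cedar"] - f["jasmine"]
        print(run_session(prefers_woody))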

    Disruption

    “And the more I thought about it, it would be a complete disruption of the perfume industry, [but] I knew that normal funding would never get me far. So I started thinking, I need to go to a different model, and I need to have a mission vision… I wanted to create a system where people could tell their own story in fragrance or in scent, basically to create what they want to create. I wanted to move away from the traditional concept of 300 people in the world telling the rest of the world what it is acceptable to smell like, or how not to smell. That was how my vision became bigger.”

    Frankenstein AI

    Interactive artists Lance Weiler, Nick Fortugno, and Rachel Ginsberg’s A Dinner with Frankenstein AI posed another fundamental question at the Conference: is artificial intelligence a modern equivalent of Frankenstein’s monster? And even more profoundly, what is it like to be human? They sought to answer these questions by throwing a series of elaborate dinner parties for DocLab participants, which led to fascinating and stimulating human-to-human and human-to-machine interaction. “The narrative conceit is that Frankenstein’s monster is an artificial intelligence that is wandering the internet in search of what it means to be human and it has been confused by all the polarization, the toxicity, the extreme hate, the extreme love that it has found, and so it [decides] in real life to learn and observe from them [humans],” Weiler commented.

    While participants ate a lavish two-course dinner, an AI was fed real-time transcripts of all the conversations conducted across the four dinner tables. Each diner wore an earpiece through which he or she received prompts and suggestions – “like a spirit or a muse,” as Weiler puts it – as to where to take the conversation, all to help the AI better understand the human species.

    Ambiguity

    “What is interesting about this project is that it explores this ambiguity, the way that the machine, the AI, actually crafts questions in real time based upon sentiment,” Weiler continues. “So it listens to what is being said at the table and then renders one of 12 different emotions… and because there is a role of trying to nurture or trying [to] help this AI learn – it is very much like a fish out of water, it doesn’t quite understand the world that it finds itself in – it becomes a very powerful kind of exchange.”
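
    For readers curious how such a loop might hang together, the Python sketch below approximates the idea: table conversation is scored for sentiment, mapped onto one of 12 emotions, and turned into an earpiece prompt. The emotion list, keyword scoring and prompt wording are invented assumptions, not details of the makers' actual system.

    # Illustrative sketch only: conversation -> sentiment -> one of 12 emotions
    # -> earpiece prompt. Everything named here is an invented stand-in.

    # Ordered roughly from negative to positive valence so the index lines up
    # with the sentiment score.
    EMOTIONS = ["anger", "disgust", "fear", "sadness", "loneliness", "confusion",
                "surprise", "anticipation", "trust", "hope", "love", "joy"]

    POSITIVE = {"love", "wonderful", "hope", "together", "beautiful"}
    NEGATIVE = {"hate", "afraid", "alone", "toxic", "angry"}

    def sentiment(transcript: str) -> float:
        """Crude keyword sentiment in [-1, 1]; a real system would use an NLP model."""
        words = transcript.lower().split()
        score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
        return max(-1.0, min(1.0, score / max(len(words), 1) * 10))

    def render_emotion(transcript: str) -> str:
        """Map the sentiment score onto one of the 12 emotions."""
        s = sentiment(transcript)
        index = int((s + 1) / 2 * (len(EMOTIONS) - 1))   # -1..1 -> 0..11
        return EMOTIONS[index]

    def earpiece_prompt(transcript: str) -> str:
        """Turn the rendered emotion into a nudge for the diner's earpiece."""
        emotion = render_emotion(transcript)
        return f"The table seems to feel {emotion}. Ask your neighbour what {emotion} means to them."

    if __name__ == "__main__":
        print(earpiece_prompt("I am afraid we are all so alone in this toxic world"))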

    “At Columbia University School of the Arts’ Digital Storytelling Lab… we build creative systems, so the idea is that this project will have many different forms – at DocLab this happens to be a dinner party,” Weiler concludes.

    Photo by Nichon Glerum.