Published on September 30th, 2021 | by Andrew Bistak
David “Ed” Edwards – Vicon VFX Product Manager #Mandalorian @ILMVFX
We catch up with David “Ed” Edwards from Vicon to talk about their amazing technology, how COVID-19 has affected the film industry and, of course, their collaboration with The Mandalorian!
Tell us about your role at Vicon?
As VFX Product Manager, I am responsible for overseeing the overall direction of motion-capture products targeting the entertainment markets.
I spend a lot of time reviewing film and game industry trends, listening to creative visionaries and technical specialists about what they need from motion capture, and to very clever people within the company about what we can do with it. My role is critical in ensuring our customers’ voices are heard and reflected in our products, whether that’s through improvements to existing features that make processes more efficient, or new tools that improve the quality achievable on virtual productions.
User data is critical to understanding problems and solutions, but Product Managers also need to look beyond this to determine which directions are going to have the broadest, most effective long-term benefits for our customers. My overriding goal isn’t just to address the challenges they’re facing now, but to identify those on the horizon and ensure our solutions maintain consistency with existing workflows. Our underlying systems may be complex, but there is always an emphasis on making the customer’s experience using the technology as simple as possible.
As you might expect, this often means venturing in unexpected directions. None of those directions have included me being roped into wearing one of the suits… yet.
COVID-19 has shaken the TV and film industry to its core, and Vicon has come to its aid with virtual production technologies. Can you walk us through some of these technologies and how they have helped the industry continue?
I think it’s worth emphasising that, fundamentally, virtual production is where the physical and digital worlds meet. It involves a number of processes and immersive technologies working together, with the real-time rendering of digital worlds being what’s really brought attention to it in recent years.
Before I joined Vicon, my knowledge of virtual production was sourced from the same places as most people’s – YouTube videos and buckets full of speculation. I knew motion capture was an important component, but only since working for the company and spending more time on set with customers have I come to realise how critical it is. The physical camera’s motion needs to be captured and transferred onto its digital counterpart, and then the depiction of the 3D world seen on screen needs to be updated – all seamlessly. This is what our camera tracking technology is responsible for, and it demonstrates how motion capture is very much at the heart of what makes virtual production successful.
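To make that hand-off concrete, here is a minimal, hypothetical Python sketch of the per-frame loop being described – none of these function names come from Vicon’s actual SDK; they simply stand in for the motion-capture stream and the render engine’s camera API:

```python
import time
from dataclasses import dataclass

@dataclass
class Pose:
    """Position (metres) and orientation (quaternion) of a rigid body."""
    position: tuple      # (x, y, z)
    orientation: tuple   # (qx, qy, qz, qw)

def read_tracked_camera_pose() -> Pose:
    """Stand-in for the capture system delivering the physical camera's
    pose each frame (a fixed dummy value here)."""
    return Pose(position=(0.0, 1.8, -3.0), orientation=(0.0, 0.0, 0.0, 1.0))

def update_virtual_camera(pose: Pose) -> None:
    """Stand-in for the render engine's API: copy the tracked pose onto
    the digital camera so the world is redrawn from the same viewpoint."""
    print(f"virtual camera -> pos={pose.position} rot={pose.orientation}")

# Per-frame loop: capture the physical camera's motion, mirror it onto
# its digital counterpart, and let the engine update the on-screen world.
for _ in range(3):          # three frames, for illustration
    update_virtual_camera(read_tracked_camera_pose())
    time.sleep(1 / 60)      # ~60 fps frame budget
```

The ‘seamless’ part is simply that this loop has to complete within every frame, so the virtual viewpoint never visibly lags the physical camera.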
But beyond the technical advantages it’s afforded creators, virtual production has also been successful in providing production companies with considerable flexibility in responding to the global pandemic.
Virtual production was evolving rapidly before COVID-19 arrived, but the limitations imposed on movement in particular have been somewhat negated by ICVFX (in-camera VFX). Crews suddenly couldn’t travel to a filming location, but they could build something similar digitally, to a photorealistic level of detail, and project it onto an LED wall. The use of digital assets meant more work could be done remotely, enabling social distancing and even opening up new talent, with teams of specialists working together while spread across the globe.
Our Virtual Production brochure goes into more detail about some of the specific features available: https://www.vicon.com/cms/wp-content/uploads/2021/07/Vicon-VP-FINAL-1.pdf
What’s the biggest challenge with this technology?
That depends entirely upon whom you ask. Motion capture technicians, film directors, software developers, and on-screen talent all experience virtual production differently, so the challenges can vary.
For example, if you’d asked any of our sales team this question over the last eighteen months, I daresay they’d tell you that simply defining the technology has been the biggest challenge. Virtual production has meant many different things to many different people, and I think one of the biggest challenges Vicon faced as a company was identifying the objectives for our next series of software updates.
That’s where I’m especially proud of, and grateful for, the work my predecessor and our development teams achieved with the Shogun 1.6 release. Establishing a feature set as broad and effective as they did during the many challenges of the last eighteen months wasn’t simple, and the feedback from customers suggests it was very much worth it. Warner Bros and Framestore are just a couple of those who have taken very quickly to tools like our new Object Tracking (which allows for the lowest latency on the cameras whose motion they’re capturing); they are seeing the benefits in their day-to-day experiences and are now providing input toward what the next iteration will be.
As a result, the biggest challenge facing us at this moment is how we adapt our solutions to the practical realities of Virtual Production. Motion-capture for entertainment was previously an isolated process, where performances took place in very open, clutter-free and simplified spaces. As creators get more confident with the processes involved, we’re seeing an increase in the complexity of their stages and more components being introduced that are typically considered highly challenging for performant motion capture.
Where do you see this technology in 10 years?
Whenever I get asked this, I find myself channelling my inner Alan Grant (that’s a Jurassic Park reference, for your younger readers) and saying, “The world has just changed so radically, and we’re all running to catch up.” What I mean by that is I think we’re in the eye of the storm at the moment, where the evolution and adoption of virtual production are both occurring rapidly, and until things have settled it’s not entirely clear what its role in the future will be.
But your last question leads quite nicely into this, as I think the growth of the technology is going to be defined by the needs of the customers and the world we’re operating in. As more films are created with virtual production, creators are going to become increasingly confident with what they can do, and as the quality of their output grows, so too will audience expectations.
I can see that manifesting in different ways. Obviously, the fidelity and realism of these immersive worlds are going to keep evolving, but so too will our means of engaging with them. More live experiences and events such as TV game shows will likely start making use of virtual production in different ways, and I can see the technology growing to support a wider range of use cases. The growth of location-based VR experiences is already demonstrating that there’s a strong desire for audience members to become ‘part’ of the digital world, and I only see that demand increasing.
On that note, overlaying full-body digital characters onto actors in a non-digital environment is something very much in its infancy. We’ve done some basic internal experiments with this ourselves, and it’s already raising some interesting challenges. But as we’ve seen with the evolution of virtual production so far, what is considered cutting-edge at one point in time can very much become a fundamental component by the time a decade has passed.
Can you tell us more about Stagecraft and Vicon’s work together on the second season of The Mandalorian?
The strength of the audience’s response to the show, and the technical team’s enthusiasm for the flexibility and power that Stagecraft’s technology provided, meant it was only logical that they’d build on these principles for season two.
The team at ILM were very clear that they wanted to grant the production teams even more power and flexibility, so that audience expectations could be raised again through even more vivid environments and visceral set pieces.
Facilitating this naturally meant increasing the size of their physical stage, but ensuring the surrounding technology could provide the required levels of motion capture wasn’t just a question of ‘buying more gadgets’. It was critically important that they invested in cameras suited to the specific, diverse and challenging scenarios they would be encountering in the second season.
We’ve worked with ILM on various projects over the years, so we began consulting on their needs at a very early stage and ultimately provided them with our large motion capture system. This gave them high-quality coverage over an exceptionally large space, suitable for both full-body motion capture and object tracking.
So that journey wasn’t so much a question of ILM just wanting a bigger stage and us supplying the hardware. We collaborated with them from the beginning to ensure that each technical decision was being informed by the needs of their studio teams and that iterations to our software would maximise the benefits they received from it.
Which is a very important thing to highlight. As much as we’ve been responsible for helping Stagecraft develop and improve, the same can be said for their contribution to Vicon. The various features I’ve pointed out in our software packages have been enhanced and informed by Stagecraft being very clear and concise about what their current challenges are, and where they wish to take the technology in the coming years.
What I think speaks to the performance and value of this relationship is that it provided ILM with a blueprint for future stages. They have since expanded from a single stage to five spread across the globe, allowing them to shoot multiple productions simultaneously.
You can learn more about our work with ILM at:
https://www.vicon.com/resources/case-studies/vicon-x-ilm/
What about the processing power of this technology?
Carrying on from our experience of working with ILM, the way they once described Vicon’s technology has really stuck with me – it just works. That might not sound like much, as we are all surrounded by technologies that are incredibly easy to use despite being the stuff of myth and sorcery only a decade or two ago. But it really speaks volumes about the capabilities of our underlying technologies and, importantly, their scalability.
To put this into context – virtual production really has exploded over the last eighteen months, and some of the challenges involved are completely new to motion capture. While many of the underlying principles already existed in our software, evolving them to a level that fulfilled our customers’ needs was a considerable challenge. We achieved this, and while that’s in no small part thanks to the expertise and determination of our development teams, the raw computational horsepower of our technology is no less critical. The speed at which objects are tracked and digitised, and at which the data is streamed over to the visualisation software, is still staggering to someone who grew up in a time when the prospect of VFX and game tech merging the way they have seemed a distant fantasy.
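As a rough illustration of the pipeline being described – purely a hypothetical sketch, not Vicon’s implementation – each tracked frame can be timestamped at capture, serialised and handed off to the visualisation software, with the end-to-end latency checked against the frame budget:

```python
import json
import time

FRAME_BUDGET_S = 1 / 120   # a 120 Hz capture rate leaves ~8.3 ms per frame

def capture_frame(frame_id: int) -> dict:
    """Stand-in for the tracker: one object's pose plus a capture timestamp."""
    return {"id": frame_id, "t_capture": time.perf_counter(),
            "pose": [0.0, 1.0, 2.0, 0.0, 0.0, 0.0, 1.0]}  # x, y, z + quaternion

def send_to_visualiser(payload: bytes) -> None:
    """Stand-in for the network hop to the render machine
    (a UDP socket send, in a real pipeline)."""
    pass

for frame_id in range(5):
    frame = capture_frame(frame_id)
    send_to_visualiser(json.dumps(frame).encode())
    latency = time.perf_counter() - frame["t_capture"]
    # The digitise-and-stream step must fit well inside the frame budget,
    # or the rendered world will visibly trail the physical camera.
    print(f"frame {frame_id}: {latency * 1e3:.3f} ms "
          f"({'ok' if latency < FRAME_BUDGET_S else 'over budget'})")
    time.sleep(FRAME_BUDGET_S)
```

The comparison at the end is the point being made above: the tracking and streaming steps have to fit comfortably inside every frame for the result to feel seamless on set.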
That our systems have scaled to customer demand in such a short period of time, and under challenging circumstances, is I believe a testament to the underlying power of Vicon’s software.
What I will say in that regard is that it’s important not to forget the role of the user experience in this. Powerful technology is great for all of us, but its potential is determined largely by the simplicity of the systems used to execute it. Our software is complex, but the absolute opposite is true of the user’s experience when operating it. Customers want and need the power we provide to be available at their fingertips, not abstracted behind language and processes that increase the distance between them and the vision they’re reaching for.
Vicon’s software has come a long way in this respect. ILM and others have spoken highly of how the intuitive nature of our systems has benefitted their workflows, particularly while getting to grips with the somewhat unpredictable nature of virtual production. As someone whose background is in visual design, facilitating the user experience this way will remain a priority for me.
You must have some insanely powerful machines running to create these virtual worlds?
If I based my answer purely on the weight of my work laptop – then yes, and I’ve learned all about the benefits of ‘upper body strength’ as a result. But the truth is actually far broader and more accessible than that, which I think is a great sign for the technology’s future.
Historically, you could draw a pretty clear line between worlds that were generated with offline VFX rendering packages and those of video games. VFX long specialised in high-fidelity, photorealistic visuals (although these demanded considerable horsepower), while games dominated the real-time space. The broadness of consumer demand for games means they’ve always had to be scalable to different machine specifications – not that anyone who tried running the original Crysis on higher settings back in 2008 would agree.
But the benefit of these two worlds converging is that we’re really starting to see this distinction disappear. It’s no longer a surprise to see a video game engine running visuals that are comparable to, or in some cases surpass those of blockbuster movies.
We’ve been working closely with our friends at Dimension Studio to create AKUMA, which encapsulates this as well as any creative work I’ve seen done with virtual production. This short film is being shot in Unreal Engine, utilising MetaHumans and motion capture in unison – all in real time.
Yes, you’re absolutely going to need a beast of a machine (or in some cases – several machines) to power some of the more advanced tech like LED walls, but there’s plenty that can be achieved otherwise.
Do you feel virtual production is the way of the future?
It’s important to remember that virtual production has actually been around for much longer than people realise. TV shows like The Mandalorian and their use of LED walls for ICVFX have certainly thrust it into the public consciousness, as has the use of the associated technologies in response to the global pandemic. But many of the underlying technologies and processes have actually been around since the late 90s, with films like Star Wars: Episode I – The Phantom Menace and The Lord of the Rings: The Fellowship of the Ring both making use of elements such as real-time rendering.
It’s also very easy to slip into hyperbole when discussing prospects like these. I speak to some people who can’t see the future of the movie industry being in any other direction, while others feel it will become more of a specialist solution when traditional filming methods become more practically possible again.
I think there’s merit to both these arguments, and virtual production is not the first subject where we’ve heard them made. There was a time when CGI was viewed as the death of practical effects, and while it’s certainly become a mainstay within the industry, you can’t say we’re living in a time where it absolutely dominates. Many technologies or pipelines see a swell in adoption rates as creators get to grips with them and start demonstrating the unique opportunities they afford. Eventually, things settle, and what was once exciting and cutting-edge becomes a fundamental part of how things are done.
So, in short – yes, I most certainly see Virtual Production as having a significant role in the future of the industry, just as it has had a more significant role in its past than many are aware of. The exact form this takes, how it evolves and which aspects become the most popular is going to depend a lot on how creators continue to embrace it.
That’s what excites me the most about where we are right now. We’re seeing the VFX and games industries grow rapidly, cultivating entirely new trends and technologies in the process, while the barrier to entry is also diminishing. Creators have more power and flexibility than they’ve ever had before, and it’s exciting for Vicon to be a central part of that process. While I can’t currently get into specifics about what’s on the horizon, I think there’s a lot for movie and game enthusiasts to be excited about.
Lastly, can you tell us of any other TV projects that Vicon is working on?
Any fans of the award-winning show Dark should keep an eye out for 1899. It’s a mystery drama currently being shot on the Dark Bay Virtual Production stage, which we helped establish along with Framestore.
Alter Ego, which recently aired on Fox, utilises Vicon systems to power its digital avatars of live performers. And on the subject of music, our technology has also been used on Billie Eilish’s ‘Happier Than Ever’ world tour.
Many of the productions we’re currently supporting are under NDA, but if you follow our company website, YouTube and LinkedIn profiles, news about each of these will be shared as soon as it becomes available.
https://www.youtube.com/c/Vicon
https://www.linkedin.com/company/vicon