HOW VFX CAN PROVIDE SOLUTIONS TO HELP – POST-COVID
Some areas of physical film production present obvious problems, and the inability to undertake work in these areas is an impediment to returning to work for some projects.
Visual Effects and Digital Production have tools to help, but because many of the tools that fall within the category of “Virtual Production” are relatively new, much of the industry is not fully aware of what is available, or of how these tools can be deployed to help in the situation we all find ourselves in.
I aim to set out, briefly, the range of some of these tools, and provide some examples and resources to help illustrate.
There is a lot more than I can set out here, and if any member would like a “deeper dive” or to ask questions, I would be happy to help, or to connect them with someone who can. All of the techniques and services below are used by many different companies in the UK – I have collated a few of the most succinct clips to illustrate.
PROBLEM – SCOUTING AND SHOOTING ON LOCATION
By using a CG replica of a set or location, filmmakers can collaborate within a virtual environment, either on a screen or via virtual reality (VR) goggles, to explore and interactively move cameras. The environment can be built from scans or photography gathered by a lone photographer, or can come from a digital library, or even Google Earth.
Time of day and lighting can be altered in realtime to match real world seasonal conditions. (EG: sun position on a backlot set at Leavesden on a specific date and at a specific time).
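The sun-position idea above can be sketched with a standard simplified solar-position formula. This is an illustrative approximation only (accurate to roughly a degree), not the model any particular scouting tool uses, and the Leavesden coordinates below are assumed for illustration:

```python
import math
from datetime import datetime, timezone

def sun_position(lat_deg, lon_deg, when_utc):
    """Approximate solar elevation and azimuth (degrees) from a simple
    declination / hour-angle model. Azimuth is clockwise from north."""
    day = when_utc.timetuple().tm_yday
    hours = when_utc.hour + when_utc.minute / 60.0
    # Solar declination (Cooper's approximation)
    decl = math.radians(-23.44 * math.cos(math.radians(360.0 / 365.0 * (day + 10))))
    # Hour angle: solar time approximated as UTC plus a longitude offset
    solar_time = hours + lon_deg / 15.0
    hour_angle = math.radians(15.0 * (solar_time - 12.0))
    lat = math.radians(lat_deg)
    sin_elev = (math.sin(lat) * math.sin(decl)
                + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    elev = math.asin(sin_elev)
    az = math.atan2(-math.cos(decl) * math.sin(hour_angle),
                    math.sin(decl) * math.cos(lat)
                    - math.cos(decl) * math.sin(lat) * math.cos(hour_angle))
    return math.degrees(elev), (math.degrees(az) + 360.0) % 360.0

# Approximate coordinates for the Leavesden backlot (assumed), midday on the
# summer solstice: the sun sits high and almost due south.
elev, az = sun_position(51.69, -0.42, datetime(2020, 6, 21, 12, 0, tzinfo=timezone.utc))
```

In a real virtual scouting session the engine does this continuously as the user scrubs the time-of-day control, driving the virtual sun light to match.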
Virtual set pieces from Art Dept can be included, and can be interactively re-positioned, and “chess piece” performers can also be included to help find angles and positions.
There are virtual scouting tools available to download and run on a laptop – it’s a very portable solution.
Here are one company’s demo reels of both Virtual Location and Virtual Set scouting:
Traditional Greenscreen or Bluescreen
Nothing new here, but it’s worth noting that there is now a huge range of extremely high resolution digital landscapes and backgrounds available “off the shelf”, so that a live composite can be done during the shoot to help indicate shot composition and lighting.
Camera position and movement can be tracked and recorded, so that if background plates need to be picked up later in production, when we are able to travel more freely, these can be matched. Here is a behind-the-scenes reel showing how much of this work was in Parasite.
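The recording side of this is conceptually simple: per frame, the tracker's camera transform and lens data are written to a file that travels with the editorial material. A minimal sketch follows; the field names and JSON layout are illustrative assumptions, not any specific tracking system's format:

```python
import json

def record_frame(track, frame, position, rotation_euler, focal_mm):
    """Append one tracked-camera sample to an in-memory track.
    Units/conventions here are assumed: metres in stage space,
    Euler degrees in XYZ order."""
    track.append({
        "frame": frame,
        "position": list(position),
        "rotation": list(rotation_euler),
        "focal_mm": focal_mm,
    })

def save_track(track, path):
    """Write the whole take's camera track as JSON for later plate matching."""
    with open(path, "w") as f:
        json.dump({"frames": track}, f, indent=2)

def load_track(path):
    """Read a previously recorded track back, e.g. on the plate shoot."""
    with open(path) as f:
        return json.load(f)["frames"]

track = []
record_frame(track, 1001, (0.0, 1.6, 3.0), (0.0, 12.5, 0.0), 35.0)
save_track(track, "slate_042_camera.json")
```

When the background plates are eventually shot, the same move can be replayed on a motion-control rig or matched in post from this data.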
LED wall technology
Many people are now aware of the use of large scale LED walls with either shot location footage, or CG environments to replace location shooting, or greenscreen shooting.
LED walls have the advantage of providing interactive lighting, and of providing a “real” environment in which to perform and shoot. Shooting backgrounds in camera also has the clear advantage of saving lengthy and expensive VFX post-production compositing.
Where location plates are available, or can be shot, and once they have been selected and stitched together editorially, these can simply be played back on the LED screens.
Where a CG environment has been sourced or created, the real world film cameras can be tracked in realtime to the contents of the background screens so that camera movement and perspective always corresponds precisely.
Here are examples of this technology being used on both a small scale and a large scale:
Sending a one- or two-person capture crew to a location may be possible in some situations, to photographically capture environments specific to scenes, which can then be generated as CG replicas.
However, new advances in the availability and extremely high quality of CG environments within game engines like UNREAL are tremendously exciting, as demonstrated here.
PROBLEM - SHOOTING CROWDS OR LARGE GROUPS
Greenscreen or Bluescreen shooting of people separately (split-screen) is of course an option for small groups. Multiple in-camera element shots for crowds are often referred to as “sprites”.
With a live on-set composite of previously recorded background elements, issues of framing and shot composition can be solved.
Where large crowds are required, these can be generated digitally (and often are), as shooting multiple elements becomes time-prohibitive. In the past, motion to drive B/G crowd characters was captured per project, but there are now motion libraries, such as the one in the link below. (It is worth noting that this library also includes horse capture.) This negates the need for performers to capture this motion. From a cost point of view, it also means productions can purchase only the motion they know will be used, once there is a rough cut or assembly.
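In practice, populating a digital crowd from a library comes down to assigning each background agent a clip plus some randomisation so neighbours don't visibly move in sync. A minimal sketch, in which the clip names and parameters are invented for illustration rather than taken from any real library:

```python
import random

# Hypothetical clip names standing in for purchased library motions.
LIBRARY = ["walk_casual", "walk_fast", "stand_idle", "cheer", "point"]

def populate_crowd(n_agents, clips, seed=0):
    """Assign each background agent a library clip, a random start offset,
    and a mirror flag, to break up visible repetition across the crowd."""
    rng = random.Random(seed)  # seeded so the crowd layout is repeatable
    agents = []
    for i in range(n_agents):
        agents.append({
            "agent": i,
            "clip": rng.choice(clips),
            "start_offset_s": round(rng.uniform(0.0, 4.0), 2),
            "mirrored": rng.random() < 0.5,
        })
    return agents

crowd = populate_crowd(200, LIBRARY)
```

Because only the clip names appear in such an assignment, the actual purchase of library motion can indeed wait until the cut confirms which clips are needed.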
Motion capture – principal cast
Motion capture technology has moved on, and no longer has to live in a fixed studio environment. Solutions for both facial and body capture are now so lightweight that they can be used in any environment with minimal tech support. The following package, called “Home-cap”, can literally arrive in the post.
For studio-based motion capture, one UK motion capture studio has already set up for current conditions, and is capturing performers with the director, producers, and animation crew working from a separate location:
Realtime + 3 ref cams
- Currently YouTube + Google Hangouts; we can stream into ANY video platform (Zoom / Microsoft Teams / BlueJeans). Still frame above.
- Production audio is piped into the stream.
- The AD / on-stage director proxy wears a mic so they can be heard by the remote team.
- All on-stage crew (as can be seen in the photo) are distanced.
- The make-up and HMC team wear masks when dealing with actors.
- Continuity sheets and notes are kept on Google Sheets, allowing rapid sharing when needed.
- A deployable Unreal build for remote camera control (the director receives the mocap stream at home and can fly a camera around or use VR).
MORE COMPREHENSIVE VIEWING / READING
Below are two links to a more comprehensive look at, and discussion of, Digital and Virtual Production techniques.