Virtual Production News for Broadcast Professionals
https://www.newscaststudio.com/tag/virtual-production/

AE Live acquires Silver Spoon with eye towards US market
July 25, 2023 | https://www.newscaststudio.com/2023/07/25/ae-live-acquires-silver-spoon-with-eye-towards-us-market/

AE Live has announced its acquisition of Silver Spoon, a provider of innovative, real-time virtual production content. The deal, announced on July 25, 2023, is part of AE Live’s ongoing efforts to cement its position in the US broadcast market.

Silver Spoon, based in Brooklyn, has gained attention in the sports broadcasting world for its use of virtual production technology.

Silver Spoon’s integration into the AE Live group is expected to enhance the latter’s capabilities and contribute to the wide range of services it provides to its global clients. The acquisition follows that of New Zealand-based Ignite in 2020, which established a significant presence for AE Live in Singapore and Malaysia.

“Silver Spoon’s innovative virtual production capabilities will dovetail perfectly with our award-winning creative agency, Ignite,” said AE Live CEO Nick Baily. Baily also noted that the timing of the acquisition aligns with AE Live’s aspirations to be a significant player in the US broadcast sector, following recent contract wins.

Notable projects for Silver Spoon have included a partnership with CBS and Nickelodeon for the “Nickmas” game and augmented reality ‘team-mojis’ for CBS Sports’ March Madness coverage.

Silver Spoon was also instrumental in Fox’s “Alter Ego” entertainment program and other commercial productions.

ESPN embraces virtual production with Catalyst Stage
June 27, 2023 | https://www.newscaststudio.com/2023/06/27/espn-virtual-production-catalyst-stage/

ESPN has unveiled an innovative, immersive studio at its campus in Bristol, Connecticut, dubbed the Catalyst Stage.

Catalyst Stage builds upon the technological and conceptual milestones achieved during the production of Disney+’s hit series, “The Mandalorian.” However, the stage extends beyond the series’ notable accomplishments, providing a scalable and adaptable platform that can support both live multi-camera productions and cinematic-quality projects.

The new studio’s technology can render any setting, real or imagined, from a replica of an existing studio to a stadium. This is made possible by the collaborative effort of the Disney Entertainment and ESPN Technology teams, who have developed proprietary software to power the Catalyst Stage.

“Catalyst Stage removes production and creative limitations that come with singular, purely physical environments. In this studio, we can place our show and talent anywhere – the top of a skyscraper, a virtual twin of an existing studio, a newly-created fully-rendered studio, or the middle of a football field…you name it,” said Tina Thornton, head of content operations and creative surround, ESPN.

“If we imagine a setting, we can create it. And it allows rapid scalability in live, custom cinematic-quality, and advertising productions.”

Making this possible are tools including Ghostframe, Unreal Engine, Disguise XR, Pixotope and a robotic camera system from Mark Roberts Motion Control. This also marks the first fully robotic studio for ESPN.

Operating on 26 real-time servers, the stage renders 11 million pixels across its LED volume, refreshing 7,600 times every second. ESPN notes the studio uses the largest and most complex Disguise XR system ever built for television production, processing 120 gigabytes of data every second. NEP Group handled the integration, with LED panels from ROE Visual.

“Our new Catalyst Stage showcases the ingenuity of the Disney Entertainment & ESPN Technology team, expanding the boundaries of production and elevating the canvas of possibilities for content creation. While we are introducing this at ESPN, Catalyst Stage will enable new opportunities in advertising, and showcases the extraordinary capabilities we are advancing for the entire Walt Disney Company,” said Aaron LaBerge, president and CTO, Disney Entertainment and ESPN Technology.

The studio is designed to adapt to ESPN’s production needs and can output in a variety of formats, from 1080p and UHD up to 4K.

While the Catalyst Stage has initially been built for ESPN, the advancements from this project will benefit other parts of Disney in the future, the company notes.

ESPN’s Catalyst Stage joins Fox Sports’ revamped Studio A in Los Angeles, which debuted last fall for the NFL season. That studio uses similar production tools to blend physical and virtual elements in a large LED volume.

Arcturus adds tool focused on virtual production volumetric capture
December 12, 2022 | https://www.newscaststudio.com/2022/12/12/arcturus-adds-tool-focused-on-virtual-production-volumetric-capture/

Arcturus has released updates for its HoloSuite platform, including a new beta feature developed to connect volumetric video clips and blend them together seamlessly. With this ability, content creators can populate full virtual production scenes with real people, build live-action branching narratives with no gaps, and offer original experiences for the metaverse using people instead of CG avatars.

“Volumetric video isn’t just offering content creators new ways to do old things better; with the right tools, it offers possibilities that simply weren’t there before,” said Kamal Mistry, CEO of Arcturus. “Our new tools will open up a huge range of potential uses across multiple industries, and soon we will begin to see content unlike anything that’s ever been seen before.”

The HoloSuite update sees the eagerly anticipated “Blend” tool released in beta for the first time. With Blend, HoloSuite users can take a volumetric video clip featuring a live-action performance, and connect that clip to other related clips, creating an imperceptible transition from one segment to another. Artists can also use Blend to loop the end of a clip back to its start, creating a recording of a photorealistic subject moving and interacting that continually repeats.
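
To illustrate the looping idea conceptually, here is a minimal sketch that crossfades the tail of a clip into its head over an overlap window. This is a loose illustration only, not Arcturus’ implementation: it assumes per-frame vertex arrays with consistent topology, which real volumetric captures generally do not guarantee (that gap is precisely what dedicated tooling like Blend addresses), and the function and parameter names are hypothetical.

```python
import numpy as np

def blend_loop(frames: list, overlap: int) -> list:
    """Crossfade the last `overlap` frames into the first `overlap`
    frames so playback can repeat without a visible jump.

    Assumes each frame is a (V, 3) vertex-position array sharing one
    topology -- a simplification real volumetric video rarely affords,
    which is why purpose-built blending tools exist.
    """
    head, tail = frames[:overlap], frames[-overlap:]
    blended = []
    for i, (a, b) in enumerate(zip(tail, head)):
        t = (i + 1) / (overlap + 1)            # ramps 0 -> 1 across the overlap
        blended.append((1.0 - t) * a + t * b)  # linear vertex crossfade
    # The middle of the clip plays untouched; the blended tail hands
    # playback back to the frame just after the consumed head.
    return frames[overlap:-overlap] + blended

# Hypothetical example: 120 frames of a 5,000-vertex subject, 12-frame overlap.
clip = [np.random.rand(5000, 3) for _ in range(120)]
loop = blend_loop(clip, overlap=12)
print(len(loop), "frames per loop cycle")
```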

The possibilities range across multiple industries and platforms, but game engine users with the HoloSuite plugin will see an immediate benefit. Creators can now take a blended volumetric video clip and insert it into a digital environment to populate a virtual scene with live-action 3D performances, without needing to create countless digidoubles, or try to force 2D video into a 3D setting. The Blend feature will also give users the ability to build live-action branching narratives with no perceptible change between video tracks. HoloSuite can take multiple recordings of a subject and smoothly blend them together, giving the end-user a seamless experience as they progress down multiple paths based on their choices. That will give creators a completely original way to explore new forms of digital storytelling, build AR/VR experiences and more.

Blend is the result of years of research and development by the award-winning Arcturus team, as well as requests from volumetric video users. Working hand-in-hand with HoloSuite users and clients, Arcturus will continue the current beta testing over the next few months, fine-tuning the Blend functionality and ensuring both compatibility and stability. Once the beta period has concluded, Blend will become a standard feature. All current HoloSuite users will be able to participate in the beta.

In addition to the new beta tool, HoloSuite will receive a host of quality-of-life improvements and upgrades. Unity users with the HoloSuite plugin will now find improved OMS playback, offering better viewing controls for volumetric video files within the game engine. Additional support for Unreal Engine 5 OMS playback is on the way as well.

Updates include:

4DViews Capture Import – New native file support will allow users to directly import files from 4Dviews.

Generate “Smooth Normals” – HoloSuite users can generate “smooth normals” to create better surfaces for re-lighting, and allow for quick lighting changes on volumetric subjects in game engines.

New Lighting Preferences – Environmental lighting can now be added directly within HoloEdit for more dynamic viewing.

Unilumin and Pixotope form partnership to bring turnkey LED/XR solution to broadcast
December 12, 2022 | https://www.newscaststudio.com/2022/12/12/unilumin-and-pixotope-form-partnership-to-bring-turnkey-led-xr-solution-to-broadcast/

Unilumin has formed a strategic partnership with Pixotope. The partnership reflects both companies’ commitment to serving the broadcast market with purpose-built XR solutions that meet the evolving needs of the industry, the companies noted in a statement. 

Pixotope XR Edition streamlines and simplifies the setup and operation of XR environments, reducing bespoke engineering requirements to power advanced XR workflows. Unilumin LED volumes, which are used by Televisa Mexico, The Daily Show with Trevor Noah, NBC’s New Year’s Eve broadcast, The U.K.’s X Factor, and Sky TV’s Sport Studio among countless others, feature a high-tech small-pixel design which preserves image output quality in XR environments, optimizing viewer experience. This partnership provides an advanced XR solution in a simple-to-deploy turnkey package.

Unilumin and Pixotope will showcase the XR solution at the 2023 Integrated Systems Europe (ISE) show in Barcelona, January 31 to February 3, on the Unilumin stand (3C400). The showcase will include demonstrations of Pixotope XR Edition powering a two-wall Unilumin LED volume XR stage with its graphics and camera tracking capabilities. Attendees will be able to see the benefits Unilumin displays bring to XR workflows and environments thanks to their high refresh rate, wide viewing angle, and reduced scan lines and moiré, with on-screen content synchronized to the camera to minimize lag and delay.

“We are thrilled to be partnering with Unilumin for ISE; they’re the de facto experts in LED volume technology and provide the vast majority of volumes being used in the M&E industry,” said Pixotope CEO Marcus Brodersen.

“This partnership enables us to achieve our mutual goal of making extended reality more accessible to the broadcast market. By cross leveraging our individual expertise we’ll be able to problem-solve and optimize to make such workflows more streamlined and efficient.”

“We look forward to welcoming our new partners to the stand at ISE in January,” said Jaelyn Li, General Manager of Unilumin Rental Industry. “Pixotope is an ideal complementary partner for Unilumin. Their XR Edition removes much of the complexity and guesswork around XR, which allows users to tap into the creative potential of XR stages without needing engineering expertise or bespoke solutions. We can’t wait to see our technologies come together and show ISE attendees what’s possible.”

Mo-Sys delivers LED virtual studio for German World Cup broadcast
December 8, 2022 | https://www.newscaststudio.com/2022/12/08/mo-sys-delivers-led-virtual-studio-for-german-world-cup-broadcast/

Mo-Sys Engineering delivered an LED virtual production studio for Plazamedia ahead of its prestigious World Cup broadcast.

Following experimentation to find the optimum technical solution and a successful proof-of-concept, leading content solutions provider and sports TV producer Plazamedia enlisted Mo-Sys to deliver its innovative LED Studio.

This complete turnkey solution incorporates real studio elements with optional XR ability. Mo-Sys provided technical guidance from concept to delivery, virtual production training, and project management of the installation within the impressive 690 square meter studio at Plazamedia’s headquarters in the AGROB Medienpark near Munich, Germany.

As the exclusive German rightsholder, world-leading telecommunications giant Deutsche Telekom will be the first to utilize the ground-breaking Mo-Sys-powered studio, from where all 64 World Cup 2022 matches will be broadcast via MagentaTV across 29 production days. After the World Cup, the studio will reopen as a full virtual production XR LED volume.

Plazamedia’s Director of Productions & Customer Relations, Hansjorg Baumgartner, said: “We have worked with Mo-Sys on several VP projects and that experience gave us confidence to entrust their team to deliver this state-of-the-art XR studio. Mo-Sys’ new and innovative camera switching technology for LED VP, which maintains render power across multiple cameras without increasing the XR engine count, together with their knowledge, technical support, guidance, and ability to bring in key partners has been invaluable to our success here.”

The LED studio uses four cameras, each of which has a Mo-Sys StarTracker to generate the 6-axis camera and lens tracking data that drives the perspective view of the virtual set. The tracked cameras work in conjunction with Mo-Sys’ combined LED content server and on-air graphics system bMR (Broadcast Mixed Reality), which delivers the virtual set content from the graphics engine.

The bMR system also solves the problem when switching between full resolution tracked cameras shooting a virtual set displayed on an LED wall, where the camera outputs switch faster than the LED wall can update with the correct camera background. Mo-Sys bMR orchestrates the switch to ensure only the correct camera perspective view is visible.
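
Conceptually, the ordering problem looks something like the sketch below. This is a hypothetical illustration of the general idea, not Mo-Sys’ actual bMR implementation or API: the class names, methods, and latency figure are all invented for the example.

```python
import time

class Wall:
    """Stand-in for an LED content server rendering per-camera backgrounds."""
    def set_perspective(self, cam: int) -> None:
        print(f"wall -> background for camera {cam}")

class Mixer:
    """Stand-in for the vision mixer that cuts camera outputs to program."""
    def take(self, cam: int) -> None:
        print(f"program -> camera {cam}")

# Hypothetical: allow two 50 Hz refresh periods for the wall to settle.
WALL_UPDATE_LATENCY_S = 2 / 50.0

def orchestrated_cut(wall: Wall, mixer: Mixer, camera_id: int) -> None:
    """Order matters: update the wall, wait out its latency, then cut.
    Cutting first would briefly put the old camera's background on air."""
    wall.set_perspective(camera_id)
    time.sleep(WALL_UPDATE_LATENCY_S)
    mixer.take(camera_id)

orchestrated_cut(Wall(), Mixer(), camera_id=2)
```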

Mo-Sys CEO, Michael Geissler adds: “The entire team at Mo-Sys is incredibly proud to have played such a significant role in the delivery of this incredible XR studio project. We would like to thank Plazamedia for their trust in us and we look forward to further strengthening our relationship.”

Mo-Sys worked in collaboration with Plazamedia’s in-house technical and graphics teams as part of its project management, which ensured the complex studio build was delivered on schedule. Mo-Sys was responsible for all virtual production technologies, including the 120 sq m LED wall from Spanish manufacturer Alfalite. Its Modularpix Pro ORIM 1.9mm pixel pitch panels were specified for the 33m x 3.5m U-shaped LED volume, which is driven by a total of eleven render nodes.

Plazamedia is at the very forefront of broadcasters adopting LED virtual studios.

World Cup fans are set to be truly immersed in the action from Europe’s most advanced XR studio thanks to the company’s commitment to raising production values and improving the viewer experience. Through its use of bleeding-edge Mo-Sys technology, the studio will seamlessly combine the presenters and foreground studio features with augmented reality (AR) virtual set extensions and tracked graphics to deliver a unique 360-degree experience.

Column: The past and present of virtual production—and where it could go next
November 28, 2022 | https://www.newscaststudio.com/2022/11/28/column-where-virtual-production-will-go/

When you hear the phrase “Virtual Production,” what do you envision?

Generally, we think of it as an amalgamation of real-time production techniques that combine physical subjects with digital “magic” to create eye-catching content.

But while VP has become a buzzword in recent years, it’s not entirely new. It wasn’t always known as “virtual production,” either. In fact, the earliest form of what we now call ‘VP’ happened 75 years ago in British cinema. Production designer David Rawnsley began using back projection in his ‘Independent Frame’ system, pre-planning extensive storyboards and models along with camera and talent movements. While his Independent Frame technology never truly caught on, the principles of virtual production remain the same today: using planned pre-visualization, tracking camera and talent movements, and simulating environments or assets in real time.

A recent history

However, the history of virtual production that we will focus on begins at the end of the 20th century. Once digital editing entered the picture, filmmakers could transcode film to digital media and edit material even while it was still being shot. This eventually evolved into on-set editing, and then into adding VFX from personal desktops. As a result, rigid feedback pipelines became agile, fluid, and accessible in real time.

A glowing blue puck was used on Fox Sports’ live ice hockey broadcast in 1996 to make the puck more visible on screen, marking the first use of augmented reality (AR) during a live sporting event. This ‘FoxTrax’ system was billed by Fox as “the greatest technological breakthrough in the history of sports,” bringing live AR into the homes of millions of sports fans.

Just a couple of years later, in 1998, an NFL live broadcast displayed a yellow line to denote a ‘first down line’ location on the field. This was quietly revolutionary. Before then, sports fans would squabble amongst themselves over whether their team had earned a first down until they heard confirmation from the commentator or referee. With this simple yellow line, the NFL changed the way we watch the game. It offered real-time feedback to viewers about the state of play in a way that not even in-person spectators could see – and marked a turning point for the use of virtual production in live broadcasts.

Later, advancements in filmmaking technologies led to the use of virtual cameras for pre-planning camera moves and simulcams, which superimposed virtual characters over live-action in real time. Peter Jackson’s The Lord of the Rings: The Fellowship of the Ring (2001) and James Cameron’s Avatar (2009) are some of the earliest examples of these VP technologies in Hollywood filmmaking.

Convergence of circumstances

After the initial breakthroughs, real-time CG, AR, green screen, and other live graphics started becoming commonplace in the late 2000s. With advancements in visual and consumer technologies, the public could have AR in their pockets – from Google Maps Live View to Pokémon GO.

Gaming technologies also improved to the point where graphics started becoming truly immersive. GPUs and graphics cards have become faster and more powerful, and game engines like Unreal Engine and Unity have made significant progress. Released by Epic Games in 1998, Unreal Engine has evolved from being a game rendering technology into a real-time graphics engine that pushes the boundaries of photorealism across industries.

As a result of these technological developments, streamed content, social media networks, and gaming apps emerged as challengers to traditional cable TV, creating a shared social experience – and mainstream content creators had to push their boundaries to keep up.

From the earliest applications of AR in live broadcast, we now see this technology used across all forms of sports broadcasting. In addition to enhancing viewer engagement, as with AR renditions of NFL team mascots, graphics can also be used to attract brand investment, as with the Halo Infinite x Oregon Ducks AR experience.

Virtual studio techniques, such as those used on the Weather Channel, can bring something as simple as a weather report to life. Viewers can be fully immersed in the current weather conditions as presenters are thrust into the middle of a storm. With live 3D graphics overlaid onto a green screen, we can watch the talent interacting in real-time with the virtual environment around them, enhancing our engagement.

The rise of XR solutions, which use large LED screens to display virtual environments controlled in real time, has made virtual production mainstream. Popularized by the LED “Volume” made famous by The Mandalorian (2019), LED screens have been used increasingly across Hollywood productions, such as The Batman (2022), as well as in the advertising world, as showcased in a Nissan Sentra car commercial.

Broadcasters are also turning towards XR, with China Central Television and the BBC integrating it into workflows.

The future of media production

There is no doubt that soon, virtual production will be regarded as a standard part of media production.

Real-time tools are lowering the entry bar for all types of content creators outside of typical production and education environments. Ideation and creation stages are getting shorter, and hardware is becoming less expensive, which has enabled VP to become far more accessible.

Demand for new talent is growing, with universities and other educational institutions implementing virtual production into their curricula. This is something that we are addressing with the Pixotope Education Program.

Outside of the broadcast world, use cases will vary from educational content creation, online marketing and sales, to influencers, streamers, and more. Many VFX studios have already started integrating real-time CG into their workflows, and we’re sure to see more independent studios producing content using VP. Particularly exciting for storytellers and viewers is how next-level VP techniques like real-time data input can enhance viewer engagement. 

In terms of other transformative technology, we’ll likely see VP workflows move further into the cloud. The Pixotope Live Cloud, an industry first in VP cloud-based platforms, gives users a pay-as-you-go service that is always available. This enables easy up-scaling and down-scaling, helping to effortlessly create segmented content, such as developing variations of sections that employ different graphics. When anyone can create and control AR graphics live using a cell phone, it offers infinite opportunities for creativity.

But there is still more progress to be made, and VP is far from its endpoint. In the short-term, we will see advancements in hardware and software:

  • Graphics cards will become more powerful.
  • Complexity of systems will be reduced.
  • Calibration and line-up will be simplified across VP workflows, such as for camera tracking in LED stages.

In the long-term, the evolution of AI could see it help VP workflows by acting as a virtual film crew or providing intelligence to AR asset performances and personalities.

We believe virtual production is the future of media production, and there’s no better time to take advantage of this industry shift. While there is plenty to come in the future of virtual production, current innovations and accessibility mean that it is the perfect time to take the leap into using VP in your broadcast or live workflow. 

Column: Virtual production camera tracking and what you need to know
October 26, 2022 | https://www.newscaststudio.com/2022/10/26/column-virtual-production-camera-tracking/

Virtual production is booming, and for good reason. It’s being used more and more by filmmakers, broadcasters, and pretty much anyone who wants to create immersive experiences with real-time virtual content. By combining the flexibility of virtual worlds with the ability to visualize content earlier in the production process, studios are given new possibilities to realize and refine their visions.  

Virtual production relies heavily on real-time camera tracking, which synchronizes live-action footage with the virtual camera so that 3D data and live-action footage can be seamlessly integrated.

In virtual production, there are many methods to create virtual environments in which live talent can interact. For example:

  • Augmented reality (AR): Real-time overlaying of virtual objects in real-world environments.
  • Virtual studios (green screen): Allows real objects and real people (actors) to interact in real-time with computer-generated environments and objects seamlessly.
  • Extended reality (XR): Using large LED volumes to create virtual environments where real-life talent can interact.

Regardless of the method you choose for your virtual production project, quality camera tracking is essential for your toolkit. In addition, as processing power and software capabilities have improved, camera tracking has become more affordable and easier to implement. 

So, what is camera tracking and how can you use it in your next project? Well, here’s a breakdown of some options.

Mechanical vs. optical camera tracking

The first thing to note is that camera tracking systems fall into two categories: mechanical and optical. 

With their long history of use in live broadcast AR, mechanical tracking systems are still considered to be dependable and are the mainstay of many sports and live events, especially in larger arenas.

With sensors placed on the camera rig, mechanical systems deliver precise camera movement data, including position and orientation, while additional information from the camera itself details the lens parameters. This ensures that virtual objects appear as part of the real environment, and vice versa, throughout every camera movement.
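
As a rough illustration of the kind of data such a system reports and how a render engine might consume it, here is a minimal sketch. The field names, axis conventions, and units are assumptions made for the example, not any vendor’s actual protocol (real protocols such as FreeD define their own layouts):

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class TrackingSample:
    """One tracking reading. Field names and units are hypothetical."""
    pan_deg: float
    tilt_deg: float
    roll_deg: float
    x_m: float
    y_m: float
    z_m: float
    zoom: float    # raw lens encoder value, mapped to field of view elsewhere
    focus: float   # raw focus encoder value

def camera_to_world(s: TrackingSample) -> np.ndarray:
    """Build a 4x4 camera-to-world transform: pan about Z, tilt about X,
    roll about Y (one common convention among several), then translate."""
    p, t, r = np.radians([s.pan_deg, s.tilt_deg, s.roll_deg])
    Rp = np.array([[np.cos(p), -np.sin(p), 0],
                   [np.sin(p),  np.cos(p), 0],
                   [0,          0,         1]])
    Rt = np.array([[1, 0,          0        ],
                   [0, np.cos(t), -np.sin(t)],
                   [0, np.sin(t),  np.cos(t)]])
    Rr = np.array([[ np.cos(r), 0, np.sin(r)],
                   [ 0,         1, 0        ],
                   [-np.sin(r), 0, np.cos(r)]])
    M = np.eye(4)
    M[:3, :3] = Rp @ Rt @ Rr          # combined orientation
    M[:3, 3] = [s.x_m, s.y_m, s.z_m]  # position in studio space
    return M

sample = TrackingSample(15.0, -4.0, 0.0, 2.5, 1.7, 3.0, 0.42, 0.58)
print(camera_to_world(sample))
```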

Thanks to the sensor-based nature of mechanical tracking, you can use these systems in situations where optical tracking isn’t possible. In locations where you cannot place markers, or where there are few identifiable natural markers (see below for more details), mechanical systems are ideal: a putting green in golf, for example, which is entirely green.

However, mechanical systems can require significant setup time and they require regular realignment. It is also necessary for the camera rig to have a fixed origin from which the movement of the mechanism can be referenced, limiting flexibility. 

Optical camera tracking systems, on the other hand, give far more freedom to move the camera around. Cameras with optical tracking can be moved freely; you can even use a Steadicam.

As the tracking sensor typically points upwards or downward rather than into the set, it is unaffected by studio conditions such as lighting configurations, reflections, and plain green screens. Setup can also be relatively quick and easy. Optical systems come in a couple of different flavors.

Marker-based camera tracking

As you probably guessed, marker-based systems rely on markers placed around the scene or, at least, in view of the sensor. They are used as points of reference to calculate the exact position and orientation of the camera.
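
The underlying math is a perspective-n-point (PnP) problem: given the known 3D positions of a few markers and where they appear in the sensor’s image, solve for the camera pose. Below is a minimal sketch using OpenCV; the marker coordinates, pixel detections, and intrinsics are made-up illustrative values, and real tracking systems add calibration, filtering, and redundancy on top:

```python
import numpy as np
import cv2

# Known 3D positions of four floor markers (meters, studio coordinates).
# All values here are illustrative, not from any real calibration.
marker_world = np.array([[0.0, 0.0, 0.0],
                         [1.0, 0.0, 0.0],
                         [1.0, 1.0, 0.0],
                         [0.0, 1.0, 0.0]], dtype=np.float64)

# Where the tracking sensor detected those markers in its image (pixels).
marker_image = np.array([[642.0, 388.0],
                         [1010.0, 402.0],
                         [995.0, 730.0],
                         [655.0, 715.0]], dtype=np.float64)

# Camera intrinsics from a prior lens calibration (also illustrative).
K = np.array([[1400.0,    0.0, 960.0],
              [   0.0, 1400.0, 540.0],
              [   0.0,    0.0,   1.0]])

# Solve the perspective-n-point problem for the camera pose.
ok, rvec, tvec = cv2.solvePnP(marker_world, marker_image, K, None)
R, _ = cv2.Rodrigues(rvec)           # rotation vector -> 3x3 matrix
camera_pos = (-R.T @ tvec).ravel()   # camera position in world coordinates
print(ok, camera_pos)
```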

Markers can take several different forms. One form is a pre-calibrated coded floor, where ‘absolute’ markers remain in the same position. As the markers are already in place on the roll-out floor, no calibration is needed, so setup is quick and easy. 

Using absolute visual markers ensures a clear and absolute reference, quick initialization, and interference-free tracking. They also require a lot less horsepower than markerless systems, or systems that have ‘random’ instead of absolute markers, making them more affordable than alternative options. A pre-calibrated marker floor can be used both indoors and outdoors, so it’s incredibly versatile, and a budget-friendly way to get started with virtual production. 

Alternatively, you can combine absolute markers with carefully placed relative markers to pinpoint the camera’s position with the same camera and sensor setup. 

To counter low or variable lighting conditions, infrared retroreflective markers reflect infrared (IR) light from an IR lamp back onto the sensor, ensuring that the sensor is unaffected by low or changing light while remaining hidden to the human eye and the main camera itself.

Setup is painless as well: you just need to install the markers and carry out a simple self-calibration process. However, because the markers must be individually placed, it’s best to keep this method for studio-based productions.

Markerless camera tracking  

A markerless camera tracking system uses real features in the natural environment instead of artificial markers, which gives it greater flexibility. Using a sensor on the camera, the scene is captured and analyzed in real-time, generating 3D feature points that are then used instead of artificial markers. Once the 3D points are generated, an automated tracking algorithm then calculates the camera’s position and parameters. 
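
A toy version of that pipeline: detect natural features in two frames, match them, and recover the relative camera motion. The sketch below uses OpenCV’s ORB features and essential-matrix pose recovery on synthetic frames; it is a simplified stand-in for production-grade tracking, which fuses many frames, sensors, and filters:

```python
import numpy as np
import cv2

# Two synthetic "frames": a random texture and a shifted copy, standing in
# for consecutive frames from the tracking sensor.
rng = np.random.default_rng(7)
frame1 = (rng.random((480, 640)) * 255).astype(np.uint8)
shift = np.float32([[1, 0, 8], [0, 1, 3]])           # 8 px right, 3 px down
frame2 = cv2.warpAffine(frame1, shift, (640, 480))

# 1. Detect natural feature points -- no artificial markers.
orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(frame1, None)
kp2, des2 = orb.detectAndCompute(frame2, None)

# 2. Match features between the two frames.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(des1, des2)
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# 3. Recover the relative camera motion (intrinsics are illustrative).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
E, inlier_mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
print(len(matches), "matches; translation direction:", t.ravel())
```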

Markerless systems give you creative freedom, as you can move the camera without restriction, and they are far more practical for use in outdoor or large indoor spaces.

Markerless camera tracking technology can, in certain scenarios, also be used Through The Lens (TTL) without the need for a sensor. In this case the tracking uses the video stream from the camera to determine the camera’s position in 3D space in real-time. This is incredibly powerful for applications such as drones, wire cams and action cameras where adding sensors and/or markers is impractical. 

Tracking in LED volumes

One of the challenges with virtual production using large “wraparound” and multi-plane (floor/ceiling) LED volumes is that there are very few natural reference points for tracking to pick up on, and nowhere to place markers. In this scenario, we need a different solution.

With the high refresh rate of modern LED volumes, one such solution is to “hide” the tracking markers in the LED volumes themselves. By using absolute marker patterns, and displaying them on the LED volume between the frames captured by the cameras, we can achieve highly accurate and reliable optical tracking. We can go one step further by also displaying the negative of the tracking markers, making the patterns invisible to the human eye, and therefore not a distraction for on-set talent.
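
To make the timing concrete, here is a sketch of the sub-frame scheduling idea. The numbers are entirely illustrative assumptions (refresh rate, frame rate, and shutter window), and real systems handle this in genlocked hardware rather than software:

```python
# Illustrative sub-frame schedule for "hidden" tracking markers.
LED_REFRESH_HZ = 7680                      # assumed LED refresh rate
CAMERA_FPS = 60                            # assumed camera frame rate
SUBFRAMES = LED_REFRESH_HZ // CAMERA_FPS   # 128 sub-frame slots per frame

def slot_content(slot: int, shutter_open: range) -> str:
    """What the LED volume shows in each sub-frame slot of one camera frame.

    While the camera shutter is open, the wall shows scene content only.
    Outside the shutter window it alternates the marker pattern with its
    negative, so the pair averages out and stays invisible to the eye."""
    if slot in shutter_open:
        return "scene"
    return "markers" if slot % 2 == 0 else "inverse-markers"

shutter = range(0, SUBFRAMES // 2)   # e.g. a 180-degree shutter
schedule = [slot_content(s, shutter) for s in range(SUBFRAMES)]
print(schedule[:4], "...", schedule[-4:])
```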

Regardless of your chosen method, camera tracking is an integral part of virtual production workflows. With virtual production continuing to grow in popularity as a means of creating content, it’s imperative that the right tools are available to meet the needs of each production.

Column: New roles in virtual production and how to get them
August 23, 2022 | https://www.newscaststudio.com/2022/08/23/virtual-production-jobs/

From the types of jobs on offer, to what skills and experience you need to join them, here’s your essential primer in joining the rapidly evolving virtual production industry.

Virtual production (VP) is rapidly becoming a popular approach across film and TV, live events, and immersive media experiences. Not only are there practical benefits – saving both time and cost when compared to expensive location shoots – but it also opens up a whole new world of creative possibilities for any kind of media production.

With the industry rapidly evolving, roles in VP are becoming increasingly in demand, opening up a range of opportunities for both newcomers and established industry professionals. In this article, we’re sharing some examples of key roles in VP and discussing tips on breaking into the industry as a newcomer.

How to get a job in virtual production

With these new roles constantly evolving, it’s clear that there is plenty of opportunity to get involved with the ever-expanding virtual production world. But how do you actually get started with a career in virtual production?

As with any industry, networking is a good place to start. However, it’s also a good idea to familiarize yourself with the virtual production pipeline. Having an understanding of each element of the production, even those you’re not directly involved with – such as video I/O, real-time animation, technical support, and stage skills – will give you an advantage in creating a seamless workflow. And if you work on the creative side, then a showreel is essential for showcasing your work.

Teamwork and communication are particularly important and often undervalued skills for the virtual production environment, says Erik Beaumont, head of mixed reality (MR) at The Famous Group (TFG), a production company specializing in virtual and augmented reality for sports and live events. 

Erik also highlights the advantage of having a traditional production background. “Production timelines and deadlines are stacked slightly differently. We emphasize different things like how the dailies work and cutoff dates for approvals, but if you have the basic toolset, you’ll adjust fairly seamlessly,” says Erik.

He also advises that knowing your way around a real-time engine will put you in a good position, and that this is not as hard as it sounds. “Pick a game engine of choice and just do something in it. People imagine a high threshold of how complicated it is, but the reality is different,” says Erik.

Spanish 3D artist Fernando Trueba Llano, a technical artist at Televisión Canaria, agrees that real-time engines are a good starting point for those who want to get into virtual production but don’t know where to start. 

“My recommendation would be to start studying the basics of 3D and focus on Unreal Engine,” says Fernando. “Dedicate personal time to dig in and learn as much as possible. […] And finally, put into practice what you have learned and create demos of your capabilities with the engine.”

New roles in virtual production

If you’re interested in gaining new skills in VP from a traditional production background, or if you’d like to transfer your experience in gaming, VFX, or real-time engines, here’s a taster of some of the new jobs and skills currently needed in the ever-evolving virtual production industry.

Virtual Production Supervisor

The Virtual Production Supervisor is a leadership role, taking charge of the entire virtual production set. People in this role can expect to act as a bridge between different sections of the crew. This requires strong people skills – and you’ll also need an excellent understanding of the tech involved. This is an ideal role for producers, or those with a background in VFX, such as a VFX supervisor.

Volume Operator

A virtual production volume – otherwise known as an XR volume or VP stage – is made up of many large LED panels that display media content. This can be anything from stills and 2D footage to real-time 3D graphics generated by a game engine.

These volumes need someone to oversee the content piped into the screens, which is where the Volume Operator comes in. Often working with assistants, depending on the production size, the Volume Operator is usually someone with experience in creating real-time graphics in game engines such as Unreal Engine or Unity.

LED Engineer

As well as having someone to operate the volume, virtual productions require someone to manage and maintain the screens themselves. LED panels are a key piece of the rig in a VP environment so it is essential to have an LED Engineer, otherwise known as an LED Technician, on-set to ensure they work correctly and consistently.

LEDs have been used across live events and broadcasts for some time, so many AV technicians will find they already have the required skills to become an LED Engineer. Other relevant skills include experience in motion capture and camera tracking systems, plus knowledge of Unreal Engine and other real-time tools.

VFX Supervisor

In the visual effects industry, VFX supervisors take charge of an entire project, managing the people, workflows, and timelines for delivery. This is not an entirely new role, but has evolved within the virtual production world. 

In VP, a VFX Supervisor is responsible for all the visual effects on a real-time production. This role works across the creative side but also has a solid understanding of the technical aspects involved, especially newer innovations like real-time technology. 

Motion Capture Supervisor 

Another leadership role, this one focuses on motion capture. Like the VFX Supervisor, this role has existed for some time but has now expanded to incorporate the real-time environment. Again, working knowledge of motion capture and the ability to work with camera tracking systems is essential.   

Virtual Camera Operator

As the name suggests, this person operates a virtual camera using an external device such as a tablet. A good understanding of cinematography is important, as well as the ability to keep up to date with all the latest equipment and software on offer. This is also an excellent role for people with experience in creating computer-generated animations using game engines. 

Engine Operator

Real-time game engines, such as Unreal Engine and Unity, are becoming an increasingly essential part of the virtual production pipeline. The Engine Operator is responsible for operating the real-time engine being used in the production.

Disguise acquires Meptik, virtual production creative studio
July 12, 2022 | https://www.newscaststudio.com/2022/07/12/disguise-acquires-meptik-virtual-production-creative-studio/

Disguise has acquired Meptik, the Atlanta-based immersive entertainment agency.

Both companies will continue to operate as separate entities (for the time being), but the acquisition signals Disguise’s larger move into the world of content creation, having also recently purchased Polygon Labs.

Meptik, led by Sarah Linebaugh and Nick Rivero, specializes in virtual production with clients ranging from film studios to lifestyle brands, Fortune 500s and musical artists.

Last fall, Meptik helped the “NHL on TNT” launch with a unique projection-mapping effect on the set’s scenery powered by Disguise. The studio has gone on to win NewscastStudio’s Broadcast Production Award for Studio Technology.

“We have always prided ourselves on our ability to service both sides of the xR equation – the creative and the technical – and we love creating dynamic virtual worlds that blow audiences away,” said Nick Rivero, co-founder of Meptik. “Joining forces with disguise will enhance our ability to continue to serve our clients from start to finish while maintaining our down-to-earth spirit.”

“Meptik has been a trusted partner of disguise for many years, growing to excel at bringing out the very best capability for disguise solutions. Given that we have both been, in the past two years, bringing extended reality and immersive entertainment to the world, this acquisition would expand on these efforts and lead the next era for extended reality and metaverse experiences,” said disguise CEO Fernando Küfer.

The full terms of the deal were not released.
