TV News Virtual Set and Studio Design News and Examples
https://www.newscaststudio.com/category/tv-news-virtual-set-design/

Column: The past and present of virtual production—and where it could go next
https://www.newscaststudio.com/2022/11/28/column-where-virtual-production-will-go/
Mon, 28 Nov 2022 12:55:19 +0000

The post Column: The past and present of virtual production—and where it could go next appeared first on NewscastStudio.

When you hear the phrase “Virtual Production,” what do you envision?

Generally, we think of it as an amalgamation of real-time production techniques that combine physical subjects with digital “magic” to create eye-catching content.

But while VP has become a buzzword in recent years, it’s not entirely new. It wasn’t always known as “virtual production,” either. In fact, the earliest form of what we now call VP appeared 75 years ago in British cinema, when production designer David Rawnsley began using back projection in his ‘Independent Frame’ system, pre-planning extensive storyboards and models along with camera and talent movements. While Independent Frame never truly caught on, the principles of virtual production remain the same today: planned pre-visualization, tracking of camera and talent movements, and real-time simulation of environments and assets.

A recent history

However, the history of virtual production that we will focus on begins at the turn of the 21st century. Once digital editing entered the picture, filmmakers could transcode film to digital media and edit material even while it was still being shot. This eventually evolved into on-set editing, and then into adding VFX from personal desktops. As a result, rigid feedback pipelines became agile, fluid, and accessible in real time.

A glowing blue puck was used on Fox Sports’ live ice hockey broadcast in 1996 to make the puck more visible on screen, marking the first use of augmented reality (AR) during a live sporting event. This ‘FoxTrax’ system was billed by Fox as “the greatest technological breakthrough in the history of sports,” bringing live AR into the homes of millions of sports fans.

Just a couple of years later, in 1998, an NFL live broadcast displayed a yellow line to denote a ‘first down line’ location on the field. This was quietly revolutionary. Before then, sports fans would squabble amongst themselves over whether their team had earned a first down until they heard confirmation from the commentator or referee. With this simple yellow line, the NFL changed the way we watch the game. It offered real-time feedback to viewers about the state of play in a way that not even in-person spectators could see – and marked a turning point for the use of virtual production in live broadcasts.

Later, advancements in filmmaking technologies led to the use of virtual cameras for pre-planning camera moves, and of simulcams, which superimposed virtual characters over live action in real time. Peter Jackson’s The Lord of the Rings: The Fellowship of the Ring (2001) and James Cameron’s Avatar (2009) are among the earliest examples of these VP technologies in Hollywood filmmaking.

Convergence of circumstances

After these initial breakthroughs, real-time CG, AR, green screen, and other live graphics started becoming commonplace in the late 2000s. With advancements in visual and consumer technologies, the public could have AR in their pockets – from Google Maps Live View to Pokémon GO.

Gaming technologies also improved to the point where graphics started becoming truly immersive. GPUs and graphics cards have become faster and more powerful, and game engines like Unreal Engine and Unity have made significant progress. Released by Epic Games in 1998, Unreal Engine has evolved from being a game rendering technology into a real-time graphics engine that pushes the boundaries of photorealism across industries.

As a result of these technological developments, streamed content, social media networks, and gaming apps emerged as challengers to traditional cable TV, creating a shared social experience – and mainstream content creators had to push their boundaries to keep up.

From the earliest applications of AR in live broadcast, we now see this technology used across all forms of sports broadcasting. In addition to enhancing viewer engagement, as with AR versions of NFL team mascots, graphics can also be used to incentivize brand investments, as in the Halo Infinite x Oregon Ducks AR experience.

Virtual studio techniques, such as those used on the Weather Channel, can bring something as simple as a weather report to life. Viewers can be fully immersed in the current weather conditions as presenters are thrust into the middle of a storm. With live 3D graphics overlaid onto a green screen, we can watch the talent interacting in real-time with the virtual environment around them, enhancing our engagement.
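To make the underlying idea concrete, the core of any green-screen composite is a chroma key: pixels close to the key color are replaced with the rendered virtual environment, while everything else (the talent) is kept from the camera frame. The sketch below is a minimal illustration in NumPy, not any broadcaster’s production keyer; the function name and the per-channel tolerance are illustrative, and real keyers also handle soft edges, spill suppression, and lighting variation.

```python
import numpy as np

def chroma_key(frame, virtual_bg, key=(0, 255, 0), tol=80):
    """Replace near-`key`-colored pixels in `frame` with `virtual_bg`.

    frame, virtual_bg: H x W x 3 uint8 RGB arrays of equal shape.
    tol: per-channel distance below which a pixel counts as green screen.
    """
    diff = frame.astype(np.int16) - np.array(key, dtype=np.int16)
    mask = np.all(np.abs(diff) < tol, axis=-1)   # True where the screen shows through
    out = frame.copy()
    out[mask] = virtual_bg[mask]                 # drop in the virtual environment
    return out

# Tiny synthetic example: a 2x2 "camera frame" with two green-screen pixels.
frame = np.array([[[0, 255, 0], [200, 50, 50]],
                  [[10, 240, 5], [30, 30, 30]]], dtype=np.uint8)
bg = np.full((2, 2, 3), 128, dtype=np.uint8)     # flat gray stand-in for a virtual set
comp = chroma_key(frame, bg)
```

In a live pipeline this per-pixel decision runs on the GPU at full frame rate, with the virtual background re-rendered every frame from the tracked camera’s viewpoint.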

The rise of XR solutions, which use large LED screens to display virtual environments controlled in real time, has made virtual production mainstream. Popularized by “The Volume” stage of The Mandalorian (2019), LED screens have been used increasingly across Hollywood productions, such as The Batman (2022), as well as in the advertising world, as showcased in a Nissan Sentra commercial.

Broadcasters are also turning towards XR, with China Central Television and the BBC integrating it into their workflows.

The future of media production

There is no doubt that virtual production will soon be regarded as a standard media production practice.

Real-time tools are lowering the entry bar for all types of content creators outside of typical production and education environments. Ideation and creation stages are getting shorter, and hardware is becoming less expensive, which has enabled VP to become far more accessible.

Demand for new talent is growing, with universities and other educational institutions implementing virtual production into their curricula. This is something that we are addressing with the Pixotope Education Program.

Outside of the broadcast world, use cases will vary from educational content creation, online marketing and sales, to influencers, streamers, and more. Many VFX studios have already started integrating real-time CG into their workflows, and we’re sure to see more independent studios producing content using VP. Particularly exciting for storytellers and viewers is how next-level VP techniques like real-time data input can enhance viewer engagement. 

In terms of other transformative technology, we’ll likely see VP workflows move further into the cloud. The Pixotope Live Cloud, an industry-first cloud-based VP platform, gives users a pay-as-you-go service that is always available. This enables easy up-scaling and down-scaling, helping to effortlessly create segmented content, such as variations of sections that employ different graphics. When anyone can create and control AR graphics live using a cell phone, it offers infinite opportunities for creativity.

But there is still more progress to be made, and VP is far from its endpoint. In the short term, we will see advancements in hardware and software:

  • Graphics cards will become more powerful.
  • Complexity of systems will be reduced.
  • Calibration and line-up will be simplified across VP workflows, such as for camera tracking in LED stages.

In the long term, the evolution of AI could help VP workflows by acting as a virtual film crew or by adding intelligence to AR asset performances and personalities.

We believe virtual production is the future of media production, and there’s no better time to take advantage of this industry shift. While there is plenty to come in the future of virtual production, current innovations and accessibility mean that it is the perfect time to take the leap into using VP in your broadcast or live workflow. 

Industry Insights: Where is augmented reality in broadcast headed?
https://www.newscaststudio.com/2021/12/14/augmented-reality-broadcasting-workflows-adoption/
Tue, 14 Dec 2021 07:45:29 +0000

Augmented reality, immersive mixed reality and virtual sets continue to see increased adoption, moving beyond special events like election night and the Olympics. 

These tools continue to gain realism and ease of use with new workflows and integrations, allowing broadcasters of any size to leverage them. 

In part one of our Industry Insights roundtable, we speak with leaders from the motion graphics production solutions industry along with key design firms about augmented reality’s future, workflows and remote production implications. 

Where are we headed with augmented reality (AR) workflows in broadcast?

“As an industry, the groundwork has been laid for AR productions but now is the time to make this transformative technology accessible to all sizes of productions and budgets. Therefore, we are likely to see AR workflows move to the cloud, alongside many other elements of broadcast,” said Gerhard Lang, CTO for Vizrt.

“In the short term we will continue to see increased adoption in live event and broadcast production as the tools become more accessible to a large user base. In the mid to long term we will see these tools enable new types of media experiences that leverage the online and interactive nature of AR workflows,” said Marcus Blom Brodersen, the CEO of Pixotope.

“There are two ways we see this go. AR will be used as often as any 2D graphics the studios employ today for the general industry. This will be the standard. For those who already are in AR and virtual studio world, we’re headed towards a future where AR and mixed reality are becoming increasingly sophisticated and used in more creative ways than ever before,” answered Onur Gulenc, a territory manager for Zero Density.

“From a workflow perspective, the long ideation and creation stages are getting shorter and shorter whilst the hardware to deliver is getting less expensive, therefore it is becoming far more commonplace than it was even six years ago,” said Paul Jamieson, a VP of creative for AE Live.

“For a long time, companies, including Silver Spoon, were trying to figure out what kind of content could use AR purposefully. With applications like Maya, Cinema 4D, and Houdini significantly improving their integration in-engine, broadcasters are now using their favorite tools as part of their graphics workflow. Instead of finishing them in a traditional render pipeline, they can start to use Unreal Engine and do the same job in real-time,” said Dan Pack, managing director at Silver Spoon.

“AR is now a given but expectations in terms of quality are also higher. Having engines like Unreal helps us produce stunning photorealistic effects but the key is to adapt Unreal to broadcast and ‘democratize’ the use of AR for broadcasters and media producers. As such, our focus has always been and will continue to be simplicity,” replied Boromy Ung, senior director of product management for Ross Video.

“The process for bringing AR on air is accelerating and becoming less cumbersome with the advent of cloud-based AR platforms. When segment producers and other non-designers are empowered to create AR graphics and presenters can control the graphics live, using a cell phone, it completely transforms the workflows,” said Yaron Zakai-Or, co-founder and CEO at Arti.

“Virtual and AR technology is now mature enough to represent a significant part of the content creation portfolio of most production companies, including broadcasters, due to the changes the digital age has driven on how content is produced and consumed. The integration of virtual production in broadcast automation is increasingly important, so the production equipment that creates such content are no longer islands and become part of a larger, integrated and collaborative workflow,” said Miguel Churruca, marketing and communications director for Brainstorm.

“Augmented reality has begun to morph into many different types of broadcast integration. As the quality of the software-based render engine’s capacity has increased and the hardware as well in the form of graphics cards and CPUs, we have seen the ability for broadcasters to create set-dressing without the need for physical builds to achieve realistic integrations. This is now commonplace on shows like SportsCenter when they simulate set monitoring with AR elements. This has led the way for new terms like Mixed Reality and Extended Reality to take hold in our industry,” said Nathan Marsh, creative and managing director of Girraphic.

What has changed in the past few years?

“In a word…Everything. Epic and the Unreal Engine have taken our industry by storm. As the talent pool of designers and technical artists increases in this software platform, we are seeing a shift in the photorealistic quality capable in broadcasts. The team at Epic has been busy integrating and partnering with broadcasters and creative companies and continues to lead the industry in research and development of new products for film, television, broadcast, interactive & experiential displays,” said Marsh.

“Over the past few years, real-time engines have become an integral part of studios’ pipelines, and are no longer just an add-on or afterthought. We started as a VFX studio, relying on traditional pipelines, but now, using Unreal is our first option. It allows us to iterate quicker, deliver content quicker, and it allows us to be more collaborative because we can see things all at once – real-time pipelines are now a necessary component in the creative industries,” said Pack.

“AR has become mainstream and is present in nearly all major broadcast events. The use of AR in the 2016 US elections was a milestone with all key networks utilizing AR elements as a storytelling tool. With the growing demand for AR, the solutions became cheaper and easier to set up. Game engines like Unreal became a more mainstream part of major productions as well. Tracking systems became smaller and easier to set up too,” Lang noted.

“From a technology perspective, the quality and availability of game engine technology, GPUs, cloud infrastructure, LED technologies, video over IP and modern software development methodology has greatly increased the development and adoption of Virtual Production. Combined with the pandemic and heavy pressure on traditional broadcast to reinvent their content, this has created a perfect storm for Virtual Production growth,” Brodersen said.

“Hardware advances like Nvidia’s DLSS and ray-tracing have made CG elements more realistic than ever. The hardware is catching up with the software: With the right setup, you can now have hyperreal graphics on live broadcasts in UHD, all without any dropped frames,” said Gulenc.

“I think we’re now at the stage that it has become far more commonplace and not just from a budgetary or space perspective. Producers have embraced this tech and its almost limitless possibilities and there’s a genuine desire now from production and creative teams to push the boundaries whereas previously it may have been more the technology teams pushing this innovation,” Jamieson told us.

“The big thing that has happened in the past couple of years is the rise of virtual LED environments. Obviously, the big studios have been making extensive use of LED environments especially in the context of coronavirus as this allows them to reproduce a particular scenery without having to be physically on-site. But we see this now gaining a lot of traction with broadcasters as well, as this allows the talent to visualize the virtual environment they are supposed to be interacting with while still providing the same benefits as virtual sets particularly around production flexibility,” Ung said.

“The technology has been democratized—it’s easier to use, more affordable, and more flexible than ever. At the same time, audiences have raised their expectations for on-air graphics, since they’re used to seeing AR in their social apps, video games, etc. It’s really a perfect storm, or a convergence,” stated Zakai-Or.

“The pandemic has consolidated and accelerated the virtual technology revolution. Virtual and remote solutions are here to stay, so taking advantage of these will only help in satisfying an ever-growing demand for content. The traditional way of doing business in television has been seriously impacted by the pandemic, with production dropping all around the world, travel restrictions applied, and many other related issues. However, the situation also gave rise to opportunities for all kinds of virtual production, from remote shooting to virtual events,” Churruca said.

“The industries of live events, trade shows, concerts, sport, and election broadcasts have all begun to blend together as the creative minds in our industry continue to push the envelope and shatter barriers. Couple that with the digital and NDI revolutions and we are looking at multiple content delivery platforms working in unison and technology, software and hardware that is interchangeable. Onward to the metaverse we charge. Look no further than Facebook changing their company name to Meta to lend credence to that fast-approaching reality. Don’t get me started on the potential use cases for NFTs,” added Marsh.

How has remote production (largely driven by the pandemic) changed your thinking on AR or virtual?

“Remote access through the different solutions has become so efficient that the work experience doesn’t differ much whether you are on-site or logged in remotely. Assisting customers during setup remotely is also easier. For monitoring purposes, the on-prem signal can be converted to an IP format which can be transmitted with low latency and exceptional quality. Using NDI 5 is great for all types of monitoring, with the quality allowing productions to even do quality control assessments on the keying. We’ve also been able to utilize AR technology in remote productions to teleport interview partners into studios,” said Lang.

“We’re seeing a clear trend to centralize production and the need to rationalize the operations of the production centers, leading to adoption of video over IP and virtualized infrastructure. Virtual production, being virtualized by nature, lends itself especially well to being deployed in this type of environment, and offers the possibility to be fully automated and connected to remote camera operations,” explained Brodersen.

“Broadcasters were already starting to use AR and virtual graphics before the pandemic: the coronavirus just sped everything up. The pandemic has democratized AR and virtual graphics, making them more accessible throughout the industry. Broadcasters can now even control all their virtual studio graphics directly from an iPhone,” said Gulenc.

“We were fortunate in that we had been working in a remote production environment in other regions for some time ahead of the pandemic. This served us well as our applications were optimized to work across a remote workflow. For example we regularly reproduce multiple in-game AR feeds downstream by taking time accurate camera data back to the remote hub,” Jamieson said.

“REMI workflows, referring to a remote integration model, allow our machines to stay on-site while our operators are sitting back in New York. AR teams are highly specialized, so instead of sending four groups made up of an operator, technician, and Unreal Engine artist to multiple locations in the country, it’s possible to have everybody in a central location. There’s next to no latency because the on-site machines are doing the rendering – we can actually be more efficient because we don’t need to travel, completing projects back-to-back,” Pack noted.

“AR is just one element of the entire production chain so to a certain extent, working in a remote production context is no different than doing the same with the rest of the production chain. Remote control and monitoring capabilities are supported as they are with the rest of the Ross ecosystem,” said Ung.

“The impressive quick pivot to remote production really confirmed what we had been seeing before the pandemic—namely that broadcasters want and need the flexibility to produce high-quality AR both inside and outside the studio. They no longer want to be tethered to a server room or solely reliant on designers with specific AR expertise. They’re used to more nimble workflows now with cloud production, and they aren’t looking back,” Zakai-Or said.

“I wouldn’t say that the pandemic itself has changed perspective on virtual usage cases for us as a company as much as it has for production in general across a multitude of industries,” said Marsh. “Virtual production tools and Volumetric LED stages has gone absolutely crazy, and everyone is racing to get ahead of the curve from Lux Machina to MGM and their construction of The Sphere amongst many other companies… No longer do producers and directors have to deal with outside elements to create beautiful cinematic shots and the immediate gratification of virtual production gives way to what you see is what you get realm for high budget shoots. When you can look at the composited vision immediately on set and in-camera and manipulate the environment, lighting, actors and set elements in a controlled manner without having to restage a physical shoot it becomes clear that this is the only way forward.”

“Our users have been creating advanced AR and virtual content for decades, and the global increase in the quantity and quality of AR and virtual content produced by content creators of any kind confirms that our vision was correct and encourages our company to continue delivering the most advanced technologies to comply with any client’s requirements, no matter how complex they may be,” Churruca responded.

“As far as remote production and the pandemic in relation to our company, what it did for us was give us a locked-down sandbox for a season of sport in the ISL where we were allowed to hone in our implementation without the logistics of setup and teardown after each event. At the ISL event in Budapest, we were quarantined between a hotel and the venue for 9 weeks with days off in between events with a fully rigged event space. When we coupled that with the boredom of being locked down what we saw was an acceleration of ideas and technical integrations surrounding the use of augmented reality as a bridge between races and in the broadcast in general. This was far and away the most rapidly advancing broadcast integration we ever had the chance of working on as a company,” added Marsh.

Participants

Gerhard Lang, Vizrt
Marcus Blom Brodersen, Pixotope
Onur Gulenc, Zero Density
Paul Jamieson, AE Live
Dan Pack, Silver Spoon
Boromy Ung, Ross Video
Yaron Zakai-Or, Arti
Miguel Churruca, Brainstorm
Nathan Marsh, Girraphic

Column: Virtual production is the future. Here’s why.
https://www.newscaststudio.com/2021/03/22/virtual-production-is-the-future/
Mon, 22 Mar 2021 12:45:39 +0000

Everyone’s talking about virtual production these days, and for good reason – it removes all the limits broadcasters used to face. Recently, Deloitte released a report on why virtual production is the future of content creation, breaking down the drivers of its explosive growth into four main parts. In this article, we’ll explore some of these ideas and share what we’ve learned as a team that’s been working with virtual production workflows longer than most.

The Popularity of CG, AR and visual effects

When you look at the Top 20 highest-grossing films of the last decade, the vast majority are loaded with visual effects. It’s what viewers want. But VFX is no longer limited to Hollywood blockbusters. Just look at the success that broadcasters have seen in bringing major events to life in-studio, from breathtaking storms to the recent Mars landing of Perseverance. Beyond the fact that these attention-grabbing segments look cool, visual effects can also help build more informative and relevant news broadcasts.

Studies have shown that TV viewing time continues to decline each year. But we have also seen evidence that viewers would tune in more often if they knew they were going to see immersive effects on their screens. By bringing AR and CG into your broadcasts, you can do everything from putting viewers in the driver’s seat at a Formula E race to taking them to the epicenter of an earthquake. Visual effects aren’t just for film and TV anymore, and broadcasters that don’t embrace these opportunities are certain to drive their viewers elsewhere.

The rise of the virtual studio

Even before everyone was talking about virtual production, digital sets were starting to dominate broadcast and live TV. We have a customer in Latin America that produces eight programs a day, each with a different set, from the same physical space. Switching out the digital set is so quick and easy, it takes place during the ad break at the top of each hour.

Despite the prevalence of virtual sets, we are still in the “Wild West” stage of virtual production. Every broadcaster has a different workflow, so there is no one-size-fits-all solution. The key is flexibility. And as more broadcasters tune in to the power of virtual production, we will start to see a democratization of related tech as new tools are developed that allow teams to produce high-quality content at a faster pace.

Take LED screens, which have become something of a buzzword since their use on The Mandalorian. This tech allows photoreal digital content to appear while filming, dramatically reducing the need for green screens and enabling news teams to create content with better, more informed storytelling at its heart. Most organizations we’ve spoken to can easily imagine the flexibility of a set like this, but right now they are cost-prohibitive for many. As the tech becomes cheaper and more accessible, we will see smaller LED setups and virtual stages that can deliver a similar level of quality, opening up new opportunities for teams of all sizes.

Increasing accessibility of game engines

The enormous impact of Unreal Engine was clear even before the pandemic hit. While the engine may have been designed for games, it allows broadcasters to do so much more on air – not just in terms of incorporating photoreal content, but in how it encourages creativity and enables teams to dream up new solutions to old problems.

Last year Epic showed off a teaser of Unreal Engine 5, which will add greater real-time capabilities and better image quality, making the real and the virtual indistinguishable. Combined with camera tracking tech like Ncam, crews can see exactly how both on-air talent and CG elements are positioned within a virtual space, allowing them to quickly iterate and adjust as needed.

This is a massive industry driver, and the Epic team has made it a focus to support these efforts, including expanding their toolsets and investing heavily in the MegaGrants program, which empowers companies to create the next generation of real-time production tools. Every major graphics provider has already built their own Unreal plugin, because they know if they don’t they’ll be left behind.

COVID-19 and new trends in safety

Prior to the pandemic some industry trends were already underway, including the growth of remote collaboration. But the onset of COVID-19 rapidly changed these from “nice to haves” to “absolute necessities.” Virtual production is next on the list.

The main forces driving the adoption of virtual production were always going to be flexibility and efficiency. COVID-19 simply added more layers, since productions now also need to think about things like reducing the number of people on site and making sure projects run safely, while still being cost effective. It’s a delicate balancing act.

Ultimately, it is impossible to say no to increased flexibility, cost savings and creative freedom. We believe the shift to virtual production workflows was destined to happen; the question was simply: “When, and to what degree?” We have the answer to the first part, which is “now.” The answer to the second part is still to be determined, but those who are willing to experiment and think beyond the old boundaries are the ones who will come out on top.

Arabic networks focus on US election coverage with augmented reality
https://www.newscaststudio.com/2020/11/04/arabic-us-election-augmented-reality/
Wed, 04 Nov 2020 19:50:55 +0000

Augmented reality and immersive environments were the centerpieces of U.S. election night coverage on many Arabic broadcast networks.

MBC Group’s Al Arabiya (العربية) transformed the lake and sky outside of its Media City building in Dubai into a larger-than-life replica of the National Mall from Washington, D.C.

“The sky is not the limit anymore,” notes Fadi Radi, Al Arabiya’s director of creative. “The idea was to create a studio which has Washington, D.C. buildings including the White House and the Congress on the set.”

The network, of course, is no stranger to augmented reality, with an entire internal team dedicated to creating topical elements. This year, its augmented reality footprint doubled from the coverage of the 2016 United States general election, which also used large-format displays.

Flying above the lake was a Robycam robotic camera from Movicom along with two stYpe cranes equipped with camera tracking.

Vizrt tools including the Viz Engine and Viz Virtual Studio drove the graphics, with data coming from the AP via Viz Pilot.

In Doha, Al Jazeera Media Network’s channels, including Al Jazeera English, Arabic, Mubasher and Balkans, also leveraged Vizrt tools powered by the Astucemedia data platform with real-time results.

Along with video wall graphics, Al Jazeera used a virtual set across its networks to break down data with augmented reality graphics layered on top. Ahead of the election, for example, this was used to explain the idea of battleground states to an international audience, while on election night the immersive environment pivoted to exit polls and election returns.

Sky News Arabia (سكاي نيوز عربية‎), meanwhile, mixed virtual environments with virtual set extensions from its Abu Dhabi headquarters.

Powered by Zero Density’s Reality platform and rendered in the Unreal Engine, the network’s virtual studio placed the presenter on the lawn of the White House inside of a glass dome.

Inside of this world, a tablet and touchscreen monitor allowed the presenter to break down data which was also layered in front.

For interviews, a large monitor also appeared in the virtual world.

In the network’s physical studio, the large LED video wall and floor were augmented with election branding.

Case Study: How mixed reality set the tone for TF1’s live 2020 municipal election coverage https://www.newscaststudio.com/2020/10/15/mixed-reality-tf1s-live-2020-municipal-election/ Thu, 15 Oct 2020 14:25:08 +0000 https://www.newscaststudio.com/?p=95915 The Future Group’s mixed-reality platform, Pixotope, enabled French broadcaster TF1 to present innovative and captivating ... Read More

The post Case Study: How mixed reality set the tone for TF1’s live 2020 municipal election coverage appeared first on NewscastStudio.

The Future Group’s mixed-reality platform, Pixotope, enabled French broadcaster TF1 to present innovative and captivating live coverage of the French municipal elections. Imaginative design allowed the implementation of augmented digital scenery without the need for a large or green-screen studio.

One of the unique challenges involved with this setup was that the broadcaster was not shooting in a studio with a green screen, but in a modestly sized newsroom. The creative idea was to make the newsroom appear as if situated next to the local town hall, which was created digitally. To bond the two scenes together, a freely moving, crane-mounted camera seamlessly drifted between newsroom and town hall. Since there was no green screen, the join between the two was generated by Pixotope using matching 3D digital objects and highly accurate camera tracking.

TF1’s Artistic Director Yoann Saillon explains, “Our main objective was to create informative and photo-real digital segments that would seamlessly flow between a real-life environment and a virtual outside environment and give the viewer a new experience. We had yet to exercise the full power of The Future Group’s Pixotope mixed reality platform, which we’d recently acquired from graphics integrator Post Logic, and we felt this live election production would be ideally suited for it.”

TF1 is experienced at incorporating high-quality graphics and mixed reality components into many of its live broadcasts and has previously used Avid’s HDVG system. Lionel Barbier, editorial technical manager from the Artistic Direction for TF1 news, reveals, “We were keen to move to technology that was based on the Unreal game engine for the quality and performance it offers. We found that Pixotope made the perfect bridge between the needs of live broadcasters, like us, and the abilities of Unreal.”

Barbier at TF1 continues, “Our aim was to use this new technology within a newsroom setting to achieve a more genuine feel, whilst still using engaging visuals. As the election is a serious topic, it was imperative that our coverage adhered to a real-world tone. We wanted to avoid using a green screen studio as we felt this would appear to distance the presenter from reality.”

TF1’s designers created an entire town hall scene from photographs, extruded to add depth. Other elements were added to the night-time scene such as street lamps and the floor of the square. The completed scene was then loaded into Pixotope, ready to be rendered out in real-time from the viewpoint of a virtual camera which accurately tracked the position and angle of the newsroom camera. Augmented reality graphic panels were also added to present election details such as information about the candidates. Reflection maps and post-effects such as bloom on bright lights added realism.

The output of the virtual scene needed to be masked where it fell behind foreground objects in the newsroom. The Future Group’s Chief Creative Officer, Øystein Larsen explains how this was achieved without a green screen. “A scene inside Pixotope contains objects that make up a three-dimensional digital world. Some of those objects can be set as “hold out masks” which means they are not imaged by the virtual camera but instead create masks for the rendered output. For TF1, the newsroom pillars, ceiling and floor were accurately measured, modelled, and inserted into the Pixotope scene along with the town hall. When rendered from a viewpoint which matched the newsroom camera, the mask objects ensured the Pixotope scene appeared behind real foreground objects, no matter where the camera was.”
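To make the hold-out mask idea Larsen describes concrete, here is a minimal, hypothetical sketch in Python with NumPy. It is not Pixotope code; it only illustrates the final compositing step, assuming the renderer has already produced a virtual-scene layer, its alpha coverage, and a mask rendered from the measured real objects (pillars, ceiling, floor) for the current camera viewpoint:

```python
import numpy as np

def composite_with_holdout(camera, virtual_rgb, virtual_alpha, holdout_mask):
    """Composite a rendered virtual layer so it appears behind real objects.

    camera        : (H, W, 3) float array -- live camera frame
    virtual_rgb   : (H, W, 3) float array -- rendered virtual scene
    virtual_alpha : (H, W)    float array -- coverage of the virtual layer
    holdout_mask  : (H, W)    float array -- 1 where a measured real object
                    (pillar, ceiling, floor) must stay in front
    """
    # The hold-out mask knocks the virtual layer out wherever a real
    # foreground object occludes it, so the live camera pixels show through.
    alpha = virtual_alpha * (1.0 - holdout_mask)
    alpha = alpha[..., None]  # broadcast over the color channels
    return virtual_rgb * alpha + camera * (1.0 - alpha)
```

In a real system the hold-out mask would be re-rendered every frame from the tracked camera pose, which is what keeps the occlusion correct however the crane moves.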

Another Pixotope feature used to great effect was the ability to “unlock” the virtual camera from the tracked position of the real camera. This enabled TF1 to create seamless shots that began in the newsroom environment, continued into and around the virtual town hall scene, before returning to the newsroom, in one, single, continuous camera move. The effect was as if the camera crane had no physical limits.
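Conceptually, this “unlock” can be thought of as cross-fading between two camera poses: the one reported by the tracker and an authored virtual path. The following is a hypothetical sketch, not Pixotope’s implementation, blending positions linearly and orientations with a normalized quaternion lerp (adequate for small angular steps):

```python
import numpy as np

def blend_camera_pose(tracked_pos, tracked_quat, path_pos, path_quat, t):
    """Blend between the tracked camera pose (t=0) and an authored
    virtual camera path (t=1)."""
    pos = (1.0 - t) * np.asarray(tracked_pos, float) + t * np.asarray(path_pos, float)
    q0 = np.asarray(tracked_quat, float)
    q1 = np.asarray(path_quat, float)
    if np.dot(q0, q1) < 0.0:
        q1 = -q1  # take the shorter rotation arc
    q = (1.0 - t) * q0 + t * q1
    return pos, q / np.linalg.norm(q)  # re-normalize the blended quaternion
```

Ramping `t` from 0 to 1 and back over a shot would reproduce the effect described above: the virtual camera detaches from the physical crane, tours the town hall, and re-locks to the tracked pose before cutting back to the newsroom.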

Pixotope wasn’t only used to create the virtual reality town hall scene, but also the augmented reality information graphics, such as totems and a 3D map with which the presenter interacted to demonstrate the unfolding election results, again impeccably matched to the freely moving newsroom camera.

Emmanuelle Jais, broadcast manager at graphics integrator Post Logic, adds, “This project is a great example of how Pixotope can transform even modest filming settings into large-scale virtual scenes. Producers do not need to have their creativity limited by physical space.”

TF1’s Saillon concludes, “The program was extremely successful, directly watched by 5.3 million people and generating significant social media engagement. We are sincerely grateful to the teams at The Future Group and Post Logic, who were always with us either directly or (due to Covid-19) remotely. Their support helped us achieve our goals and to deliver a dynamic and eye-catching program from a newsroom environment.”

Column: How broadcasters can save money with virtual sets https://www.newscaststudio.com/2020/09/18/saving-money-virtual-sets/ Fri, 18 Sep 2020 11:53:02 +0000 https://www.newscaststudio.com/?p=94777 One of the most important factors for any broadcaster to deal with is constant downward ... Read More

The post Column: How broadcasters can save money with virtual sets appeared first on NewscastStudio.



One of the most important factors for any broadcaster to deal with is constant downward pressure on budgets. As technology advances, everyone throughout an entire broadcast operation is continually tasked to do more with less. And in an industry haunted by the ghost of cord-cutting and looking at the rise of powerful new SVOD players as it restructures itself, any savings that can be made need to be made. It’s often a matter of it being better to choose to do it and have some control over the process than being forced to do it and have none.

This was an important theme across the industry before the coronavirus pandemic hit. It has only become more urgent since.

Luckily, one of the areas where significant savings can be made is the introduction of the virtual set. And one of the reasons virtual sets are starting to appeal to more and more broadcasters is that they have become genuinely photorealistic.

For instance, our technology was used by Sony Innovation Studios to create a virtual version of the ‘Shark Tank’ set for Sony Pictures Television last season, when pressure at the lot in Culver City meant that there wasn’t enough space for its usual two identical stages. The virtual set was used for filming around 100 exit interviews in front of green screen and was indistinguishable from the real-world one, to the extent that one producer in the production truck asked for a plant to be moved next to a couch, assuming they were watching the real-world feed.

As well as making things possible, though, virtual sets make them cheaper. Good sets are expensive items to build and operate, especially as they have to withstand the rigors of examination at increasing resolutions. They need to be designed to fit constrained spaces with an eye on material costs; the design needs to be locked down early in the process; they need to be physically built; they need to be installed; they need to be lit; they need to be struck; they need to be repaired; they need to be stored. And all the time this is happening, the studio or sound stage is blocked from all other use.

The virtual set has none of these drawbacks. We’ve mentioned before one client we work with in Buenos Aires, Argentina, that runs eight different programs a day from exactly the same studio. At the top of the hour one presenter walks out of the studio, another walks in, a button is pressed, and it’s a completely different space.

The cost savings are impressive. To pull the same sort of operation off in a real-world environment, you would need at least two fully crewed studios, and you would need eight physical sets, which even if they shared many elements would still need to be moved into new positions. Redesigning of even one of the sets would be a difficult task as all eight are so closely interwoven. The studios need to be heated, lit, cleaned… you get the picture.

What’s more though is that the virtual studios are also becoming increasingly easy to use as well. One of the UK’s major broadcasters, Sky Sports, is the rights holder for much of the live soccer shown in the country and decided it wanted to set up its own studio set at every single one of its 90 fixtures involving Premier League teams. Using Ncam’s markerless tracking system, they are able to arrive at a stadium, head to their assigned space, and have exactly the same studio design up and running within an hour, irrespective of location.

This is the sort of project that would have been prohibitively expensive in the real world, and, in fact, probably impossible given the different sizes of the suitable spaces at 90 different stadia.

Even for virtual set technology, it would have been difficult only a fairly short time ago. But markerless tracking is making the virtual set much more adaptable and a lot less finicky to use. Set-up times are becoming quicker than ever, virtual set photorealism is set to take a further jump next year when Unreal Engine 5 is introduced, and technologies such as photogrammetric capture are enabling the recreation of real-world spaces inside the virtual world quicker than ever.

There are other benefits to virtual sets as well. Alongside unleashing set designers’ ability to really max out their creative urges, they result in a reduced environmental footprint, allow a broadcaster to quickly and simply create consistent looks across their output, and are adaptive and endlessly iterative.

But these are very much the icing on the virtual cake. When it comes down to it, virtual sets are cheaper than real ones, and that is the bottom-line argument. It just so happens that nowadays they’re better too.

Column: Why 2020 is the year of the virtual studio https://www.newscaststudio.com/2020/06/26/2020-virtual-studio-production/ Fri, 26 Jun 2020 15:44:46 +0000 https://www.newscaststudio.com/?p=94067 It has been a strange year in the broadcast industry for any number of reasons, ... Read More

The post Column: Why 2020 is the year of the virtual studio appeared first on NewscastStudio.



It has been a strange year in the broadcast industry for any number of reasons, most of which are due to the coronavirus. But one of the meta trends that we and a lot of other people have noticed is that the pandemic has accelerated the adoption of several technologies that were already starting to ramp up in an interesting way. Remote production, whether for live events or in post-production, is one of them (Netflix alone is posting somewhere around 200 shows using collaborative remote workflows), and another is the continued rise of the virtual studio.

The virtual studio is a technology whose time has very much come in 2020. It is in many respects one of the socially distanced production tools that has been necessary to keep the industry going, allowing sets to be changed at the click of a button, ensuring staff don’t have to move around any more than strictly necessary, and seamlessly hosting talent both on-set and dialing in from remote locations, even home set-ups.

The coronavirus has played its part, but virtual studios were already firmly on the agenda of many more news organizations than ever before, even before the lockdown orders started to roll out.

Simply put, they let you do more with less. We have one client we work with in Buenos Aires, Argentina, that runs eight different programs a day from exactly the same studio. Changing over is simply a matter of one presenter team walking out, a button being pressed, and another presenter team walking in. It can happen in the commercial break at the top of the hour. You would need at least two fully equipped studios and a lot more labor to replicate that in the physical world, with all the moving of furniture and set elements, repositioning of cameras and so on that is required.

In many ways the traditional physical studio is the epitome of all that was outdated with the CapEx model of broadcast costs: fixed assets sitting in a single location that can only produce one show at a time, with an unpleasant domino effect on costs down the line.

You would also need a proper-sized studio space, replete with lighting rigs and high ceilings. One of the keys to the appeal of the virtual studio is that they can transform rooms not much bigger than a normal office space into a viable space for production. As long as you can accommodate the set elements in the space that the talent needs to work in and the cameras to film it, the rest is very much up to your imagination and the limitations of the technology.

This is a key point, because one of the main drivers behind this new wave of interest in virtual studios is that they are technically better than ever. The main reason for that comes down to two developments: first, extremely effective tracking solutions such as ours that work in a variety of environments with swift calibration; and second, the widespread adoption of Unreal Engine in many leading virtual studio solutions.

Unreal has literally been a game-changer and taken what was effectively a cottage industry making bespoke graphics and hitched it to a multimedia behemoth that is responsible for a good percentage of the top-end graphics in films, games, and television — certainly all the real-time content. It’s the CG equivalent of the move from proprietary black boxes we’ve seen in broadcast as a whole to IT and even COTS hardware across the rest of the industry and allows us to piggyback on developmental work that is taking place worldwide and being driven by a multitude of sectors. The result is that quality has improved massively and prices have fallen dramatically.

The jump from Unreal Engine 3 to the current Unreal Engine 4 delivered close-to-undetectable real-time virtual environments. Indeed, often the only thing that does give them away is the presence of stylized graphic elements associated with advertisers or sponsors. It also did this at a price point — Unreal is effectively free — that made the virtual studio an attractive proposition for Tier 2 and even Tier 3 broadcasters, democratizing the technology for a whole section of the industry that had never used it before.

There is more of this to come, too. The graphics are getting better all the time (Epic has just announced Unreal Engine 5), bringing ever more realistic sets into the range of even the tightest production budget. 2020 could well be the pivot point where the discussion across news organizations and elsewhere is less about whether to use a virtual studio and more about what the reasons not to would be.

Telia uses virtual studio to create an ‘icy’ stadium-like environment https://www.newscaststudio.com/2019/02/22/liiga-hockey-virtual-set/ Fri, 22 Feb 2019 10:30:26 +0000 https://www.newscaststudio.com/?p=79760 Finnish network Telia’s coverage of the Liiga ice hockey league included virtual set technology powered ... Read More

The post Telia uses virtual studio to create an ‘icy’ stadium-like environment appeared first on NewscastStudio.

Finnish network Telia’s coverage of the Liiga ice hockey league included virtual set technology powered by Zero Density that created an environment inspired by ice hockey stadiums.

Zero Density worked with teams from Streamteam and Broadcast Solutions Finland on the virtual studio setup, which is located in Helsinki, Finland.

Dreamwall provided the virtual 3D design for the set, with Zero Density’s Reality software controlling the output and Epic Games’ Unreal Engine rendering the look.

The virtual set is inspired by ice — with an artfully textured floor that mirrors the look of an ice rink — complete with the Liiga league logo “under” the ice.

The space also features blocky columns that could be either internally lit glass or ice — along with a virtual LED ribbon similar to those found in stadiums around the world.

Additional elements include a curved desk as well as virtual video panels that show an oversized, icy version of the Liiga championship trophy. The virtual set also includes both light and dark structural elements mimicking stadium design, with virtual light fixtures on the floor and suspended from a “grid” above.

In addition to the virtual set itself, the look can be “augmented” with graphics that appear to “float” behind the in-studio talent. Many of these graphics feature layers of semitransparent polygons.

Telia covers 450 ice hockey games every season, with the Helsinki production facility serving as a hub that’s linked to 14 venues across the country. 

At any given time, the production hub can broadcast from seven locations simultaneously.

In the virtual studio, Zero Density’s Reality software supplied the photo-realistic virtual studio set, with a Telemetrics robotic and tracking system used to align video signals with graphics.

The hub is equipped with four Reality Engine real-time node-based compositors, which enable post-production-style visual effects during live broadcasts.

In addition to the virtual set, Zero Density also powered a “teleportation” feature where a player was beamed into the studio for a live interview, appearing next to the talent. 

Network takes an already big set and ‘expands’ it in a big way using augmented, immersive mixed reality https://www.newscaststudio.com/2018/12/18/al-arabiya-ar-imr-budget/ Tue, 18 Dec 2018 20:59:42 +0000 https://www.newscaststudio.com/?p=77146 To help explain the new Saudi government budget for 2019, Dubai based Al Arabya decided ... Read More

The post Network takes an already big set and ‘expands’ it in a big way using augmented, immersive mixed reality appeared first on NewscastStudio.

To help explain the new Saudi government budget for 2019, Dubai-based Al Arabiya decided its sprawling, sleek, modern studio wasn’t big enough, so it turned to augmented and immersive mixed reality to go big — really big.

As the segment began, viewers were treated to an impressive view of the studio as the presenter walked around it to a riser in front of a wide video wall.

That’s when things went really big.

The camera pulled back and, with a science-fiction-like “transporting” effect, the area around the video wall and riser magically filled in with what appeared to be a giant atrium and lobby area, with futuristic architecture “inspired by” what buildings might look like in 2030, according to the network.

Al Arabiya leveraged Vizrt software and stYpe RedSpy camera tracking systems to achieve the look.

The shot goes so wide the anchor and the set she was standing on ended up looking almost like a dollhouse in comparison to the huge virtual space. 

The wall of the virtual vertical space was then used to insert key figures from the budget data.

Industry Insights: Augmented reality adoption by broadcasters https://www.newscaststudio.com/2018/08/07/augmented-reality-adoption-by-broadcasters/ Tue, 07 Aug 2018 18:10:28 +0000 https://www.newscaststudio.com/?p=70321 As part of our special Industry Insights series, we recently had a chance to virtually ... Read More

The post Industry Insights: Augmented reality adoption by broadcasters appeared first on NewscastStudio.

As part of our special Industry Insights series, we recently had a chance to virtually convene a roundtable of broadcast solutions providers to discuss augmented and virtual reality in broadcast.

We’ll share more of their responses as part of our Focus on Augmented and Virtual Reality.

This first edition sets the scene and delves into adoption and how AR can help stations generate extra revenue. Future editions talk about workflows, budgets and best practices. 

Where are we on the AR/VR adoption curve?

“We’re definitely on the front end of the adoption curve for VR. This is mainly due to the expense of HMDs (which is coming down), the growing number of fragmented platforms, and the lack of compelling content, which can be costly to produce. All of these factors are contributing to the slow rate of growth in VR. You could make the case that AR is further behind VR,” said Ray Thompson, Director of Broadcast & Media Solutions at Avid. “While social platforms like Snapchat have added some basic levels of AR, and Pokémon Go has helped raise AR’s profile, we primarily see AR being used in broadcast as a way to augment and enhance storytelling for news, weather, and sports.”

“AR/VR has a long history in broadcast, and Brainstorm customers have used it as early as 1995 to display Elections and other complex data in a visually attractive manner. This approach is still valid, and current hardware technology allows for increased usage of such resource to improve storytelling, meaning broadcasters and content providers are looking into more ways to implement AR within their programs,” Miguel Churruca, Brainstorm’s Marketing and Communications Director, told us.

“If 100% adoption curve means that every local market TV station is using AR to the extent of regular graphics such as graphics and lower thirds, then we are 40% there,” said Melanie Crandall from Vizrt.

“We’re there! Everyone I speak to wants it,” responded Full Mental Jacket’s Ronen Lasry.

“I think it’s a little soon to expect mass adoption of AR/VR solutions, but we are seeing progress. The availability of flexible and reliable optical tracking systems like the Mo-Sys StarTracker, together with the recent publicity around the use of the Unreal engine in AR/VR applications, is beginning to have a real impact,” said Olivier Cohen, Senior Product Manager at ChyronHego.

“Leveraging data feeds, we can enable broadcasters to better engage viewers regardless of what platform they’re on and tell more compelling stories with in-studio and on-field graphics that require fewer resources and lower costs. Combined with virtual sets that allow broadcasters to change the look and feel of a set easily and at a lower cost, Avid is helping our customers deliver engaging content,” added Thompson.
