Sports augmented reality news for broadcast professionals | NewscastStudio
https://www.newscaststudio.com/tag/sports-augmented-reality/

Column: 6 reasons to consider using virtual production | https://www.newscaststudio.com/2022/04/28/column-6-reasons-to-use-virtual-production/ | Thu, 28 Apr 2022


Featured Column


The use of virtual production (VP) in media and entertainment is, according to Statista, set to almost triple by 2028, rising by 15% every year until then. Content producers around the world are actively exploring how to leverage modern VP tools and workflows. VP stages are being built across the globe and audiences are responding to new formats and content.

More than just the latest bandwagon to jump on, virtual production is beginning to show its true value to creators, audiences and marketers.

So what are we learning?

Real-time technology enables limitless creativity

The core of virtual production is the combination of live-action footage with real-time visual effects (VFX). Typical use cases are augmented reality (AR), where graphics are overlaid on top of the filmed images, and virtual studio productions, where a green backing color is replaced with a computer-generated studio or location.

The key element here is real-time. Whilst the combination of virtual content and video isn’t new, the real-time application of cinematic-quality elements to live video is. That’s thanks to the development of game engine technology and tools like Pixotope, which allow that technology to be applied in a live event setting. The resulting creative potential for broadcasters is limitless.
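To make the real-time element concrete, the sketch below shows the single operation such a pipeline performs on every frame: alpha-compositing an engine-rendered RGBA graphic over the incoming camera image. It is a minimal illustration only, not any vendor's actual pipeline; the function names and stand-in frames are hypothetical, and a production system would run this on the GPU at broadcast frame rates.

```python
import numpy as np

def composite_ar_overlay(camera_frame: np.ndarray, ar_render: np.ndarray) -> np.ndarray:
    """Alpha-composite a rendered AR layer (RGBA) over a live camera frame (RGB).

    camera_frame: H x W x 3 uint8 frame from the studio camera.
    ar_render:    H x W x 4 uint8 frame from the real-time engine, alpha in channel 3.
    """
    rgb = ar_render[..., :3].astype(np.float32)
    alpha = ar_render[..., 3:4].astype(np.float32) / 255.0   # 0 = transparent, 1 = opaque
    base = camera_frame.astype(np.float32)
    out = rgb * alpha + base * (1.0 - alpha)                  # classic "over" operator
    return out.clip(0, 255).astype(np.uint8)

if __name__ == "__main__":
    # Stand-in frames: a grey "camera" image and an AR layer with a red box in the corner.
    frame = np.full((1080, 1920, 3), 128, dtype=np.uint8)
    ar = np.zeros((1080, 1920, 4), dtype=np.uint8)
    ar[100:400, 100:700, :3] = (220, 30, 30)
    ar[100:400, 100:700, 3] = 200      # mostly opaque graphic
    result = composite_ar_overlay(frame, ar)
    print(result.shape, result.dtype)
```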

One of the areas where this has been most apparent is live sports events, where virtual production is being deployed with true wow factor. Seen as a way to engage new audiences, attract fans to stadiums, and visualize data and information, the use of virtual production techniques for live sporting events is gathering pace.

Live audience engagement

In the NFL, we’ve seen both the Baltimore Ravens and the Carolina Panthers bring supersized versions of their mascots to life. These AR elements respond to live in-game moments or elements of the physical stadium, and have been adapted up until the last second. These adaptations have allowed, for example, the giant raven to wait on the goalposts for the director’s cue, or the menacing big cat to believably pounce on the physical jumbotron. Virtual production here is attracting fans both old and new back into stadiums by adding a new dimension to ‘game day’ and heightening fandom. 

Engaging sponsorship activations

Even corporate sponsorship is being transformed via real-time virtual production. No longer confined to banners around the stadium, corporate partners are actually engaging crowds, in a way never before possible. Take Microsoft’s Halo game landing a spaceship at midfield during halftime at an Oregon Ducks game, or the Houston Texans producing a fan-driven racing game sponsored by retail company Kroger on a mixed-reality track.

Cost effectiveness and sustainability

Virtual studios demonstrate value not only in creative terms but in financial and environmental ones too. Virtual production means that broadcasters of all sizes can create immersive, hyperrealistic narratives without huge budgets. Leveraging the power of a virtual set, users don’t require specialist equipment or costly production stages. Spanish broadcaster VideoReport Canarias, for example, wanted a more resource-friendly and visually stimulating way of producing live TV, and an entirely new show format was created for the daily primetime weather show ‘Una Hora Menos’. Virtual production transformed a restrictive and costly production set into a versatile, multi-camera operation that can effortlessly produce a limitless number of stage scenarios. The result facilitates reporting and creative storytelling without expensive LED displays or constant (and costly) changes to physical stage fixtures and furniture.

Being able to convincingly place presenters in virtual environments brings creative and cost benefits to broadcasts, and it also removes the need for crews to travel there in person. There will always be instances where taking audiences to real-life locations is valuable and necessary, but in many cases it isn’t. Virtual production allows broadcasts to be virtually located anywhere on the globe while remaining cost efficient and environmentally conscious. A sunset can last forever, too!

Brand new formats

Formats that would have been impossible only a few years ago have left the realms of fantasy and taken center stage. ‘Alter Ego’, which aired on Fox, was the first AR talent show in the US. Contestants perform like never before — as their dream avatar. Precise real-time engineering enabled 20 unique AR characters, powered via full motion capture by their human counterparts, to perform in front of a panel of celebrity judges and a 200-strong audience. There’s no doubt that ambitious shows like ‘Alter Ego’ are technically complex to produce. Still, the broadcast and technology community are rising to the challenge, growing in expertise and creating the tools and workflows they need to entertain audiences in completely new ways.

Some believe that virtual production is the future of content production, enabling the creation of the most impactful and informative media in the world. The tools for broadcasters to create and deploy this content, whilst maximizing studio facilities and reducing costs, are now available. Though not without its challenges, virtual production has proven its power to win over audiences, shape new kinds of broadcasts and facilitate more efficient, adaptable and sustainable ways for us to work.

In the next article in this series, we’ll offer a quick guide that will give key pointers for those new to contemporary virtual production techniques. We will be sharing virtual production workflows and camera tracking — all you need to know to get started!

Broadcast Exchange: The merging of film and broadcast storytelling tech | https://www.newscaststudio.com/2022/01/25/merging-of-film-and-broadcast-technology/ | Tue, 25 Jan 2022

From virtual fans in MLB ballparks for Fox Sports to the unique particle animation open at the Super Bowl and even Fox’s digital avatars on “Alter Ego,” Silver Spoon is at the forefront of live virtual production.

Silver Spoon brings visual effects tools often found in film production, such as motion capture and real-time animation, to broadcast.

Laura Herzing, executive producer at Silver Spoon, joins the Broadcast Exchange to talk about the merging of film and TV production techniques, the rise of extended reality storytelling and the industry-wide impact of the Unreal Engine.

She also gives us an update on some recent projects and an upcoming launch at Silver Spoon. 

Listen here or on your favorite platform below.

 


Listen and Subscribe

Video: YouTube

Audio: Apple Podcasts | Spotify | TuneIn | Pocket Casts | Amazon Music



Transcript

The below transcript has been lightly edited for clarity.

Dak: Welcome to the Broadcast Exchange from NewscastStudio. I’m your host, Dak Dillon. On the Exchange, we talk with those leading the future of broadcast design, technology and content. Today I’m joined by Laura Herzing of Silver Spoon.

Silver Spoon is at the forefront of design and technology, creating unique extended reality, and augmented reality experiences for CBS Sports, Fox and other broadcasters.

Dak: Thank you for joining me today. I suspect many listeners will not be familiar with Silver Spoon, but many will have seen your work. You have worked on some entertainment programs, and you’ve done a lot recently in the sports industry. So help introduce us to Silver Spoon and what you all do.

Laura: So in a nutshell, Silver Spoon creates real-time augmented reality and extended reality (XR) content through Unreal Engine. We got our start about six or seven years ago as a motion capture studio that was primarily serving other VFX studios in the commercial and entertainment industries. Through our work in motion capture, we became a very early adopter of Unreal and quickly realized the power this program had, not just for real-time preview, but for final pixel as well.

In the last few years, we’ve invested heavily in our real-time pipeline, building a pipeline for using Unreal and real-time graphics in the final execution. At this point we’ve really expanded our services to offer everything from creative content development to integration and operation for real-time AR and virtual production across a very broad range of industries, including broadcast.

Dak: Yeah, you brought up the Unreal Engine. It is crazy how much it has swallowed up the industry in the past two or three or four years, where it was maybe one company using it and now suddenly it’s become the standard bearer. How does it fit into your overall growth, and where do you see it going from here?

Laura: Unreal Engine has been integral to the growth of our company. It’s really something we saw so much potential in, even when we started using it for that limited use; we were really just using it for real-time preview during motion capture. But as the program has become so much more advanced and offered easier integration of everything from realistic-looking humans to visual effects and motion graphics, it has really expanded the capabilities it can offer, and I think we’ve expanded the capabilities of the work we’re doing right along with it. So the growth patterns of the Unreal Engine and Silver Spoon have been very parallel.

Dak: Are you already taking advantage of Unreal Engine’s MetaHumans?

Laura: We have, yes. I think real-time, realistic digital doubles are something there’s just so much interest in. Being able to utilize a tool like MetaHumans to create generic digital doubles, and more specifically real-time humans that look like a specific person, is something we’ve seen so much interest in, and we’re actively exploring it.

Dak: Yeah. With Keanu Reeves talking about it around The Matrix and some of these new advances, it’s crazy that in the future, is the anchor you’re watching on the nightly news real, are they synthetic, or will we ever even know?

In terms of the pipeline, do you see anybody catching Unreal, or have they just kind of run away with the game?

Laura: For our uses, I think Unreal is really the only option.

I know that Unity is well utilized in some similar industries, but not specifically in what we’re doing; I think it has more of a foothold in virtual reality. But for real-time broadcast AR and for virtual production, we’re really seeing that Unreal is the clear leader.

Dak: One of the interesting things I’ve thought about with Silver Spoon is that you started on the visual effects side and have now moved into these other parts of broadcast, and that’s creating a unique synergy that not a lot of other firms have. Maybe they’re doing broadcast design work, or they’re doing virtual work, but you’re really bringing a lot of different design disciplines together in one package.

How do you think the industry is changing in terms of its thinking around these different disciplines and the way they interact?

Laura: I think it’s, it’s becoming so much more widely accepted to utilize real-time production tools and virtual production tools.

I think before the pandemic, honestly, it was really something that I think that these tools. Relegated to, big budget films and big name, producers and production uh, broadcasters. And now it’s something that I think the pandemic kind of forced the hand on a lot of producers and broadcasters to try something different and to be open to different ways of creating their content and has made Unreal, more accessible and more.

Widely accepted. So I think if you think about like, You know, traditional motion design that was really limited to like a 2D graphic on screen. Now those graphics can be fully 3D. They can be AR and really feel like they’re living in the real world rather than just being applied on top of the screen.

If you think about visual effects that, you know, traditionally you shoot on green screen and then you’d be doing very heavy compositing and putting all of your visual effects and environments and in post. Using a virtual production pipeline. You can get so much more of that in camera. So just having so much more of these tools readily available, not just for the highest end productions, but for more broadcasters, even commercial production, we’re seeing huge interest in it.

Things where we’re, you know, maybe even just doing a one or two day shoot that can still utilize a lot of these technologies.

Dak: Has it taken a lot to convince your clients that it’s okay to do these things in camera and not rely as much on post-production?

Laura: Yeah, with any new technology, I think there’s a little bit of a learning curve and a time of uncertainty before it’s fully adopted, whether that’s for the client or for the people on set, the DPs, who may be unsure of using this workflow or uncertain of how exactly it can benefit them. But we’ve found that the more we’re able to just show it, go through the process and have everybody see how it works, the more the acceptance and adoption comes naturally out of that.

I think it’s just getting over that initial hurdle of “this is different, we’re not sure how it’s going to work,” and finding the right clients who are willing to just go for it, be an early adopter and try something that maybe they’re not a hundred percent familiar with. Those are the types of projects that help push that acceptance for other brands and broadcasters who maybe need to see that, okay, this has been done, let’s try it on our stuff.

Dak: Yeah, you mentioned early adopters. Last year at the Super Bowl, you helped CBS Sports do some motion capture, and that’s not really been done at that level in national sports; it’s usually reserved for movies or something very specific and calculated. In this case, you’re merging those different disciplines.

Laura: Yeah, the Super Bowl last year was an awesome project. We got to really elevate the visual design using AR graphics, real-time particle simulation, camera fly-throughs and things that just hadn’t been done before. And CBS has really been an awesome partner in working with us to do AR graphics, not only in their live in-venue broadcast but also bringing AR into their studio shows as well. Virtual sets are very commonplace now in sports broadcasting.

Last year we also had the opportunity to work with CBS again on NCAA March Madness and actually bring these little real-time animated AR characters into the studio, just to add another dimension to their studio broadcast for March Madness. Those types of projects, when you find the right clients and the right broadcasters who are willing to push the envelope and try something new, can be really great and can really add to the viewing experience.

I think our goal with AR is always to enhance, and not take away from, the intent of the viewing experience or the game that you’re watching.

Dak: Many of these technologies have usually been led by sports productions, which bring them to the forefront first before they filter their way down to news or other types of broadcasts.

Laura: Yeah, sports are well-funded and fast-paced, and their broadcasts have a relatively short lifespan. So sports broadcasts have always been tech-forward and more inclined to be early adopters and try something new. And because of the nature of their broadcasts, I think they feel a little more empowered to try things that other broadcasters or other platforms might see as risky.

A great example of that is what we did during the pandemic with Fox Sports for their virtual crowds. That was a completely unprecedented use of AR technology and real-time camera tracking in live broadcast sports, to meet such a specific need that nobody could have seen coming. But when it was here, and we had that opportunity and really that challenge in front of us, Fox was the network that stepped up and said, yeah, let’s try this, let’s see how it works. And that’s led to innovation in a lot of other areas of AR broadcasting and sports broadcasting; I think it really broke the ice on the possibilities for a lot of different areas.

Dak: So where does something like the virtual fans lead to?

Laura: I think the possibilities are endless. Directly out of that, we’ve kept working in sports; I already talked about the Super Bowl and March Madness, and I think both of those projects grew directly out of what we did with Fox and the AR crowds.

That work also brought Silver Spoon specifically into the entertainment industry, doing studio shows like Fox’s “Alter Ego,” which just finished airing. It was the first real-time AR singing competition: all of the characters are animated in real time, there are real people backstage driving them through live motion capture, and then they’re composited onto the stage in real time. It’s all happening in camera. That is a natural progression from where we started with the AR crowds, taking it to a much more artistic execution, but all the principles of how we did it are the same; we just had to take it to the next level.

Dak: Talking about “Alter Ego” a little more specifically, tell us what goes into that production to allow it to happen each week, because it’s not like “The Masked Singer,” where they just have to wear a costume.

Laura: In a workflow like this, pre-production is so important, because like you said, it’s not just putting on a costume. You’re creating these custom avatars, and the costumes they’re wearing need to be created ahead of time.

What you actually saw air was just a small fraction of the wardrobe we had built for these avatars, because you don’t know from week to week which one of them is going to advance. So planning and pre-production is a huge part of it. And then having time on site to really integrate systems is so important, especially since AR and motion capture are not part of a typical broadcast pipeline. In fully integrating into the setup for a broadcast TV show, we’re moving very quickly, the shoot days are close together and there’s not a lot of time to change things on the fly. Having the workflows worked out and prepared ahead of time allowed us to meet that demand and keep up with that rigorous reality-show shooting schedule. It was important for everybody to be on the same page with this from the start, and also for Fox as a network to be open to adding this whole new dimension to their workflow and working through the stumbling blocks that are naturally there when you’re putting something completely new in your package.

Dak: For a show like that, are the textures pre-baked in for these avatars, or is it something that’s picking up the lighting from the environment they’re in? How does that work?

Laura: The textures are baked in. We had a whole team of wardrobe artists creating the clothing the avatars are wearing, so the avatars that are in engine are fully textured, and then the lighting is happening in real time. As we were working with Lulu AR, the vendor who handled the AR composite, they had a DMX operator controlling the virtual lighting to match the real-world lighting, and all of that is happening in real time.

So for every performance and every interview, the lighting scenario is unique, and it’s being controlled by an operator who’s making the virtual world realistically integrate with the practical world that we’re seeing on stage.
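As a rough illustration of the bridge Herzing describes, and not a description of the actual “Alter Ego” setup, the hypothetical sketch below maps one fixture’s incoming DMX channel values onto a virtual light’s intensity and color; the fixture layout, names and data shapes are assumptions made purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class VirtualLight:
    name: str
    intensity: float = 0.0          # 0.0 - 1.0, multiplied into the engine light
    color: tuple = (1.0, 1.0, 1.0)  # normalized RGB

def apply_dmx_to_virtual_light(dmx_frame: list[int], base_address: int, light: VirtualLight) -> VirtualLight:
    """Mirror one fixture's DMX channels (dimmer, R, G, B) onto a virtual light.

    dmx_frame:    full 512-channel DMX universe as 0-255 integers.
    base_address: 1-based DMX start address of the fixture being mirrored.
    """
    dimmer, r, g, b = dmx_frame[base_address - 1 : base_address + 3]
    light.intensity = dimmer / 255.0
    light.color = (r / 255.0, g / 255.0, b / 255.0)
    return light

if __name__ == "__main__":
    universe = [0] * 512
    universe[0:4] = [191, 255, 180, 120]   # warm key light at ~75% on address 1
    key = apply_dmx_to_virtual_light(universe, 1, VirtualLight("virtual_key"))
    print(key)
```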

Dak: I had another interview recently about one of the studios for the Olympics, and they brought up that they think in the future there will be a kind of virtual lighting designer assigned to all of these types of projects, because in the past the scenic designer, whoever was modeling, would just decide what the lighting was going to look like. But now, for that cohesion and realism, you have to layer it in so carefully.

Laura: Yeah, a hundred percent. I think that’s an essential role if you’re working with a mixture of practical and virtual light. Just like you need a lighting designer to do the practical lighting on your set, you need a lighting designer to do your virtual lighting and to make sure those two systems are working in tandem and are cohesive, so that whatever AR you’re trying to put into the real world looks naturally integrated. Otherwise you’re going to end up with a result the eye catches, and that breaks the illusion, unless the two are matched very closely.

Dak: Where do you think this all takes us? It’s been a very fast progression, and now we’re seeing it in the entertainment sphere on these quick-turn shows. Where does this lead us?

How does this impact future storytelling?

Laura: think you’ll see a lot more of it. Reality, I think you’ll see a lot more of it in scripted television. You know, the idea of kind of having your real life, your in-person life and your metaverse life or your online life, or, you know, that that’s an idea that’s becoming so commonplace and so mainstream, and, and I think that we’ll see a lot more of that type of storytelling in in narrative.

Films and movies. So I think that the production of that lends itself very well to what we’re doing. We’re, we’re basically creating that metaverse in that, on that, on line persona. So yeah, I think that, that, that’s definitely something you’ll see. I think too, like you were saying at the top of the call, you know, having virtual.

People and avatars in places where you might expect that it would be a human whether that’s a host or a newscaster. I mean, we’ve seen a bunch of virtual influencers already, you know, take hold and really build fan bases. So I think we’re not that far away from that where we have either realistic or stylized, you know, we have the avatars who are kind of coming into our lives in and are accepted in ways.

Humans would have been, or, or, you know, are in those roles previous.

Dak: Yeah. I mean, It’s a topic for a different day, but obviously when you start synthesizing real people, you know, it raises a lot of ethical things, but that’s, you know, that’s, that’s for the journalism folks to figure out, not in this.

Laura: Just say we’re talking about stylized avatars only. We’re not talking deep fakes

Dak: What obstacles do you see right now that are still there? Is it compute power? Is it the fact that you can’t get an Nvidia card?

Laura: Very tactically, yes, supply chain issues are a challenge. It’s difficult to get some of the hardware that we, or anybody, would need to do this.

But those are solvable problems that will resolve themselves. The biggest stumbling block I see is just acceptance, and those walls are starting to come down. The more successful, prominent uses of real-time technology you see, the more it’s going to cascade into interest not just in broadcast but in so many different industries: stage and theater, experiential installations, museums, music videos. There are so many different ways this technology is used and can continue to be used that once that challenge of acceptance is overcome, the floodgates will open.

Dak: So where are you looking for inspiration today as you prepare these future experiences?

Laura: That’s a good question. I think it’s variation can really be found everywhere. I mean, seeing what other people are doing with the technology. I love watching just trying to watch TV or videos or movies and, and figure out how each shot was made.

I think anybody. Is in production or isn’t in, in the industry, probably struggles with that. Or you get a little caught up in the details. So I think just, you know, looking around and taking inspiration from what other folks are doing. It is always a great place to start and finding personally, sometimes I kind of just like to walk away from it too.

You know, I, I find that sometimes the best ideas come when I’m like out for a walk or cooking or doing something completely unrelated to screens and real-time content and the industry. But giving your mind and yourself the space to kind of absorb everything that, that you’re taking in all day.

That comes you come back to your work with a new purpose and with a fresh mind that new ideas can kind of emerge when you step away from it, especially now more than ever, you know, I think we all need to get outside a little bit more.

Dak: What’s next for Silver Spoon? Where are you all headed?

Laura: We are really looking to the future. We’ve had tremendous growth over the last couple of years, and like I’ve been saying, there are so many different industries where real-time content and these technologies are applicable. But specifically, we are just about to open our XR stage in Nyack, New York.

We are building a volume of our own, taking all of our learnings in virtual production and creating a space where we can shoot and continue to R&D projects. We’re also getting focused, like I mentioned earlier, on large-scale installations, whether that’s projection mapping or using LEDs, taking real-time content out of the screen and into real life, into venues and experiential spaces where people can interact with things in real time and get in person the experience we’ve seen on screen. Hopefully being able to do things in real life and in person becomes more open and available to us; we keep saying that, but I think we’ll be right there and ready when that time comes.

And then we’re continuing to expand our footprint in broadcast, entertainment and film. We’ve had some really awesome partners in both the sports and entertainment worlds who have accepted the challenges of doing things for the first time, but also reaped some of the benefits of being early adopters and able to try things first. So again, as we get over that hurdle of acceptance, those industries are just wide open for us.

Dak: On the live event side, there’s obviously going to be a lot of pent-up demand that will eventually come cascading back to a lot of these types of venues and settings. It’s just a question of when; in the UK they said we have six more years ahead of us, so hopefully that’s not the case for the pandemic world. Well, thank you so much for taking some time out of your day to talk about where you all are headed and what you’re up to. We’ll be on the lookout for more.

Laura: Awesome. Thank you so much.

Dak: Thanks for joining us today on the Broadcast Exchange.

How ESTV uses augmented reality to engage streaming viewers | https://www.newscaststudio.com/2022/01/19/augmented-reality-esports-arti/ | Wed, 19 Jan 2022


Partner Content



Much of the content ESTV streams is already virtual — given that it covers esports — but the company quickly realized it also needed to enhance the look of its programs when its creators are on screen.

Augmented reality (AR) was the logical choice, not only because it puts ESTV in line with “big league” broadcasters, but also because it brings the look and feel its audience is used to seeing in the gaming experience alongside program hosts.

“There are so many different content creators around the world. Whether it’s YouTube, Twitch or everybody else, they’re all just streaming and talking and playing games on their consoles or PC,” noted Eric Yoon, CEO and founder of ESTV.

“But if you can embed a couple of unique AR images or provide visibility into the gameplay while maintaining a connection with both the audience and user, that’s where you have the engagement coming from. So it takes you from square one to the 10th floor,” said Yoon.

Ultimate Gaming League (UGL) show on ESTV displaying augmented reality logo.

This is especially true in an online world like YouTube where users scroll through thumbnails of dozens upon dozens of content options. In these cases, having AR elements pop up in the previews can really help the content stand out, grab attention, and make it more likely that a viewer will explore your feed.

Yoon notes that ESTV hosts are from around the world and often broadcast from living rooms and bedrooms, and therefore don’t have access to full studio setups.

Arti’s cloud-based AR platform provides ESTV the flexibility and tools they need to enhance their storytelling with AR, just like traditional sports broadcasters.

Because Arti’s AR solution runs on consumer-grade smartphones, tablets and laptops and doesn’t require any specialized camera hardware or tracking system, it was the right choice for enabling ESTV hosts to bring AR elements into their content.

Arti also doesn’t require a team of people back in a control room or production truck, so it’s ideal for solo video content creators. A presenter can pull up a full augmented reality experience as they need it and directly from devices they have on-hand.

This capability is key for giving ESTV’s program creators the power to illustrate what they are talking about by showing it on screen instead of just describing it.

“Traditional linear TV represents a one-way experience — just basically shooting content for a viewer,” said Yoon. “If you fail to interact with your user and a viewer, you’re not going to get to the next level.”

The world of esports is, like traditional sports, heavily data-driven, so displaying stats in the form of AR charts and graphs is a great opportunity for hosts to show number breakdowns rather than reading out long lists of numbers. All of these can be built in a matter of minutes with Arti, thanks to built-in templating tools.
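As an illustration of the kind of chart being described, and not Arti’s actual templating API, the sketch below turns a small set of hypothetical match stats into a 3D bar chart with matplotlib; an AR platform would render an equivalent object directly into the host’s scene.

```python
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401  (registers the 3D projection)

# Hypothetical per-player stats a host might want to break down on screen.
players = ["Ace", "Blitz", "Cipher", "Drift"]
kills_per_map = {
    "Map 1": [14, 9, 11, 7],
    "Map 2": [10, 12, 8, 13],
    "Map 3": [16, 7, 10, 9],
}

fig = plt.figure(figsize=(7, 5))
ax = fig.add_subplot(projection="3d")

for row, (map_name, kills) in enumerate(kills_per_map.items()):
    xs = np.arange(len(players))
    ys = np.full(len(players), row)
    ax.bar3d(xs, ys, np.zeros(len(players)), dx=0.6, dy=0.6, dz=kills)

ax.set_xticks(range(len(players)))
ax.set_xticklabels(players)
ax.set_yticks(range(len(kills_per_map)))
ax.set_yticklabels(list(kills_per_map))
ax.set_zlabel("Kills")
ax.set_title("Kills per player, per map")

# A broadcast AR platform would place an equivalent 3D object in the scene;
# here we simply save a rendered image of the chart.
plt.savefig("kills_breakdown.png", dpi=150)
```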

“It’s very easy to use and only requires a mobile app,” noted Yoon. The company has worked with Arti to deploy the technology to hosts in multiple locations and the short learning curve makes it easier for them to dive in and get started, particularly given that many of them are already savvy with apps and navigating 3D environments.

An ESTV host gestures to a sponsor’s website displayed in AR.

Given that most program hosts only have single-camera setups with minimal professional grade lighting, it was also important for the network to find a way to display screen content alongside hosts.

The solution? Floating AR “screens” created with Arti that appear over the host’s shoulder and provide them the opportunity to discuss specific players, characters or other gameplay elements while still remaining on screen and just gesturing to the virtual elements.

Similarly, Arti and augmented reality are also starting to play a key role in revenue generation, allowing program hosts to pull up website screenshots, product photos and sponsor logos, all while remaining on screen, boosting the same personal connection with viewers that makes host-read sponsorships on podcasts and similar content so effective.

Looking toward the future, ESTV and Arti are already exploring ways to enhance “first person” events such as drone racing and cycling events that blend real and virtual competitors with AR.

Arti also sees its technology as a way for established sports operations to enhance their multiplatform content, particularly on social media.

“The reality is that you’re increasingly seeing broadcasters and sports operations filling their social feeds and YouTube channels with unique video content produced by individuals that are acting more like content creators than traditional broadcast,” said Avner Vilan, co-founder and CEO of Arti. “It’s getting harder and harder to draw the line and tell the difference between them.”

Creating content using tools such as Arti’s streamlines this process, allowing organizations of all sizes to create even more engaging video content using AR.
 

The above column is sponsor-generated content from Arti.

NFL returns to Nickelodeon with slime-filled NFL Super Wild Card game | https://www.newscaststudio.com/2022/01/11/nfl-returns-to-nickelodeon-with-slime-filled-nfl-super-wild-card-game/ | Tue, 11 Jan 2022

Following last year’s successful first outing, CBS Sports is again bringing an NFL playoff game to Nickelodeon.

This weekend’s game between the San Francisco 49ers and Dallas Cowboys will feature the upgraded effects that give the game a unique, kid-friendly look.

Augmented reality overlays, including a faux blimp, will appear throughout the game, along with unique graphics on the field that will “interact” with players after key moments.

The game is one of six Super Wild Card matchups between Saturday and Monday.

CBS Sports’ play-by-play announcer Noah Eagle, former NFL player and “CBS Mornings” co-host Nate Burleson and Nick’s Gabrielle Nevaeh Green will call the game for the second consecutive year. Young Dylan makes his debut as the game’s sideline reporter.

The game on Nickelodeon is produced by CBS Sports in association with Nickelodeon Productions.

Cloud production opens up new frontiers for broadcast graphics | https://www.newscaststudio.com/2021/08/11/cloud-production-opens-up-new-frontiers-for-broadcast-graphics/ | Thu, 12 Aug 2021


Partner Content


Broadcast is about creating amazing content, not worrying about server racks, software upgrades and rendering bottlenecks. That’s where the cloud comes in. 

The cloud opens up new workflows for broadcast production, allowing tasks that once required specialized equipment to become a seamless part of the newsgathering and transmission processes. 

“The cloud is not important by itself,” said Yaron Zakai-Or, CEO of Arti, a cloud-based AR platform. “The question is, how do you make broadcasters’ lives, and their viewers’ lives, better through the cloud? Broadcasters shine with content. The cloud is important because it allows broadcasters to streamline production and get the same level of service, or even better, so they can focus on great storytelling and creating amazing content for their viewers, rather than the mechanics of delivering that content.”

Three flavors of the cloud

Today, most broadcast facilities rely on on-premise servers or one or multiple versions of “the cloud” to produce content, namely: private, hybrid, and public cloud. 

The public cloud comprises cloud service providers like Amazon Web Services, Oracle Cloud, Google Cloud, and Microsoft Azure, which allow you to dynamically scale your virtual computing power and space on-the-fly. With on-premise, servers are housed at the broadcaster, requiring space, redundancy and upkeep. Private cloud provides additional control and customization vs. public, but therefore requires the same IT staffing, management, and maintenance as an on-prem datacenter. Hybrid, meanwhile, is a mix of both on-premise and public cloud. 

Up until recently, with motion graphics creation and playout specifically, it’s been almost entirely on-premise, requiring specialized server racks that can render elements in real-time for air. 

“If you think about traditional 3D motion graphic systems, then you need to have strong servers. You need to have graphic cards from a specific brand. You need to have software that’s installed and maintained on an ongoing basis,” said Zakai-Or. “The challenge many independent broadcasters and others face with those systems is: How do you build redundancy for those servers? How do you build an infrastructure with scalability? It typically requires a big IT ops team with specialized know-how.”

These systems often come with complex, multi-year licenses and require regular upgrades, while not allowing for scalability.

“Once you focus on software-first solutions that run on the public cloud, all of those problems are eliminated because you don’t have any servers to worry about. Someone else has that expertise and they manage that for you,” said Zakai-Or.

On the staffing side, broadcast graphics systems also typically require specialized staff for creation. Even when a producer can make a graphic themselves, it’s often locked to particular templates with limited creative freedom.

How can the cloud empower broadcasters to tell better stories?

Cloud-based services have brought capabilities once reserved for special events, like a major sporting event or election, to everyday production. Graphics like augmented reality can now be inserted into any story, from an update on home prices, to the weekend weather outlook. 

With a lower barrier to entry, cloud-based graphic creation aims to be as simple as Google Slides, allowing producers and journalists to visualize a complex story quickly. This democratization of toolsets allows broadcasters to enhance storytelling with a solution that any staffer in the production pipeline can use. 
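As a sketch of what that kind of template-driven simplicity can look like under the hood, the snippet below defines a hypothetical graphic template and fills it with producer-supplied values to produce a render-ready payload; the field names and JSON shape are invented for illustration and do not describe any specific product.

```python
import json

# A hypothetical, minimal graphic template: a designer defines it once,
# and a producer only supplies the fields below before it goes to the renderer.
TEMPLATE = {
    "type": "bar_chart_3d",
    "title": "",
    "unit": "",
    "series": [],          # list of {"label": str, "value": float}
    "brand_palette": "newsroom_default",
}

def fill_template(title: str, unit: str, series: list[dict]) -> str:
    """Return a render-ready JSON payload a cloud graphics service could accept."""
    graphic = dict(TEMPLATE, title=title, unit=unit, series=series)
    return json.dumps(graphic, indent=2)

if __name__ == "__main__":
    payload = fill_template(
        "Median home prices, year over year",
        "% change",
        [{"label": "2019", "value": 3.1},
         {"label": "2020", "value": 8.4},
         {"label": "2021", "value": 16.9}],
    )
    print(payload)
```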

And by eliminating the need for on-site hardware, cloud graphics solutions deliver new flexibility for graphics-rich, live field reporting. 

The power of software-first augmented reality

The one hang-up in moving towards a software-first workflow has been camera tracking, which helps orient 3D graphics in physical space. 

Traditionally, this has required a system attached to a studio camera that would map the space and coordinate where to render graphics, requiring yet another server rack. 

When you think software-first, however, you rethink how camera tracking can interface with the cloud.

With Arti, for example, a paper marker similar to a QR code is used to orient space and align the augmented reality graphics. 
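Arti’s exact implementation isn’t described here, but as a rough illustration of how a printed marker can stand in for hardware tracking, the sketch below recovers a camera pose from the marker’s four detected corner points with OpenCV’s solvePnP. The marker size, camera intrinsics and corner pixel values are made-up stand-ins, and corner detection itself (for example with an ArUco-style detector) is assumed to have happened already.

```python
import numpy as np
import cv2

# Physical size of the printed marker, in meters (assumed).
MARKER_SIZE = 0.30

# 3D corner positions of the marker in its own coordinate frame (z = 0 plane).
OBJECT_POINTS = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
], dtype=np.float32)

def camera_pose_from_marker(image_corners, camera_matrix, dist_coeffs):
    """Recover camera rotation/translation relative to the marker.

    image_corners: 4x2 pixel coordinates of the detected marker corners,
                   in the same order as OBJECT_POINTS.
    """
    ok, rvec, tvec = cv2.solvePnP(OBJECT_POINTS, np.asarray(image_corners, dtype=np.float32),
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("pose estimation failed")
    rotation, _ = cv2.Rodrigues(rvec)   # 3x3 rotation matrix
    return rotation, tvec               # enough to place AR graphics in the camera's view

if __name__ == "__main__":
    # Dummy intrinsics and corner pixels purely to exercise the function.
    K = np.array([[1400.0, 0, 960], [0, 1400.0, 540], [0, 0, 1]])
    corners = [[900, 480], [1020, 482], [1018, 600], [898, 598]]
    R, t = camera_pose_from_marker(corners, K, np.zeros(5))
    print("camera translation relative to marker (m):", t.ravel())
```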

This marker-based approach allows for some impressive results, such as on this summer’s SRX Racing series on CBS Sports, where augmented reality was used to create real-time telemetry dashboards on race cars moving at more than 100 miles per hour, with no specialized camera tracking needed.

“We are 100% software-based. So, that gives us the ability to take the camera feed in the cloud,” said Zakai-Or. 

Without the need for a physical connection to the camera hardware, the options are limitless, allowing feeds delivered over connections like RTMP or NDI to utilize cloud graphics.
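To make that concrete, here is a minimal, hypothetical example of pushing a feed to a cloud ingest point over RTMP with ffmpeg; the ingest URL and source file are placeholders, and a real deployment would capture from a camera device and follow the provider’s documented settings.

```python
import subprocess

# Hypothetical ingest URL for a cloud graphics service; replace with the real endpoint.
RTMP_INGEST_URL = "rtmp://ingest.example.com/live/streamkey"

def push_camera_feed(source: str = "studio_camera.mp4") -> None:
    """Push a local video source to a cloud ingest point over RTMP using ffmpeg.

    In a real setup the source would be a capture device (webcam or SDI capture
    card) rather than a file; '-re' paces the file like a live feed.
    """
    cmd = [
        "ffmpeg", "-re", "-i", source,
        "-c:v", "libx264", "-preset", "veryfast", "-tune", "zerolatency",
        "-c:a", "aac", "-b:a", "128k",
        "-f", "flv", RTMP_INGEST_URL,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    push_camera_feed()
```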

The cloud, however, is still in its early days for broadcasters, with many relying on a mix of on-premise, hybrid and public cloud solutions.

The key, of course, is to create great content. With solutions like Arti’s cloud-based AR platform, broadcasters can explore software-first graphics creation with a low barrier to entry, without requiring specialized training or staffing.  

Thanks to cloud technology, with Arti, broadcasters can be up and running in 30 minutes, bringing augmented reality to any broadcast.

 

The above column is sponsor-generated content from Arti.

Mo-Sys and Hyper Studios partner on sports-focused solution | https://www.newscaststudio.com/2021/03/14/mo-sys-and-hyper-studios-partner-on-sports-focused-solution/ | Sun, 14 Mar 2021

Mo-Sys Engineering and Hyper Studios have partnered on a production system featuring data-driven sports graphics, branded as StarTracker Sports Studio.

The solution includes a complete production system for sports broadcasting with a virtual studio rendered in Unreal Engine, AR graphics and HTML5 sports graphics.

“We have engaged with many sports broadcasters in order to capture the full spectrum of graphics functionality and workflows that they need now and going forwards, in order to cover major events,” said Michael Geissler, CEO of Mo-Sys.

“Working with Hyper Studios, we have designed a system that allows a sports broadcaster to create integrated and in-context virtual graphics content, and as a result produce more engaging and differentiated content for their viewers.”

In a release, Mo-Sys notes many broadcasters typically use two systems when using data-driven graphics in virtual production. This solution simplifies that process and enables the in-context design of graphics.

Easy, flexible cloud-based AR for broadcasters | https://www.newscaststudio.com/2021/03/10/easy-flexible-cloud-based-ar-for-broadcasters/ | Wed, 10 Mar 2021


Partner Content


Like so many broadcasters, eNCA has recognized the value of enhancing its data reporting and visualization with augmented reality — but is able to create and air segments without specialized equipment or devoting hours to design and development.

eNCA, which is an English language channel in South Africa, is leveraging cloud-based augmented reality technology from Arti to streamline its AR workflow and get stories on air faster and more efficiently — and live.

“It really allows us to produce AR stories really quickly, on a day-to-day basis rather than having to spend days and weeks preparing something to go to air,” said Michael Marillier, a data reporter and feature producer for eNCA.

“I know a lot of international broadcasters will only do AR when it’s prerecorded,” he added. “There’s still a lot of fear and a lot of apprehension around going live with AR and we, as eNCA, just try to embrace it as much as possible.”

Marillier himself is directly responsible for authoring the mixed reality segments and doesn’t have to lean on a typical team of designers.

This lets him respond to the news cycle and quickly prepare an immersive segment that showcases relevant data using engaging AR graphics.

Making all of this possible is Arti’s cloud-based AR platform that lets content creators leverage the power of data graphics and augmented reality from desktop, tablet and mobile devices, without the need for any costly in-studio hardware.

Example of Arti AR Platform where 3D objects can be prepared for use on-air.

The platform lets him pull in 3D objects from most standardized sources, while also providing built-in tools to generate 3D graphs and charts from a variety of data sources, almost like a Google Slides or PowerPoint for AR. The platform also supports integrating with real-time data and live social media feeds.

When crafting a segment, Marillier is able to manage everything from camera placement to timelines prior to air.

One of Arti’s biggest advantages, however, is that this all works without any additional equipment — there’s no tracking hardware, special rigs or special computer hardware to buy.

Talent video feeds can be from virtually any source, including consumer-grade mobile devices. This also opens the possibility for AR segments to be produced from nearly any location — including everyday offices and even home environments or outdoors.

The Arti platform has also been even more extensible than eNCA and Marillier originally thought.

Initially, he pictured having to tote a laptop into the studio and needing to jump back and forth to tweak framing and object positioning — but after using the system live for the first time, they’ve switched to doing everything from their phones, including scaling, positioning and adapting pitch.

Arti has provided invaluable support and has a clear focus on making the platform easy to use, said Marillier.

Despite its simplicity, Arti works just as well for broadcasters with more advanced facilities.

In fact, Marillier has begun combining augmented reality with real views of the eNCA studio sets, including its video wall — which holds the distinction of being the largest in South Africa.

Video walls, for example, tend to be better for presenting hard data, or “number stories,” while augmented reality elements can highlight visual points made in the story.

“What we’ve actually been doing the last week or two is trying to integrate the two,” Marillier explained.

For example, a recent segment examining location data collected by Google during the coronavirus-related lockdowns in South Africa was produced in front of the video wall, which was able to both “set the scene” with background imagery, as well as showcase more traditional charts.

Meanwhile, Arti’s 3D bar charting capabilities allowed an animated graphic to appear “next to” Marillier, while a 3D lock and key and car element also “sat” next to him to illustrate the concept of residents staying at home versus being out and about.

Not only does the combination of Arti and video wall technology help present information in the best way possible, but it also gives eNCA the flexibility to easily change up visuals to keep viewers engaged.

“You don’t want to go on air and have viewers say ‘Oh, okay, it’s the graphics guy again, and he’s going to be standing in front of a wall’,” Marillier said.

Example of the graphic library in Arti’s AR Platform.

Instead, his goal is to surprise viewers by showing them “this week my combination of tools is a little bit different than it was last week.”

Arti also gives eNCA the opportunity to grow even more — and Marillier sees the opportunity to start producing segments outdoors as something he’s excited to try out in the future.

“I don’t think viewers would really expect a reporter to be coming from a live scene and be able to bring up a whole lot of stats that are relevant to what’s going on at the live scene,” he noted while pondering the possibilities for live field reports with mixed reality blended in.

Viewers have already started to take notice, too, especially on social media.

When the video gets shared, Marillier has noticed one of the most common responses is, “How did you do it?”

“The tech seems a little bit like magic to some viewers. So I think when viewers are asking that, then you’re doing something right, because you’re giving them something new, something they haven’t seen before.”

Learn more about Arti and how you can leverage its augmented reality platform for your organization today.

 

The above column is sponsor-generated content from Arti.

Super Bowl LV on CBS to feature 120 cameras, AR graphics | https://www.newscaststudio.com/2021/01/29/super-bowl-lv-on-cbs-to-feature-120-cameras-ar-graphics/ | Fri, 29 Jan 2021

Super Bowl LV, Feb. 7 in Tampa, will feature over 120 different camera angles on CBS along with a bevy of new specialized camera rigs.

The event will be the 21st Super Bowl for CBS Sports.

This year’s event, unlike last year’s on Fox, will not be available in 4K due to production limitations brought about by the COVID-19 pandemic.

The overall production will also continue the remote production model CBS Sports has used all season long for its NFL coverage, with replay operators working from home while other editors, graphics operators and show production personnel work at the CBS Broadcast Center in New York City.

Features of the broadcast include:

Augmented Reality – This year’s graphics package, which is expected to debut the new CBS Sports logo on-air, will unify the image of CBS with Super Bowl LV’s sea-and-sand motif using four augmented reality-enabled cameras. Animations, in-game graphics and augmented reality will continue to enhance the viewer experience. Utilizing the power of the Unreal Engine, embers and particles will light up the Tampa night during America’s most-watched event with detail never before seen on sports television.

Trolley Cam –  The Trolley Cam can speed from one end of the stadium to the other, ziplining along a wire and positioned to provide the viewing angle of a fan in the eighth row of the stands. The rig can travel up to 65 mph and will provide a look at the players from a vantage point not used in the history of the Super Bowl.

Venice Cameras – CBS will use several on-field cameras to capture a dramatic cinematographic feel. Two Sony Venice cameras will be used live at a Super Bowl for the first time. The Venice, which is normally used for cinema-style applications like commercials and movies, will shoot with a full-frame imager and a shallow depth of field to give the coverage a unique look. The cameras, provided by Inertia Unlimited, will be operated on a traditional Steadicam rig, as well as a MOVI rig, to capture the flair of the action in what has been described as a 3D video game look.

Movie Bird Crane – A 53-foot MovieBird crane, traditionally reserved for major motion pictures and television productions, will be located on the upper concourse to provide shots of the ‘Super Bowl Today’ pregame set and game action, as well as serving as one of the many augmented reality-encoded cameras strategically placed throughout the stadium.

New Angles – Twelve cameras with 4K and 8K capabilities will be scattered throughout the stadium, allowing the production team to extract close-up shots at key moments of the game. The 4K cameras will be controlled robotically from high up on the stadium concourse levels, while two Sony 8K cameras will be fixed on robotic gimbals slung at the stadium’s lower field level. While past Super Bowls deployed 8K camera technology high up in the stadium infrastructure, CBS will debut this angle from near field height for a unique view of the field.

By the numbers:

  • 120 cameras total
  • 18 robotic cameras
  • 32 cameras embedded in endzone pylons
  • 2 wireless pylon cameras along sidelines
  • 19 television mobile units
  • 3 Sky Cam / Fly Cams
  • 1 Trolley Cam
  • 1 Movie Bird camera
  • 25 super slow motion angles
  • 12 cameras with 4K and 8K production capabilities

CBS has experimented with other unique coverage during the 2020 NFL season including a broadcast on Nickelodeon.

Portugal’s Eleven Sports updates virtual set for Champions League coverage | https://www.newscaststudio.com/2020/08/26/portugals-eleven-sports-adds-virtual-set-for-champions-league-coverage/ | Wed, 26 Aug 2020

Portugal’s Eleven Sports has updated its UEFA Champions League design and presentation.

The network, part of a multinational sports television group, debuted the new look on August 3, 2020, with the help of wTVision. The change coincided with a larger network rebranding and design update.

The network’s overall insert graphics were adapted to fit the unique needs of the Champions League coverage and to layer in new augmented reality elements while also incorporating real-time data.

wTVision’s Studio CG platform drives the graphics, with rendering from the company’s R3 Space Engine. The virtual set was also refreshed, with taping taking place at wTVision’s facility in Lisbon.

Q&A: Disguise sees immersive presentations at forefront of home engagement | https://www.newscaststudio.com/2020/08/12/disguise-view-on-the-state-of-the-broadcast-world/ | Wed, 12 Aug 2020

Disguise’s Tom Rockhill recently shared with NewscastStudio his thoughts on the state of the broadcast industry and how augmented and immersive reality will help drive its future.

What was the state of the broadcast industry before the coronavirus crisis commenced?

Prior to the outbreak, there was, and still is, a seemingly limitless capability to what can be achieved in broadcast thanks to advances in technology.

The outbreak has obviously brought a big part of the live events industry to a standstill, but that has not stopped our resilient industry from seeking out new and innovative solutions that go above and beyond what was previously thought possible in broadcasting, using elements of AR, MR and xR (extended reality).

Creating new levels of engagement is a top priority for a huge number of providers broadcasting live events, and there were prime examples of this prior to the pandemic, including the opening ceremony of the Pakistan Super League, which broke a world record for the longest AR broadcast in history and was viewed by more than 50 million people around the globe, reflecting how this next step of fan and viewer engagement is picking up pace within broadcast.

How have you seen the coronavirus pandemic affect the broadcast industry?

Broadcast live events are arguably among the worst-hit parts of the industry, as travel restrictions and strict social distancing measures have meant the cancellation of global sporting events, festivals and televised award ceremonies.

This really has thrown a wrench into hundreds of hours of planned programming for the foreseeable future and has left broadcasters in a very difficult position.

With a big question mark around how long travel restrictions and lockdown measures will remain in place, this has of course also affected broadcasters’ advertising revenue, leaving them no choice but to seek ways of generating engaging programming on smaller budgets whilst continuing to cover this ever-developing story.

How has Disguise been navigating the effects of coronavirus? Have you done anything to help broadcasters during this time?

We recognize that it is an extremely hard time for our industry at the moment. We listened to demands and launched a series of initiatives that are supporting our community of freelancers, partners and customers across the globe. This saw the release of our latest software update, r17.2, which enhances remote working, and includes additional tools for creatives in the broadcast world to devise new concepts to connect with audiences at home, or online with Augmented Reality and 360 video.

In addition, we launched the Disguise OnDemand hub, offering free and unlimited access to virtual training workshops in multiple languages on a daily basis. We are also offering a free six-month license to our popular ‘designer’ software, used to visualize, design, and sequence projects, from concept through to showtime.

Disguise technology has been at the forefront of some groundbreaking projects in broadcasting previously, has there been a demand for more projects like this during the pandemic, or are broadcasters playing it safe?

We are definitely seeing a peak in interest for technologies capable of creating at-home engagement and reaching remote audiences with elements of AR and xR.

With millions of people confined largely to their homes and consumption of various types of content at an all-time high, this is truly a perfect time for broadcasters to try their hand with experiential technology.

Innovations like Disguise’s xR (Extended Reality) workflow shift the production process by allowing the creation of fully immersive experiences and advanced technical opportunities due to developing green screen technology. xR’s virtual set extension places presenters or performers in expansive environments significantly bigger than the available studio environment – opening up endless opportunities, especially in a time when staffing, as well as space, is restricted.

A great example of this during lockdown is the first-of-its-kind performance by popstar Katy Perry on the American Idol virtual season finale. xR technology immersed the singer in a completely virtual world that reacted around her as she performed her new single in a simple LED studio set in Los Angeles.

Change paves the path for innovation, and by utilizing experiential technology like xR, more compelling broadcasts are delivered. These new technologies will prove invaluable to many industries through these times and in the future.

Any advice for broadcasters for the foreseeable future?

With millions of people at home, who may have to return to this lockdown lifestyle depending on how the situation evolves and whether there is a second peak of the pandemic, broadcasters have to be more open to alternative ways of engaging audiences and taking home viewing experiences to the next level.

There are varying assumptions and opinions about how the world will look and behave after this crisis, and whether we can ever go back to ‘the way things were’.

But perhaps we should consider evolving into a better way of working – a more flexible, more technologically advanced, more immersive and engaging way of operating that also future-proofs broadcasting against whatever arises. It is also highly likely that creating immersive content with experiential technology will play a much bigger part in the new ‘normal’ of broadcasting.
