NETINT Technologies about Solving the Multiview Streaming Problem - Voices of Video with Rob Koenen


Welcome to the NETINT Voices of Video. Today's episode is about VR and multi-view, a technology that lets your viewers choose among different available feeds from the same or different events and create their own immersive viewing experience. We're talking today with Rob Koenen, founder and Chief Business Officer of Tiledmedia, which develops and supplies VR and multi-view technologies to publishers like Sky Sports, BT Sport, and LG U+. If Rob's name sounds familiar, it should: Rob has worked with codecs going back to the start of H.264, when he founded and served as the president of the MPEG Industry Forum. He also founded and served as the first president of the VR Industry Forum, and has initiated and guided work on MPEG-I, the standard for immersive media in MPEG. Rob, you're a busy guy. Thanks for joining us.

A pleasure. Thank you, Jan, for having us.

So why don't you walk us through your background real quickly: take us through your education, take us through your job history, and end up at Tiledmedia.

I don't want to make it too long, but I started in electrical engineering and studied information theory, and actually a little bit of expert systems, things that are coming back now in the form of AI. My first real job was at KPN, which is the Dutch incumbent telco, doing a lot of video coding research, basically image communication research. I then moved to InterTrust in the US, in Silicon Valley, where I worked on DRM systems. Then it was back to the Netherlands, where I worked for TNO, which is the largest national research institute, again doing media research. And then in 2016 we showed some of the technology we had developed at IBC. It was called tiled streaming, and we applied it to VR, which was a hype at that point. Long story short, we decided to set up our own company, spin it out of TNO, and founded Tiledmedia to provide tiled streaming services for
virtual reality.

What's the high-level vision for the company?

The vision is that we want to have the most advanced streaming in the world, and we think we have the most advanced audiovisual player in the world. It's all founded on tiled streaming, which means that we see video as composed of a number of tiles, and I hope we get to talk about this a little bit, but all of these tiles together can make up a single video. That could be a VR video of insane resolution, or it could be the multi-view that we will get to talk about, where we compose what the customer sees out of all sorts of tiny little elements. And, since this is a technical podcast: we push them all through a single decoder.

So tell us about VR; tell us about multi-view.

VR was a huge promise seven years ago. It's still a promise; it has not grown as fast as everybody would have hoped, including us, but it's healthy and alive, and we've done amazing things in VR. We've done the Premier League with two customers; we still do with Sky. Sky has a couple of Premier League football matches every week, and it's an amazing user experience. But VR is a lot of things; we focus on VR video, and that is still, I would say, nascent. I think it's the bridge from making VR something for gamers to making it something for a larger audience, but it's slowly growing, and we're doing interesting projects.

How did that evolve into multi-view?

Well, by the same token, we saw that VR wasn't growing as fast as we had always hoped, and we saw that the basis of the tiled streaming technology actually wasn't VR until we decided to apply it to VR. So we went back to basics and said there's more that we can do with this. For one thing, we saw video consumption change; we saw the need for increasing interactivity and engagement; and we decided to build a multi-view product, which was about two years ago, I
think. It's now about ready. It has taken a bit of time, but it's interesting, and it was kind of prescient, because if you see what's coming out now in the market, Apple is starting to do multi-view, YouTube is starting to do multi-view. So it seems the timing is right.

Why don't you show us what multi-view is, what it looks like?

Yes. I will need to do something slightly complicated, which is move the camera to my iPad, because screen sharing doesn't work for this, so I'll try to do it on a real iPad. This is streaming video: you see one stream here, and what we have here is what we call a 25-item multi-view clip, or experience. The first thing you can see is that we've got all these moving thumbnails; they're not stills, they're actually moving. I can switch to any of these immediately, and this is actually streaming, right? It's not a demo; it streams from our CDN account. But I can do other interesting things, like drag one in, or I can change the main video by clicking on a thumbnail. And what actually happens here is interesting, because it actually changes what is being retrieved from the network. I can go a little bit overboard and add a thing, so you could imagine that these are cameras in a race, or in a golf event. This is kind of a busy background, so let me give it a bit of a simpler background. There's a lot of flexibility here: you can do a grid, and these could be the channels you normally watch. This is what I wanted to show you; I think it gives an impression of what it does.

The early adopters: are they doing this for multiple views of a single event, or are they doing it to enable different views of different events? What's the application?

We're just rolling this out, but the most interest right now is from people that have multiple cameras. It's like there are
these events, and it could be racing, it could be, like I mentioned, golf, it could be athletics, where there's stuff happening and the director is forced to make a choice, but there's stuff happening elsewhere. You may have a favorite athlete; you want to follow them; you want to be able to switch to them. But you also don't want to lose the story that the director tells you, because directors have amazing tools at their fingertips to tell stories. Sometimes you just have a favorite athlete that the director isn't following, and you want to keep track of them. This is where the technology shines. The other example, where we see this deployed by other parties, is the NFL's Sunday Ticket, where with YouTube you can get something like four matches in a certain fixed configuration in a grid. So there's also the case where you want to watch multiple matches at the same time; you could imagine a tennis tournament. But the most interest we are seeing right now is for these events that inherently have a number of cameras, a number of views, and the director just can't show you everything that you might want to see.

What were we looking at? Break the technology down into the encoder, streaming, and player sides of all that.

I will show the ways that this could be done in principle, and the way that we do it. There are a couple of approaches. The first is many encodes, which basically means you encode all the possible permutations, and YouTube does something like this. But take an event like Formula 1: Formula 1 already broadcasts the onboard cameras; they have something like 24 feeds. You can see that the number of permutations and combinations is going to be completely insane. The viewer chooses a feed and they're stuck with it, so they can't customize it. Right now, in that app, they can choose either a
cockpit camera or the director's cut; they can't choose both at the same time. If you wanted to use the many-encodes approach for this, it would be completely impossible. Another approach is what we call the many-decoders approach: you encode all of the videos individually at a number of different ABR levels and resolutions, and then you use the fact that most devices (not all, but most) have more than one decoder available. And that's precisely the issue with this approach: it's hard to do across devices. You have to adapt to all of the individual devices; sometimes you have to use software decoding, which quickly eats the battery. What's also an issue is that syncing is a complete nightmare: keeping all of these decoders in sync. And there's ABR fighting between all of these different decoders, so you see them switch up and down in quality level. So that's also not a very good approach.

Who's using this approach? You mentioned YouTube is using the first approach. I know Apple is in this space; what are they doing?

Apple is doing this, indeed; that's our understanding, and that's why it's only working on Apple devices. And then yet another approach is cloud edge processing, which basically means you do the interaction on a server, and it requires a separate server per customer. So if you've got an event that attracts millions of users, this is insanely expensive; it's very hard to scale. And depending on how you do it, the interaction is slow. I know there are ways of speeding it up, but even if you can make it fast enough, it's still very difficult to scale.

So this obviously sets it up for our approach, which we call mosaic multi-view. Basically, this uses tiled encoding, or actually tiled decoding. What happens here is that we take all the videos at the resolution that they
are on the screen, and then we create a single frame from them, which goes through the single hardware decoder on the device. There's some logic in the player: the player knows the resolution at which each video appears on the screen, so it only retrieves it at that resolution, and no bandwidth is being wasted. The nice thing about this is that the interaction is completely local. Everything you saw me do on the screen just now is local composition with videos that are being decoded, but as soon as I switch something, it actually switches the retrieval. It's all HTTP streaming, so it actually switches what's being retrieved from the CDN. In a nutshell, that's it.

Let's keep this up, just for the sake of the discussion. What does this look like on the encoding side? I'm bringing in, in this case, I guess, five separate streams, and creating five separate encoding ladders. Is that correct?

Yeah. In this case it's five; it actually goes up to 10.
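The retrieval logic just described can be sketched in a few lines. This is a minimal illustration, not Tiledmedia's actual API: the ABR ladder, tile names, and layout below are assumptions invented for the example. The point is that each feed is fetched at the smallest encoded rendition that covers its on-screen size, so total bandwidth tracks what is actually displayed.

```python
# Hypothetical ABR ladder, sorted ascending: (width, height, bitrate_kbps).
LADDER = [(320, 180, 300), (640, 360, 800), (1280, 720, 2500), (1920, 1080, 5000)]

def pick_rendition(disp_w, disp_h):
    """Cheapest rendition at least as large as the on-screen size."""
    for w, h, kbps in LADDER:
        if w >= disp_w and h >= disp_h:
            return (w, h, kbps)
    return LADDER[-1]  # on-screen region larger than the top rung

def plan_retrieval(layout):
    """layout maps tile id -> on-screen (w, h); returns per-tile plan and total kbps."""
    plan = {tile: pick_rendition(w, h) for tile, (w, h) in layout.items()}
    return plan, sum(kbps for _, _, kbps in plan.values())

# One main view plus four moving thumbnails, as in the iPad demo.
layout = {"main": (1920, 1080), "cam2": (320, 180), "cam3": (320, 180),
          "cam4": (320, 180), "cam5": (320, 180)}
plan, total_kbps = plan_retrieval(layout)  # main at the top rung, thumbnails at the bottom

# Promoting a thumbnail to the main slot only changes which renditions are
# fetched; the decode path stays a single bitstream on one hardware decoder.
layout["main"], layout["cam2"] = layout["cam2"], layout["main"]
plan_after, _ = plan_retrieval(layout)
```

In this toy layout the total is one full-resolution feed plus four thumbnail-sized fetches, which matches the "small increment over 1080p" characterization later in the conversation.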
However many cameras you get in, that's how many encoding ladders will be encoded and written to a CDN origin server. On the encoding side there are a few modifications that allow us to do the merging of the bitstream client-side. What we're actually doing is retrieving snippets of bitstreams and then rewriting those snippets into a single bitstream. It's almost like recombining DNA, and you're feeding that single bitstream through the hardware decoder in the device, which is why it's not just bandwidth-efficient, it's also power-efficient.

It feels like you're doing a lot of hard work in the player. What are the CPU requirements for making this happen?

It's not that bad. There is hard work, but it's all relative. In the player there's a module that defines the strategy, that determines what videos to retrieve at what bitrates and at what resolutions, and then there's a little bit of bitstream rewriting going on, but that's not all that CPU-intensive. And the decoding itself again uses the hardware decoder on the device, so it's not a software process. We have a customer in India, where there are some pretty low-end devices, and we've done this for VR there without much of a problem.

What's the installed base of available decoders for this? If I'm a publisher and I want to implement this, take me through it. I care about smart TVs, I care about mobile, I care about computer playback. Where does this work, and where does it not work?

This relies on the tiling capabilities of HEVC, or H.265. But H.265 is pretty universally deployed, certainly in mobile devices, and also in TVs and set-top boxes, etc. And, interestingly for us, it's starting to be pretty universally deployed even in web browsers. We've never been able to address the web, but we're now planning to have a web player SDK available at the end of the year, which will be a huge step for us. It's very interesting. But HEVC, H.265, from my perspective, is universally deployed, which is amazing, because I wouldn't have been able to build my business without it.

So I need HEVC decode. Is that hardware or software?

It could be either, but in this case we always end up using the hardware decoder, because it's so universally available.

So what does this look like on a smart TV? Is it HTML5, so it just works, or do I need a driver or some kind of player for each TV set?

We haven't done a lot of smart TV work yet, and it does require a fairly deep level of integration with the platform, so we're taking them one at a time. But as you suggest, with HEVC decoding becoming available in browsers, that also opens up new avenues to support TV platforms. And as far as set-top boxes go, there's a lot of Android out there now, which we have deployed to; that's very straightforward, I would say.

Do you have any use cases of this actually working in the field, or trials or anything you can talk about, even if you don't identify the publisher?

We've done interesting tests (I have to be really careful now) with some 20 feeds, live, with something that could really benefit from multiple cameras, and it worked without a hitch. It was really encouraging, and I hope to be able to deploy it soon and then talk about it.

What does the player side look like? Because if I'm a publisher, if I'm Sunday night TV, I guess I'm
assuming that people are watching their TV for their main feed, so maybe TVs aren't as important as I think they are. But, you know, the US Open is top of mind today because it's in process, and people see it as a second-screen device.

What we have also shown is casting, and it works really nicely, because if you want to do interactive video, there's nothing like a mobile device to do the interaction. A TV is very hard to interact with seamlessly. The thing I just did, dragging videos around, putting them in my favorite spots, enlarging them, shrinking them, is very hard to do with a remote control, even if you have something like LG's smarter Magic Remote. But what does work is casting, Chromecast or AirPlay, and it works really nicely. And obviously the grid I showed is more amenable to interaction on TV sets, where you just select, click, and enlarge, and maybe have a few picture-in-pictures available in fixed positions.

So what does this look like if I'm the publisher? Going back to the tennis tournament: typically, for television, I'm integrating all my cameras into a single feed. So at some point I guess I need to create different feeds and feed them through different encoders. Take it from there.

Every event that we have done, be it multi-view or VR, basically uses the same pattern. People do a mezzanine encode of their camera or their feed; they use SRT to send it to our cloud platform; we do the transcoding, and we egress it to their CDN origin server. It's a very standard workflow; it's very straightforward. When we did this with the event I was talking about, it was only a matter of hours: our partner made the streams available in AWS, we picked them up, and we were up and running. So as long as you have the individual
camera feeds available, and many, many organizations do, it's very straightforward. And it's all HEVC encode.

What does it look like in the cloud? Are you using software or hardware transcoders?

We are using software in the cloud. It's interesting; let me take the example of the Beijing games, where we did 8K VR180, which is a mind-boggling resolution. It was higher than any current headset can display. In this case, let me get this correct, we had two cameras, and for VR we do a number of different GOP versions, because we need to be able to switch on every single tile. Per camera we had three GOP versions, short, intermediate, and long, and then we had the ABR levels. So in the end there is one single manager that manages literally 900 parallel HEVC encodes.

Wow.

And it spawns them, waits for them to complete, then collects the results and does clever packaging. We use a clever packaging scheme because, if you have your headset on and you move around, you will retrieve different tiles from the sphere, and you have to be really quick. So we package things so that adjacent tiles are likely to be available at the CDN edge cache if you move your head around. But to answer your question: there's just an enormous number of tiny encoders. Tiles could be 480 by 480, or 640 by 640, or maybe even a bit larger, just churning away at doing encodes, and then packaging on top of that, and then writing into a CDN origin server.

And then the playback side is a player that they would download and install?

Yes. They would have our SDK player, which is pretty much a complete AV player, at the heart of an app. The application provides a nice user experience, our AV player is at the heart of it, and it handles everything: which tiles do I need to retrieve, all the retrieval, all the rewriting. And,
interestingly, in the case of multi-view, you can do some very interesting ABR strategies. Say my bandwidth is dropping. With normal video, you can only drop the quality of the entire video. Here you could say: okay, bandwidth is too low, I'll prioritize the main feed; or maybe I should use stills for the thumbnails now and not retrieve the thumbnail videos. You can do all sorts of clever strategies; it almost opens up a whole world of new ABR research.

And you're saying that this is all HEVC at this point?

Yes, it's all HEVC, correct, because it relies on the tile structure.

The tile structure you talked about: is that carried through in VVC as well? Is it going to be kept alive?

It's even easier to implement this if you use VVC, though it will be a long while before VVC is as widely deployed as HEVC. But yes, no question. We participated in MPEG for a long time, and we contributed some of our thinking, so people were already well aware of tiled encoding and all the flexibility that was required for this, and there was a lot of VR work going on in MPEG. So yes, it's in VVC as well. It's a little harder in AV1, because there are some parameters that you can only set at the global level, whereas in HEVC you can set them at the tile level, which makes it easier to do this in HEVC.

Talk about the role of standards in this market. If I'm the NFL, do I go to one vendor and not really care if it works widely, or is there an opportunity to work together within standards, as you've done so successfully in the past?

Let me say that everything we do is built from standards. It's all standard HEVC encoding, and all of our streams are individually decodable. That's a little harder for VR, but if I look at the multi-view stuff, any of these individual feeds are
just playable, in ffplay or in VLC, no problem. The packaging is your standard MP4; we contributed some things to MP4 as well to make it efficient, certainly in the case of VR, but it's all standard MP4. And then all the retrieval is just standard HTTP retrieval, byte-range requests, which, by the way, is what makes it so scalable. So, as I think I said before, we wouldn't have been able to build our company if it hadn't been for standards, and notably the HEVC standard being so widely supported in all devices.

What's the commercial side of this? If I'm a publisher, what do I spend money on?

It depends, obviously, but basically you spend your money on a technology license. You get our cross-platform SDK, you can build your apps around it, and then, depending on what you want to do, we will deploy our transcoders in your cloud account, or we will use our cloud transcoders, mostly for VR. For multi-view, we're seeking to cooperate, and are actually already cooperating, with encoding vendors to make their encoders tiling-enabled; there's one here and one coming. Again, I can't name names. VR encoding is really hard, so we want to control it: we want to either do it in our cloud platform, or control what we deploy, for instance to your AWS account. But for multi-view it's a lot simpler, and it's easy to enable third-party encoding vendors to make their encoders tiling-enabled. All of the retrieval magic and all of the sync magic (by the way, we didn't really touch on it, but it's all frame-accurately synced, which we obviously need for VR, and we carried that over to multi-view) stay with us; most of our magic is in the client side.

The license: is that going to relate to the number of players? How is that cost going to scale?

We are too young to give you a standard answer, but in
general, we would like to grow with the success of our customers. We seek a minimum level of engagement, because otherwise we can't pay our developers, and beyond that we seek to grow with the success of our customers. That could be active devices, or minutes streamed; those are some of the parameters.

So convince me that this is a market that's going to prosper. I mean, Apple's in it, YouTube's in it, and Apple wants to sell hardware. What has Apple shown us that tells us this market is going to be quite dynamic going forward?

What Apple and Google are telling us is that this is something that people want. It's clear that especially younger generations have a much more dynamic way of interacting with content, with video. People used to be a fan of a sports club; now they are a fan of a player, and they want to follow that player, maybe. That's changing, so I think it's a really good fit for a more interactive and immersive way of consuming content, and we're starting to see some real traction for it. So I'm confident it will catch on, and yes, YouTube and Apple are cases in point. They've started to do it, albeit in a very inflexible and, may I call it, a little bit primitive way. We think we have a better solution, obviously, but it's being trialed and it's being used, and, interestingly, it's being criticized for its lack of flexibility. I heard Dan Rayburn on a podcast recently saying about NFL Sunday Ticket that it only had certain configurations, which was a shame.

Okay, we've got a question in, and now's a good time for it. What are the bandwidth implications of this approach? Are you sending X times the bits to the viewer, or is it still, you know, a 1080p-type stream?

Basically, the bandwidth is the sum of all the videos that you're retrieving, but we're retrieving them at the resolution that
they're being displayed at. So if you see a thumbnail, it's not being retrieved at HD resolution. If I switch (remember, I did the switch between the thumbnail and the main video), it actually switches what's being retrieved. And we learned how to switch really fast, because we had to do this for VR: if you move your head in VR, you don't want to wait a second for a high-resolution tile, so we did a lot of switching-time optimization. But we only retrieve what you see, at the resolution at which you see it; that's how I usually explain it. So obviously, if I do an HD video and a PIP, the PIP will increase the bandwidth, but it doesn't increase it unnecessarily. And our software also takes screen sizes, etc., into account, so it doesn't retrieve unnecessarily high-resolution content.

So it's not X streams times 1080p; it's some small increment over the 1080p.

Yeah. And thumbnails are packaged together cleverly, so there are, say, four thumbnails in one small video, in one tile. It's tricks like this that we use to optimize the bandwidth.

What's the typical resolution of the main display? Is this 1080p stuff, or is it mostly 4K at this point?

No, it's 1080p, or maybe even lower if you want, 720p, but typically I think 1080p is a good one.

A question: what about the Apple headset? Is that something for Tiledmedia as well? What do you know about that?

I applied for one, for the SDK. The good thing is that our stuff works on all Apple platforms, and it also has standard hardware decoders. Interestingly, all of our headset work relies on Unity, and there's a collaboration with Unity there, so I think, and I hope, that we should be able to port our VR stuff to the Apple headset pretty quickly. And the stuff that we did in Beijing, which a production partner produced in 8K 180.
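Before moving on, the multi-view ABR idea from a few answers back (when bandwidth drops, protect the main feed and sacrifice thumbnails first) can be sketched as a greedy allocation. This is an illustrative sketch only; the ladder values and priority scheme are assumptions, not Tiledmedia's actual strategy.

```python
# Greedy multi-view bitrate allocation: every feed first gets the bottom rung,
# then the remaining budget upgrades feeds in priority order (main feed first).
LADDER_KBPS = [300, 800, 2500, 5000]  # hypothetical per-feed rungs, ascending

def allocate(feeds, budget_kbps):
    """feeds: list of (name, priority); a lower priority value is more important."""
    alloc = {name: LADDER_KBPS[0] for name, _ in feeds}
    spent = sum(alloc.values())
    if spent > budget_kbps:
        # Not enough for minimum quality everywhere: drop least-important feeds
        # (e.g. replace thumbnails with stills, as mentioned above).
        for name, _ in sorted(feeds, key=lambda f: f[1], reverse=True):
            if spent <= budget_kbps:
                break
            spent -= alloc.pop(name)
        return alloc
    for name, _ in sorted(feeds, key=lambda f: f[1]):  # most important first
        for rung in LADDER_KBPS[1:]:
            step = rung - alloc[name]
            if spent + step > budget_kbps:
                break
            alloc[name], spent = rung, spent + step
    return alloc

feeds = [("main", 0), ("thumb1", 1), ("thumb2", 1), ("thumb3", 1)]
rich = allocate(feeds, 6000)   # main climbs to the top rung; thumbnails stay low
tight = allocate(feeds, 1000)  # a thumbnail is dropped before main loses out
```

Because each feed is an independent HTTP fetch, this kind of per-feed degradation is a purely client-side policy decision, which is the "new world of ABR research" the conversation alludes to.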
That 8K 180 is, interestingly, exactly the resolution you need for the Apple headset; it's about 14.5K 360-equivalent. It's interesting. But what Apple also does, if you allow me a few more words: they really have a focus on this multi-view paradigm. If you look at the interface, it's sort of a multi-view paradigm, so I think it would be an amazing fit, and it would be an amazing experience, to watch a golf tournament, or maybe a tennis tournament with all these different matches going on, and be able to switch between them in a headset, where you get all these virtual displays at your disposal.

What about the middle ground, augmented reality? Where does this fit into that, or are they totally different things?

We focus on video. If you look at the Apple headset specifically, the Vision Pro, it's also an augmented reality device, because you have this see-through thing: the environment is there, and you can just put virtual screens in your environment. It's a good match.

Getting back to the VR market, which is kind of where you started: is that ever going to happen, and why has it been so slow?

It's a longer cycle than people think. First, headset adoption; second, doing a good production and making a good user experience; and, related to this, having a broad enough audience. It's a virtuous cycle, but it's happening only very slowly. Only if there are enough people to reach will it make sense to do a production, and only if you do a good production will enough people go to the headset and start watching games in VR. And again, I'm focusing on video. So it's a cycle that just takes some time to get launched, but it's happening, and what's encouraging is that there's a lot of development in headsets and headset quality. And the things that we enabled (we didn't do them ourselves; we enabled them with our streaming technology) in Beijing will look amazing in
a Vision Pro, because of the higher resolution than the current headsets have. Totally mind-boggling. And if I see how much people already appreciate today what Sky does with its Premier League matches, then I'm convinced that this will fly. And then you add the social bit, because you can go to the match and talk to your friends, also in a headset, even if they're remote. I'm convinced that this will fly; we just want it to be deployed faster, so we thought we'd work on multi-view as well, before VR becomes the big hit that it will eventually be.

You tend to talk about sports almost exclusively, for both VR and multi-view. What are the follow-on types of productions that you see working well with these technologies?

Sports is probably one, two, and three, but there's also interest in music. No, that was not even intended. But there is also music: we have one customer looking at multi-view for music education, which is really nice. They record an orchestra from different angles, and then people can play along, and they can switch views, etc. But there's also pop music. And another one is just multi-channel interfaces. Right now, if you have an EPG, it may have pictures; if you're lucky, the pictures are relevant, and if you're very lucky, they're also right, in the sense that it's what's actually on TV at that moment. Now, if you make an EPG with multi-view, you will just show what's on the channel now. You can make a grid of your favorite channels; it's completely customizable, by the way. So I think it has an application in EPGs as well. So sports, music, EPGs, and maybe training: just filming things from different angles and being able to switch to a real close-up, but also zoom out. But sports is still one, two, and three, not
just because that's what people want to interact with, but also because there's a lot of money being spent on sports.

Is there any magic on the audio side, or is that just, you know, one feed?

That's another good question. Audio, obviously, is relatively low-bitrate compared to video. Even in our demo app, the Tiledmedia player that people can find in the app stores, we allow options like: keep the audio with the main feed while you're switching auxiliary feeds, even if they're large; or keep the audio with whatever you decide to be the main feed. Conceivably you could even mix audio if you like. Say we have a camera on a motorbike and a director's cut; you may want to be able to mix the individual audio with the director audio. It's all doable and supported.

A question on AV1: do you have AV1 ready, or is that another development that has to happen before you can actually deploy it?

No, we don't have it ready. We've looked at it, and we think it's probably feasible, but the only reason for us to deploy it would be if HEVC were no longer as widely supported, and I'm not seeing that happen anytime soon.

We've apparently got some people my age in the audience, or maybe a little bit younger, and they're asking what magic bullet H.264 had, and whether we're ever going to see that again. So, going back to your days with the MPEG-4 Industry Forum: what happened to make H.264 such a success?

Well, it started with the ITU and ISO coming together and deciding to have one standard, which was, what is it, midway through the 90s. There has always been this pattern of getting together and splitting apart again: H.261; then MPEG-2 and H.262 were not separate, they were the same; then H.263 and MPEG-4 split off again; and with H.264 and MPEG-4 Part 10 they came together again. So it started with the world's experts coming together and deciding: we will join our forces and we will make the world's best codec. And there was an objective process. This is the best part of standardization, I would say: when people compete, and then work together and create something that's amazing.

And then I had a small role, I would say. When this was already done, we decided: okay, MPEG-4 is ready, what are we going to do with it now? We have to set up some sort of an industry forum. It started as the MPEG-4 Industry Forum, which took the standard to the market, and then pretty quickly we found out that the gating factor would probably be the licensing. For MPEG-2 it was clear: it was all hardware, and there was a royalty per chipset. But MPEG-4 was all going to be software, and how do you deal with that? So we had a number of workshops with licensees and licensors, and the good thing was that in certain companies there were both licensee and licensor people; those that had IP could also look at it from the licensee side, because you need to be reasonable. I think some people at some point got dollar signs in their eyes, but we were able to have a number of workshops to discuss the issues. It was not easy, because as soon as you start to discuss IP you get into antitrust issues, etc., so we had lawyers involved, but we managed to do it, and out of that grew something that became licensable; basically, AVC became licensable. And this was a process that took many more years, by the way. I think what also helped, and stop me if I'm going too long, Jan, was that Microsoft, and then SMPTE, published this VC-9 codec, which then became VC-1, and I think it put some pressure on the AVC licensors to get their act together. There was real competition.

Interesting. So the fact that there
were alternatives that, I guess in retrospect, never had a chance of succeeding, because VC-1 was never going to succeed on the broadcast side, but that pushed the technology owners to come to terms. I guess the royalty for MPEG-2 was like two dollars and fifty cents, and it was down to 20 cents for H.264, and I guess you're telling me the justification for that was that you could do it in software, so instead of just TV encoders and TV decoders we're looking at every browser and every mobile device. Of course, mobile devices didn't exist when H.264 came out. No, but we were working on them; we had these projects, we had a dream of mobile audiovisual terminals, Dick Tracy watches. They exist now, but yes, now I feel old.

You mentioned what was great about standards. The negative everybody points to with standards is the royalty side. Do you see that as a hindrance to future generations of technology, or do you see it as just how you pay for all the innovation you're getting the benefit of? I see it as a fact of life. Nothing comes for free, and as long as it's reasonable, nobody will complain. HEVC has gotten a lot of bad press, and certainly a lot of it was well deserved at the start, but as you mentioned, it is incredibly widely deployed at this point, which is why you have your technology, and it's now also in browsers, so for all intents and purposes it's almost universally deployed right now, for all the bad press, which I think is a bit overrated.

Well, listen, we're out of questions and we're out of time. Rob, I appreciate you making time for this. Tell us what you're going to be showing at IBC and where people can find you. They can contact me; we don't have a booth, so I'm going to be roaming the halls and doing lots of meetings, but
I'll be showing what I showed just now, with a few more interaction modes, and I'll be telling people how this can revolutionize streaming and make it a lot more engaging. Okay. I should say that NETINT also has a booth at IBC, and if you go to our homepage, netint.com, you'll be able to

see a pop-up that shows you where that booth is and how to get in touch with our people there. Rob, again, have a great show, and thanks for taking the time today. Thank you, Jan, it's been a real pleasure. Thanks for the insightful questions.

2023-09-18 20:43
