The augmented and virtual reality (AR/VR) market was estimated to be worth $12 billion last year and, according to the latest market reports, it is expected to top $192 billion by 2022.
Although the market is growing year-on-year, with consumer spending making up its single largest portion, adoption of AR/VR remains low compared to other consumer electronics. Despite that, tech giants such as Intel, Sony and Microsoft are still investing and believe that the sector holds significant promise for the future.
The BBC is among those investing in this technology and its R&D department is looking at ways to utilise AR/VR to deliver content. In fact, it has created a dedicated ‘hub’ made up of producers and commissioners tasked with creating virtual-based content.
Using the BBC website it is possible to download content for devices such as Oculus Rift, Google Daydream and Samsung Gear VR. Stories include an interactive fairytale and a visualisation of the migration experience from Syria to Turkey, amongst others. Another is a virtual Doctor Who experience, which will premiere at the upcoming Tribeca Film Festival. “The opportunity for entertainment is almost limitless,” said David Johnston, Senior Product Manager, AR/VR, BBC R&D. “All the way from immersive drama to augmented gallery experiences.”
Location-based experiences are a huge focus for the BBC.
“They are becoming more mainstream and over the next two years will become more popular. Location scenarios don’t require the user to hold or wear the device for long periods of time, so things like comfort, currently limiting the social acceptance of AR/VR, are less important.”
As a result, Johnston suggests that the point at which head-mounted displays become acceptable in the home, or as something that can be worn all day, is still some time away. He also anticipates that the main applications will encompass education and utility.
“I see it removing language barriers,” he suggested. “We have the technology to do this now, but it isn’t seamless – it’s awkward to hold a device between yourself and someone else, for example. The technology needs to mature.”
He also added that increased device connectivity enabled by 5G, together with edge and cloud computing, could change the way in which this content is delivered.
“At the moment,” he said, “we’re reliant on cables attached to an expensive PC or a lower powered mobile device.
BBC R&D teamed up with Aardman Animations to develop short animated content showing distinct moments in the Roman Baths’ history, including when the Great Bath fell into disrepair
“The BBC is looking into the possibility of streaming content from the cloud or the edge compute to a mobile device, allowing for high quality graphics and interactions without all the work having to be done on the device.”
Streaming content
Recently the BBC explored this concept with partners in the West of England Combined Authority’s 5G Smart Tourism project.
For the trial, the BBC created an app that enabled visitors to walk around the Roman Baths and see how they looked at three different points in time - when the thermal spring was discovered, when the Great Bath fell into disrepair and ruin, and the Victorian restoration by Major Charles Davis.
In the initial phase, BBC R&D investigated the idea of delivering the scenes as pre-rendered 360-degree video (4k x 2k resolution). The scenes were encoded using H.264 at several bit rates in the range 5-40 Mbit/s to investigate the trade-off between picture quality and loading/buffering time. A rate of 10 Mbit/s was selected.
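The trade-off the team was weighing can be sketched with some rough arithmetic. In the Python sketch below, the encode rates and the ~20-user figure come from the trial, but the 100 Mbit/s shared-cell capacity is purely an illustrative assumption:

```python
# Rough startup-delay arithmetic for pre-rendered 360-degree video.
# The 5-40 Mbit/s encode rates and ~20 simultaneous users come from the
# trial; the 100 Mbit/s shared-cell capacity is an assumed figure.

def startup_delay(bitrate_mbps, buffer_s, link_mbps):
    """Seconds needed to download buffer_s seconds of video."""
    return buffer_s * bitrate_mbps / link_mbps

per_user_mbps = 100 / 20  # assumed 100 Mbit/s cell shared by 20 users

for rate in (5, 10, 20, 40):
    d = startup_delay(rate, buffer_s=5, link_mbps=per_user_mbps)
    print(f"{rate:>2} Mbit/s -> {d:.1f} s to buffer 5 s of video")
```

Under these assumed numbers, doubling the bit rate doubles the time each user waits for the buffer to fill, which is why a mid-range rate like 10 Mbit/s is an attractive compromise.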
“Conventional mobile networks (and many wi-fi installations) would struggle with the higher rates, particularly when there are around 20 simultaneous users in the same area, which is a realistic number for a site such as the Baths,” according to the BBC R&D department.
The first trials ran in December 2018, with the video hosted on edge servers in the 5G core network that was set up by Bristol University. The augmented experience enabled users to look around the model with 3 degrees of freedom (DoF) – left, right, up and down – but they were unable to move around the space freely.
The core network was linked to the Baths using a 60GHz mesh network provided by its project partner CCS (Cambridge Communication Systems), the creator of Metnet – a self-organising mmWave mesh for Gigabit backhaul and access applications.
The final link to the handsets used wi-fi, as 5G-enabled consumer handsets weren’t available. Technical data such as the loading/buffering times were gathered during the demo, along with how the users explored the scenes and what they thought of the experience as a whole.
Once the BBC had established that this experience was appealing, the R&D department looked at ways to improve the experience further.
The second set of trials ran in March this year and explored the idea of remote rendering technology.
To deliver this enhanced experience BBC R&D worked with Aardman Animations to produce three high-quality 3D models which represented these selected time periods.
These models were developed using the Unity game engine, enabling animations and personalised interactions with the models.
To give more freedom to the user, the team decided the experience would also require 6 DoF. This was achieved using Google’s ARCore platform, which works out how the user is moving by detecting visually distinctive features in the camera image, called feature points. It then combines these feature points with data from the accelerometer and gyroscope sensors to calculate the device’s change in position and orientation.
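The principle behind combining the two data sources can be illustrated with a minimal sketch. This is a generic visual-inertial blend, not ARCore's actual algorithm, and all the coordinates are made up:

```python
# Generic complementary blend of a drifting inertial position estimate
# with a visual fix from feature-point tracking -- an illustration of
# the principle only, not ARCore's implementation.

def fuse(inertial_pos, visual_pos, alpha=0.9):
    """Weight the visual fix heavily; keep a little inertial smoothness."""
    return tuple((1 - alpha) * i + alpha * v
                 for i, v in zip(inertial_pos, visual_pos))

drifted = (1.20, 0.05, 0.40)   # dead-reckoned from accelerometer/gyro
visual  = (1.00, 0.00, 0.35)   # estimated from tracked feature points
print(fuse(drifted, visual))   # result sits close to the visual fix
```

The inertial sensors update quickly but drift over time, while the visual fixes are slower but anchored to the scene; blending the two gives a pose estimate that is both responsive and stable.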
To align the 3D model with the physical environment, pictures (that triggered the augmented experience to commence) were placed at specific locations. Using Google’s ARCore platform to identify the image, the system could work out where the user’s phone was relative to that marker in both the physical and virtual worlds.
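That alignment step amounts to composing two transforms: the marker's known pose in the virtual world, and the phone's pose relative to the detected marker. A hypothetical, translation-only sketch (the specific coordinates are invented for illustration):

```python
import numpy as np

# Composing poses to place the phone in the virtual world: the marker's
# world pose is known in advance; the phone's pose relative to the
# marker comes from image detection. All coordinates are made up.

def translation(x, y, z):
    """4x4 homogeneous transform that only translates."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

marker_in_world = translation(10.0, 0.0, 5.0)   # placed by the team
phone_in_marker = translation(0.5, 1.4, -2.0)   # from marker detection

phone_in_world = marker_in_world @ phone_in_marker
print(phone_in_world[:3, 3])
```

In the real system the transforms would carry rotation as well as translation, but the composition is the same: once the phone is located relative to the marker, it is located in the virtual world too.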
“Once we know where you are, that data is sent to the server,” James Gibson, a BBC R&D engineer, explained. “To get the visuals back to the user, we used NVIDIA’s Capture SDK which allowed us to capture what had been rendered by its GPU and then encode that as a video.
“We then streamed that encoded video back to the phone. It does this with really low latency.”
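The capture-encode-stream loop Gibson describes has roughly the following shape. Everything here is a stubbed stand-in for illustration: `capture_frame`, `h264_encode` and `send_to_client` are hypothetical placeholders, not the NVIDIA Capture SDK API.

```python
import time

# Server-side loop: grab the rendered GPU frame, encode it as video,
# send it to the phone, and pace to the target frame rate. All three
# helpers are stubs standing in for the real capture/encode/network code.

def capture_frame():
    return b"raw-frame"            # stand-in for a captured GPU frame

def h264_encode(frame):
    return b"encoded:" + frame     # stand-in for hardware video encoding

sent = []
def send_to_client(packet):
    sent.append(packet)            # stand-in for the network send

def stream_frames(n_frames, frame_budget_s=1 / 60):
    """Capture, encode and send n_frames, paced to roughly 60 fps."""
    for _ in range(n_frames):
        start = time.monotonic()
        send_to_client(h264_encode(capture_frame()))
        # Sleep off whatever is left of this frame's time budget.
        time.sleep(max(0.0, frame_budget_s - (time.monotonic() - start)))

stream_frames(3)
print(len(sent))   # -> 3
```

Keeping the loop paced to the display's refresh rate, rather than encoding as fast as possible, is what lets the phone treat the stream like ordinary low-latency video.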
“Location-based VR and AR experiences are a huge focus for the BBC and in the next two years will become more popular.” David Johnston
The key technological requirement for this application was to keep the delay between a user moving their phone and the view on their screen updating to an absolute minimum. “To this end, we used a 60GHz, 5G-enabling mesh network to connect to the physical server cluster at Bristol University, which performed the remote rendering, although the final ‘hop’ between the 5G network and the smartphones was done via either an LTE or wi-fi connection,” he explained. “Moving the compute physically closer to the end user means that the latency is minimised.”
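A back-of-envelope motion-to-photon budget shows why the network round trip dominates, and why edge compute helps. Every millisecond figure below is an illustrative assumption, not a measured value from the trial:

```python
# Illustrative motion-to-photon budget for remote rendering. All the
# millisecond figures are assumptions chosen to show the shape of the
# argument: only the network term changes between edge and cloud.

stages_ms = {
    "capture pose & send": 5,
    "network round trip (edge)": 10,
    "render frame on server GPU": 8,
    "encode frame as video": 5,
    "decode & display on phone": 10,
}

edge_total = sum(stages_ms.values())
print(f"edge total:  {edge_total} ms")

# Replace the edge round trip with a distant data-centre round trip:
cloud_total = edge_total - stages_ms["network round trip (edge)"] + 60
print(f"cloud total: {cloud_total} ms")
```

Under these assumed figures the edge deployment lands under the roughly 40 ms where AR starts to feel responsive, while a distant data centre would not, regardless of how fast the rendering and encoding are.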
Rendering the 3D model remotely has a number of advantages over rendering the 3D model on the mobile device, Gibson added.
“There is no large download required before you can start the experience, the content is streamed on demand, almost instantly. The model can also be much higher quality, even the best mobile GPU wouldn’t be able to render the model at the quality seen during the trial.
It also reduces battery consumption by moving the computationally expensive process of rendering a 3D model to a remote server.”
But one of the key benefits, he told NE, is that it enables what he described as “a device agnostic experience, because the device acts as a thin client”. In other words, all users should have the same experience regardless of device; the only requirements are support for ARCore and the ability to stream video.
Despite only being a trial, the results of the Smart Tourism project demonstrated great promise for the future of immersive experiences. And, with the BBC’s hub of virtual content set to grow, who knows how entertainment experiences will evolve.