Bitrate Requirements of Non-Panoramic VR Remote Rendering
Blekinge Institute of Technology, Faculty of Computing. ORCID iD: 0000-0002-0536-7165
Blekinge Institute of Technology, Faculty of Computing, Department of Technology and Aesthetics.
Ericsson Research.
2020 (English). In: MM 2020 - Proceedings of the 28th ACM International Conference on Multimedia, ACM Publications, 2020. Conference paper, Published paper (Refereed)
Abstract [en]

This paper shows the impact of bitrate settings on objective quality measures when streaming non-panoramic remote-rendered Virtual Reality (VR) images. Non-panoramic here means that the images rendered and sent across the network cover only the viewport of each eye, rather than a full panorama.

To determine the required bitrate of remote rendering for VR, we use a server that renders a 3D scene, encodes the resulting images using the NVENC H.264 codec, and transmits them to the client across a network. The client decodes the images and displays them in the VR headset. Objective full-reference quality measures are taken by comparing the image before encoding on the server to the same image after it has been decoded on the client. By altering the average bitrate setting of the encoder, we obtain objective quality scores as functions of bitrate. Furthermore, we study the impact of headset rotation speeds, since these also have a large effect on image quality.
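
To illustrate this kind of full-reference measurement, the sketch below (a rough Python illustration under assumed file names and a made-up bitrate sweep, not the paper's actual tooling) encodes a reference clip with NVENC H.264 at a chosen average bitrate via FFmpeg and extracts the reported SSIM; it assumes an FFmpeg build that provides the h264_nvenc encoder and the ssim filter.

    # Hedged sketch, not the paper's pipeline: encode a reference clip at a fixed
    # average bitrate with NVENC H.264, then compute full-reference SSIM of the
    # decoded output against the reference.
    import subprocess

    REFERENCE = "reference.y4m"  # assumed lossless capture of the rendered frames

    def ssim_at_bitrate(bitrate_mbps: int) -> str:
        distorted = f"distorted_{bitrate_mbps}mbps.mp4"
        # Vary the encoder's average bitrate setting, as the paper does.
        subprocess.run(
            ["ffmpeg", "-y", "-i", REFERENCE,
             "-c:v", "h264_nvenc", "-b:v", f"{bitrate_mbps}M", distorted],
            check=True, capture_output=True)
        # Full-reference comparison: decoded output vs. the original frames.
        result = subprocess.run(
            ["ffmpeg", "-i", distorted, "-i", REFERENCE,
             "-lavfi", "ssim", "-f", "null", "-"],
            check=True, capture_output=True, text=True)
        # FFmpeg prints the SSIM summary on stderr.
        return next(line for line in result.stderr.splitlines() if "SSIM" in line)

    if __name__ == "__main__":
        for mbps in (10, 20, 30, 40):  # assumed sweep of average bitrates
            print(mbps, "Mbps:", ssim_at_bitrate(mbps))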

We determine an upper and lower bitrate limit based on headset rotation speeds. The lower limit is based on a speed close to the average human peak head-movement speed, 360°/s. The upper limit is based on maximal peaks of 1080°/s. Depending on the expected rotation speeds of the specific application, we determine that a total of 20–38 Mbps should be used at resolution 2160×1200@90 fps, and 22–42 Mbps at 2560×1440@60 fps. The recommendations are given with the assumption that the image is split in two and streamed in parallel, since this is how the tested prototype operates.
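
Since the totals assume the image is split in two and streamed in parallel, each eye's encoder gets half the budget; the helper below (illustrative names and structure, not code from the paper) spells out that arithmetic using the reported ranges.

    # Illustrative helper (assumed naming, not code from the paper): map the total
    # recommended bitrate to a per-eye average-bitrate setting, given that the
    # image is split in two and streamed as two parallel encodes.
    RECOMMENDED_TOTAL_MBPS = {
        # (width, height, fps): (lower limit ~360°/s, upper limit ~1080°/s)
        (2160, 1200, 90): (20, 38),
        (2560, 1440, 60): (22, 42),
    }

    def per_eye_bitrate_mbps(resolution, expect_fast_rotation: bool) -> float:
        lower, upper = RECOMMENDED_TOTAL_MBPS[resolution]
        total = upper if expect_fast_rotation else lower
        return total / 2  # two eyes, two parallel streams share the total budget

    print(per_eye_bitrate_mbps((2160, 1200, 90), expect_fast_rotation=True))  # 19.0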

Place, publisher, year, edition, pages
ACM Publications, 2020.
Keywords [en]
6-dof, game streaming, low-latency, remote rendering, ssim, vmaf
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:bth-21380
DOI: 10.1145/3394171.3413681
ISI: 000810735003076
ISBN: 978-1-4503-7988-5 (print)
OAI: oai:DiVA.org:bth-21380
DiVA id: diva2:1552814
Conference
MM '20: The 28th ACM International Conference on Multimedia, Seattle, WA, USA, October 2020
Part of project
VIATECH - Human-Centered Computing for Novel Visual and Interactive Applications, Knowledge Foundation
Funder
Knowledge Foundation, 20170056
Note

open access

Available from: 2021-05-06. Created: 2021-05-06. Last updated: 2023-03-21. Bibliographically approved
In thesis
1. Remote Rendering for VR
2021 (English). Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

The aim of this thesis is to study and advance technology relating to remote rendering of Virtual Reality (VR). In remote rendering, rendered content is commonly streamed as video images in network packets from a server to a client. Experiments are conducted with varying networks and configurations throughout this work, as well as with different technologies that enable or improve remote VR experiences.

As an introduction to the field, the thesis begins with related studies on 360-video. Here, a statistic based on throughput alone is proposed for use in light-weight performance monitoring of encrypted HTTPS 360-video streams. The statistic gives an indication of the potential of stalls in the video stream, which may be of use for network providers wanting to allocate bandwidth optimally. Moving on from 360-video into real-time remote rendering, a wireless VR adapter, TPCAST, is studied, and a method for monitoring the input- and video-throughput of this device is proposed and implemented. With the monitoring tool it is, for example, possible to identify video stalls that occur in TPCAST and thus determine a baseline of its robustness in terms of video delivery. Having determined the baseline, we move on to developing a prototype remote rendering system for VR. The prototype has so far been used to study the bitrate requirements of remote VR and to develop a novel method that can be used to reduce the image size from a codec perspective by utilizing the Hidden Area Mesh (HAM) that is unique to VR. By reducing the image size, codecs can run faster and time will therefore be saved each frame, potentially reducing the latency of the system.
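
As a rough illustration of the idea, the sketch below (synthetic data and a deliberately simplified approach, not the method developed in the thesis) crops an eye buffer to the bounding box of the pixels left visible by the hidden-area mask before handing it to a codec.

    # Simplified sketch with assumed data (not the thesis' method): shrink the
    # image handed to the codec by cropping an eye buffer to the bounding box of
    # the pixels left visible by the Hidden Area Mesh.
    import numpy as np

    def crop_to_visible(eye_image: np.ndarray, visible_mask: np.ndarray) -> np.ndarray:
        """eye_image: H x W x 3 array; visible_mask: H x W boolean, True = visible."""
        rows = np.any(visible_mask, axis=1)
        cols = np.any(visible_mask, axis=0)
        top, bottom = np.where(rows)[0][[0, -1]]
        left, right = np.where(cols)[0][[0, -1]]
        return eye_image[top:bottom + 1, left:right + 1]

    # Synthetic example: a roughly circular visible region on a 1200x1080 eye buffer.
    h, w = 1200, 1080
    yy, xx = np.mgrid[0:h, 0:w]
    mask = (yy - h / 2) ** 2 + (xx - w / 2) ** 2 < (0.48 * min(h, w)) ** 2
    frame = np.zeros((h, w, 3), dtype=np.uint8)
    print(frame.shape, "->", crop_to_visible(frame, mask).shape)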

Place, publisher, year, edition, pages
Karlskrona: Blekinge Tekniska Högskola, 2021. p. 114
Series
Blekinge Institute of Technology Licentiate Dissertation Series, ISSN 1650-2140 ; 6
Keywords
Remote rendering, VR, Virtual reality
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:bth-21382 (URN)
978-91-7295-423-6 (ISBN)
Presentation
2021-06-22, Online, Karlskrona, 14:04 (English)
Opponent
Supervisors
Funder
Knowledge Foundation, 20170056
Available from: 2021-05-06. Created: 2021-05-06. Last updated: 2021-07-01. Bibliographically approved
2. Evaluation and Reduction of Temporal Issues in Remote VR
2023 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

The aim of this thesis is to study and advance knowledge and technologies surrounding remote rendering of Virtual Reality (VR), in particular temporal aspects such as latency and video stalling events. In remote rendering, rendered content is commonly streamed as video images in network packets from a server to a client. The main purpose is to make the processing power of stationary machines available to thin clients that are otherwise limited by weight and size due to their mobility requirements. Achieving this in real time with excellent quality is not trivial in interactive VR due to the requirements on low latency and high visual fidelity. The dissertation brings to light the main challenges of the field as well as a set of new proposals and knowledge on the topic.

As an introduction to the field, the dissertation begins with a study on 360-video streaming, which is a form of VR but less interactive. Moving on to real-time remote rendering, a commercial wireless VR adapter is studied, and a method for monitoring its data traffic is proposed and implemented. The monitoring provides a baseline in terms of video stalling events in a commercial remote-VR product. Next, a prototype remote renderer for VR is implemented using a proposed architecture and is tested in various network conditions to determine under which conditions such remote rendering may be viable. Having constructed the remote renderer, a study is conducted that shows the effect of headset movements on the resulting video bitrate requirements in remote VR. Furthermore, a method that can reduce the codec image size in remote VR is proposed, and its viability is tested with the prototype. Finally, two works involving human participants are reported: one studies the subjective effects of video stalls in VR, and the other studies the objective effects of hand-controller latency on aiming accuracy in VR.
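
For intuition about what counts as a video stalling event, the sketch below (hypothetical threshold, frame rate, and trace, not the dissertation's monitoring method) flags a stall whenever the gap between consecutive displayed frames greatly exceeds the nominal frame interval.

    # Hypothetical sketch (not the dissertation's monitoring tool): flag a video
    # stall whenever the gap between consecutive displayed frames exceeds a chosen
    # multiple of the nominal frame interval.
    def find_stalls(display_times_s, fps=90.0, factor=3.0):
        nominal = 1.0 / fps
        stalls = []
        for prev, cur in zip(display_times_s, display_times_s[1:]):
            gap = cur - prev
            if gap > factor * nominal:
                stalls.append((prev, gap))  # (time the stall started, gap length)
        return stalls

    # Assumed 90 fps trace with one ~60 ms gap after the tenth frame.
    trace = [i / 90.0 for i in range(10)] + [9 / 90.0 + 0.06 + i / 90.0 for i in range(5)]
    print(find_stalls(trace))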

Place, publisher, year, edition, pages
Karlskrona: Blekinge Tekniska Högskola, 2023. p. 191
Series
Blekinge Institute of Technology Doctoral Dissertation Series, ISSN 1653-2090 ; 4
Keywords
Remote Rendering, VR, Network
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:bth-24388 (URN)
978-91-7295-453-3 (ISBN)
Public defence
2023-05-08, J1630, Valhallavägen 1, Karlskrona, 09:15 (English)
Opponent
Supervisors
Funder
Knowledge Foundation, 20170056
Available from: 2023-03-21. Created: 2023-03-21. Last updated: 2023-04-18. Bibliographically approved

Open Access in DiVA

Bitrate paper (5078 kB)
File information
File name: FULLTEXT01.pdf
File size: 5078 kB
Checksum: SHA-512
f1c1087543bf5e536973571fce0d8fb64f481b24c7e0a4e55d8f4748b1639013fdba2be88c938b7066f63e6fd7071acdabde990872477c0fdb0dfd5088df0c1e
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text
Publisher Homepage

Authority records

Kelkkanen, Viktor; Fiedler, Markus
