Remote Rendering for VR
Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science. ORCID iD: 0000-0002-0536-7165
2021 (English). Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

The aim of this thesis is to study and advance technology relating to remote rendering of Virtual Reality (VR). In remote rendering, rendered content is commonly streamed as video images in network packets from a server to a client. Experiments are conducted throughout this work with varying networks and configurations, as well as with different technologies that enable or improve remote VR experiences.

As an introduction to the field, the thesis begins with related studies on 360-video. Here, a statistic based on throughput alone is proposed for use in lightweight performance monitoring of encrypted HTTPS 360-video streams. The statistic gives an indication of the potential for stalls in the video stream, which may be of use for network providers wanting to allocate bandwidth optimally. Moving on from 360-video to real-time remote rendering, a wireless VR adapter, TPCAST, is studied, and a method for monitoring the input- and video-throughput of this device is proposed and implemented. With the monitoring tool it is, for example, possible to identify video stalls that occur in TPCAST and thus determine a baseline of its robustness in terms of video delivery. Having determined the baseline, we move on to developing a prototype remote rendering system for VR. The prototype has so far been used to study the bitrate requirements of remote VR and to develop a novel method that can reduce the image size from a codec perspective by utilizing the Hidden Area Mesh (HAM) that is unique to VR. By reducing the image size, codecs can run faster, saving time each frame and potentially reducing the latency of the system.

Place, publisher, year, edition, pages
Karlskrona: Blekinge Tekniska Högskola, 2021, p. 114
Series
Blekinge Institute of Technology Licentiate Dissertation Series, ISSN 1650-2140 ; 6
Keywords [en]
Remote rendering, VR, Virtual reality
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:bth-21382; ISBN: 978-91-7295-423-6 (print); OAI: oai:DiVA.org:bth-21382; DiVA id: diva2:1552832
Presentation
2021-06-22, Online, Karlskrona, 14:04 (English)
Funder
Knowledge Foundation, 20170056
Available from: 2021-05-06. Created: 2021-05-06. Last updated: 2021-07-01. Bibliographically approved.
List of papers
1. Coefficient of Throughput Variation as Indication of Playback Freezes in Streamed Omnidirectional Videos
2018 (English). In: 2018 28th International Telecommunication Networks and Applications Conference (ITNAC), IEEE, 2018, p. 392-397. Conference paper, Published paper (Refereed)
Abstract [en]

A large portion of today's network traffic consists of streamed video of great variety, such as films, television shows, live-streamed games and, recently, omnidirectional videos. A common way of delivering video is Dynamic Adaptive Streaming over HTTP (DASH), nowadays often over encrypted HTTPS. Encrypted video streams disable the use of Quality of Service (QoS) systems that rely on knowledge of application-dependent data, such as video resolution and bit-rate. This can make it difficult for a party providing bandwidth to efficiently allocate resources and estimate customer satisfaction. An application-independent way of measuring video stream quality could therefore be of interest to such a party. In this paper, we investigate encrypted streaming of omnidirectional video via YouTube to a smartphone in a Google Cardboard VR headset. We monitored such sessions, delivered via both WiFi and mobile networks, at different times of day, implying different levels of congestion, and characterised the network traffic using the Coefficient of Throughput Variation (CoTV) as a statistic. We observe that this statistic is able to indicate whether a stream is stable or unstable, in terms of potential video playback freezes, when the DASH delivery strategy is used.
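The CoTV statistic described above is a coefficient of variation computed over throughput samples alone, which is what makes it applicable to encrypted streams. A minimal sketch of the idea follows; the paper's exact sampling window and any stability threshold are not given in the abstract, so the sample values below are purely illustrative.

```python
# Sketch of a Coefficient of Throughput Variation (CoTV) computation:
# the ratio of the standard deviation to the mean of throughput samples.
# Sampling interval and input values here are illustrative assumptions.
from statistics import mean, stdev

def cotv(throughput_kbps):
    """Coefficient of Throughput Variation: std / mean of the samples."""
    m = mean(throughput_kbps)
    return stdev(throughput_kbps) / m if m > 0 else float("inf")

# A steady stream downloads at a near-constant rate (low CoTV), while a
# stalling/bursty stream alternates between high and near-zero throughput.
steady = [4900, 5100, 5000, 4950, 5050]  # kbps, low variation
bursty = [9000, 200, 8500, 100, 7800]    # on/off pattern, high variation

print(cotv(steady) < cotv(bursty))  # True
```

The statistic needs no knowledge of resolution or bit-rate settings, only packet-level throughput, which is exactly the application-independent property the abstract motivates.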

Place, publisher, year, edition, pages
IEEE, 2018
Keywords
Virtual Reality, 360-videos, video streaming, Quality of Experience, video freezes, throughput statistics
National Category
Communication Systems; Telecommunications
Identifiers
urn:nbn:se:bth-17728 (URN); 10.1109/ATNAC.2018.8615312 (DOI); 000459862300072; 2-s2.0-85062191313 (Scopus ID); 978-1-5386-7177-1 (ISBN)
Conference
28th International Telecommunication Networks and Applications Conference (ITNAC), Sydney, 21-23 November 2018
Available from: 2019-03-21. Created: 2019-03-21. Last updated: 2023-03-21. Bibliographically approved.
2. A Test-bed for Studies of Temporal Data Delivery Issues in a TPCAST Wireless Virtual Reality Set-up
2018 (English). In: 2018 28th International Telecommunication Networks and Applications Conference (ITNAC), IEEE, 2018, p. 404-406. Conference paper, Published paper (Refereed)
Abstract [en]

Virtual Reality (VR) is becoming increasingly popular, and wireless cable replacements untether the user of a VR Head-Mounted Display (HMD) from the rendering desktop computer. However, the price to pay for the additional freedom of movement is a higher sensitivity of the wireless solution to temporal disturbances of both video frame and input traffic delivery, compared to its wired counterpart. This paper reports on the development of a test-bed to be used for studying temporal delivery issues of both video frames and input traffic in a wireless VR environment, here using TPCAST with an HTC Vive headset. We provide a solution for monitoring and recording traces of (1) video frame freezes as observed on the wireless VR headset, and (2) input traffic from the headset and hand controllers to the rendering computer. So far, the test-bed illustrates the resilience of the underlying WirelessHD technology and the TCP connections that carry the input traffic, and it will be used in future studies of Quality of Experience (QoE) in wireless desktop VR.

Place, publisher, year, edition, pages
IEEE, 2018
Keywords
Virtual Reality, Quality of Experience, Video Freezes, Wireless, Head-Mounted Display, Monitoring, Recording, Tools
National Category
Communication Systems; Telecommunications
Identifiers
urn:nbn:se:bth-17729 (URN); 10.1109/ATNAC.2018.8615297 (DOI); 000459862300074; 2-s2.0-85062173499 (Scopus ID); 978-1-5386-7177-1 (ISBN)
Conference
28th International Telecommunication Networks and Applications Conference (ITNAC), Sydney, 21-23 November 2018
Available from: 2019-03-21. Created: 2019-03-21. Last updated: 2023-03-21. Bibliographically approved.
3. Bitrate Requirements of Non-Panoramic VR Remote Rendering
2020 (English). In: MM 2020 - Proceedings of the 28th ACM International Conference on Multimedia, ACM Publications, 2020. Conference paper, Published paper (Refereed)
Abstract [en]

This paper shows the impact of bitrate settings on objective quality measures when streaming non-panoramic remote-rendered Virtual Reality (VR) images. Non-panoramic here refers to the images that are rendered and sent across the network: they cover only the viewport of each eye.

To determine the required bitrate of remote rendering for VR, we use a server that renders a 3D-scene, encodes the resulting images using the NVENC H.264 codec and transmits them to the client across a network. The client decodes the images and displays them in the VR headset. Objective full-reference quality measures are taken by comparing the image before encoding on the server to the same image after it has been decoded on the client. By altering the average bitrate setting of the encoder, we obtain objective quality scores as functions of bitrates. Furthermore, we study the impact of headset rotation speeds, since this will also have a large effect on image quality.

We determine an upper and lower bitrate limit based on headset rotation speeds. The lower limit is based on a speed close to the average human peak head-movement speed, 360°/s. The upper limit is based on maximal peaks of 1080°/s. Depending on the expected rotation speeds of the specific application, we determine that a total of 20-38 Mbps should be used at resolution 2160×1200@90 fps, and 22-42 Mbps at 2560×1440@60 fps. The recommendations are given with the assumption that the image is split in two and streamed in parallel, since this is how the tested prototype operates.
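The recommended totals above can be sanity-checked by converting them to bits per pixel, a resolution- and frame-rate-independent view of the budget. This is a worked arithmetic sketch, not a method from the paper; the assumption that each of the two parallel per-eye streams carries half the total is an even split made for illustration.

```python
# Convert a total bitrate recommendation into bits per pixel:
# bitrate / (width * height * frames per second).

def bits_per_pixel(total_bitrate_bps, width, height, fps):
    return total_bitrate_bps / (width * height * fps)

# 2160x1200 @ 90 fps, recommended total 20-38 Mbps
low = bits_per_pixel(20e6, 2160, 1200, 90)
high = bits_per_pixel(38e6, 2160, 1200, 90)
print(f"2160x1200@90: {low:.3f}-{high:.3f} bits/pixel")  # 0.086-0.163

# Assuming an even split, each of the two parallel streams carries half.
print(f"per-eye stream at the lower limit: {20e6 / 2 / 1e6:.0f} Mbps")  # 10 Mbps
```

Both recommended ranges land in the same rough 0.09-0.19 bits-per-pixel band, which is why the two resolutions end up with similar total bitrates despite different frame rates.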

Place, publisher, year, edition, pages
ACM Publications, 2020
Keywords
6-dof, game streaming, low-latency, remote rendering, ssim, vmaf
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:bth-21380 (URN); 10.1145/3394171.3413681 (DOI); 000810735003076; 978-1-4503-7988-5 (ISBN)
Conference
MM '20: The 28th ACM International Conference on Multimedia, Seattle, WA, USA, October 2020
Funder
Knowledge Foundation, 20170056
Note

open access

Available from: 2021-05-06. Created: 2021-05-06. Last updated: 2023-03-21. Bibliographically approved.
4. Remapping of hidden area mesh pixels for codec speed-up in remote VR
2021 (English). In: 2021 13th International Conference on Quality of Multimedia Experience (QoMEX 2021), Institute of Electrical and Electronics Engineers (IEEE), 2021, p. 207-212, article id 9465408. Conference paper, Published paper (Refereed)
Abstract [en]

Rendering VR content generally requires large image resolutions. This is due both to the display being positioned close to the eyes of the user and to the super-sampling typically used in VR. Due to the requirements of low latency and large resolutions in VR, remote rendering can be difficult to support at sufficient speeds in this medium. In this paper, we propose a method that can reduce the required resolution of non-panoramic VR images from a codec perspective. Because VR images are viewed close-up from within a headset with specific lenses, there are regions of the images that will remain unseen by the user. This unseen area is referred to as the Hidden-Area Mesh (HAM) and makes up 19% of the screen on the HTC Vive headset, as one example. By remapping the image in a specific manner, we can cut out the HAM, reduce the resolution by the size of the mesh, and thus reduce the amount of data that needs to be processed by the encoder and decoder. Results from a prototype remote renderer show that by using the proposed Hidden-Area Mesh Remapping (HAMR), an implementation-dependent speed-up of 10-13% in encoding, 17-18% in decoding and 7-11% in total can be achieved, while the negative impact on objective image quality in terms of SSIM and VMAF remains small.
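The pixel saving behind HAMR can be made concrete with a small worked example. The 19% hidden-area figure for the HTC Vive is from the abstract; the per-eye resolution of 1080×1200 (half of the 2160×1200 panel streamed as two parallel images) is an assumption made here for illustration, and the remapping itself is not sketched.

```python
# Illustrative pixel-count saving from cutting out the Hidden Area Mesh.
# HAM fraction is from the abstract; the per-eye resolution is assumed.

HAM_FRACTION = 0.19            # unseen screen area on the HTC Vive
width, height = 1080, 1200    # assumed per-eye image resolution

total_pixels = width * height
visible_pixels = round(total_pixels * (1 - HAM_FRACTION))

# Fewer pixels enter the encoder and leave the decoder each frame,
# which is the source of the codec speed-up reported in the paper.
print(total_pixels, visible_pixels)  # 1296000 1049760
```

Roughly a quarter-million pixels per eye per frame are removed before encoding, which at 90 fps compounds into the 10-13% encoding speed-up the abstract reports.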

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2021
Keywords
Decoding, Image quality, Image resolution, Mesh generation, Multimedia systems, Rendering (computer graphics), Signal encoding
National Category
Computer Sciences
Identifiers
urn:nbn:se:bth-21381 (URN); 10.1109/QoMEX51781.2021.9465408 (DOI); 000694919800040; 2-s2.0-85113890863 (Scopus ID); 9781665435895 (ISBN)
Conference
2021 13th International Conference on Quality of Multimedia Experience (QoMEX), Virtual/Online, 13-17 June 2021
Funder
Knowledge Foundation, 20170056
Note

open access

Available from: 2021-05-06. Created: 2021-05-06. Last updated: 2023-03-21. Bibliographically approved.

Open Access in DiVA

fulltext (73494 kB)
File information
File name: FULLTEXT03.pdf. File size: 73494 kB. Checksum: SHA-512
24ed9e9f017a4f68ab2f60af121bd61609386e75d810aecbed73a41481e701e00dcca41bffeb3d681a9f8f0157d9ab10196f08ba7231c1d044ed009373cfdd38
Type: fulltext. Mimetype: application/pdf

Authority records

Kelkkanen, Viktor

