Remapping of hidden area mesh pixels for codec speed-up in remote VR
Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science. ORCID iD: 0000-0002-0536-7165
Blekinge Institute of Technology, Faculty of Computing, Department of Technology and Aesthetics.
Ericsson Research.
2021 (English). In: 2021 13th International Conference on Quality of Multimedia Experience, QoMEX 2021, Institute of Electrical and Electronics Engineers (IEEE), 2021, p. 207-212, article id 9465408. Conference paper, Published paper (Refereed)
Abstract [en]

Rendering VR content generally requires large image resolutions. This is due both to the display being positioned close to the eyes of the user and to the super-sampling typically used in VR. Because of the requirements on low latency and large resolutions in VR, remote rendering can be difficult to support at sufficient speeds in this medium. In this paper, we propose a method that can reduce the required resolution of non-panoramic VR images from a codec perspective. Because VR images are viewed close-up from within a headset with specific lenses, there are regions of the images that will remain unseen by the user. This unseen area is referred to as the Hidden-Area Mesh (HAM) and makes up 19% of the screen on the HTC Vive VR headset, as one example. By remapping the image in a specific manner, we can cut out the HAM, reduce the resolution by the size of the mesh, and thus reduce the amount of data that needs to be processed by the encoder and decoder. Results from a prototype remote renderer show that by using the proposed Hidden-Area Mesh Remapping (HAMR), an implementation-dependent speed-up of 10-13% in encoding, 17-18% in decoding and 7-11% in total can be achieved, while the negative impact on objective image quality in terms of SSIM and VMAF remains small. © 2021 IEEE.
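The codec-side saving from cutting out the HAM can be sketched numerically. The figures below are illustrative only: the abstract states the HAM covers about 19% of the screen on an HTC Vive (1080x1200 pixels per eye), but the exact packing layout of the remapped image is not given here, so the row-based repacking and 16-pixel alignment used in this sketch are assumptions, not the paper's method.

```python
# Sketch: estimate pixel savings from Hidden-Area Mesh Remapping (HAMR).
# Assumed scheme: keep the original width, repack only the visible pixels
# into fewer rows, and round the new height up to a codec-friendly
# multiple (16, a typical macroblock alignment).

def hamr_savings(width, height, ham_fraction, align=16):
    """Return (visible_pixels, remapped_height) after cutting out the HAM."""
    total = width * height
    visible = round(total * (1.0 - ham_fraction))
    rows = -(-visible // width)              # ceil division: rows needed
    remapped_h = -(-rows // align) * align   # round height up to alignment
    return visible, remapped_h

visible, new_h = hamr_savings(1080, 1200, 0.19)
print(visible, new_h)  # roughly 19% fewer pixels for encoder and decoder
```

With these assumed numbers, the per-eye image shrinks from 1080x1200 to 1080x976, which is the kind of reduction that lets the codec finish each frame sooner.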

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2021. p. 207-212, article id 9465408
Keywords [en]
Decoding, Image quality, Image resolution, Mesh generation, Multimedia systems, Rendering (computer graphics), Signal encoding
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:bth-21381
DOI: 10.1109/QoMEX51781.2021.9465408
ISI: 000694919800040
Scopus ID: 2-s2.0-85113890863
ISBN: 9781665435895 (print)
OAI: oai:DiVA.org:bth-21381
DiVA, id: diva2:1552821
Conference
2021 13th International Conference on Quality of Multimedia Experience (QoMEX), Virtual/Online, 13 June 2021 - 17 June 2021
Part of project
VIATECH: Human-Centered Computing for Novel Visual and Interactive Applications, Knowledge Foundation
Funder
Knowledge Foundation, 20170056
Note

open access

Available from: 2021-05-06. Created: 2021-05-06. Last updated: 2023-03-21. Bibliographically approved
In thesis
1. Remote Rendering for VR
2021 (English). Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

The aim of this thesis is to study and advance technology relating to remote rendering of Virtual Reality (VR). In remote rendering, rendered content is commonly streamed as video images in network packets from a server to a client. Experiments are conducted with varying networks and configurations throughout this work, as well as with different technologies that enable or improve remote VR experiences.

As an introduction to the field, the thesis begins with related studies on 360-video. Here, a statistic based on throughput alone is proposed for use in lightweight performance monitoring of encrypted HTTPS 360-video streams. The statistic gives an indication of the potential for stalls in the video stream, which may be of use for network providers wanting to allocate bandwidth optimally. Moving on from 360-video into real-time remote rendering, a wireless VR adapter, TPCAST, is studied, and a method for monitoring the input- and video-throughput of this device is proposed and implemented. With the monitoring tool, it is for example possible to identify video stalls that occur in TPCAST and thus determine a baseline of its robustness in terms of video delivery. Having determined the baseline, we move on to developing a prototype remote rendering system for VR. The prototype has so far been used to study the bitrate requirements of remote VR and to develop a novel method that can be used to reduce the image size from a codec perspective by utilizing the Hidden Area Mesh (HAM) that is unique to VR. By reducing the image size, codecs can run faster and time will therefore be saved each frame, potentially reducing the latency of the system.
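A throughput-only stall indicator of the kind described above can be sketched with a simple virtual-buffer model. The thesis's exact statistic is not given in this record, so the model below is an assumption for illustration: fill a virtual playout buffer with the observed throughput, drain it at the stream's nominal bitrate, and flag intervals where the buffer would run dry (a potential stall). This needs only throughput samples, which remain observable even when the HTTPS stream itself is encrypted.

```python
# Sketch of a throughput-only stall indicator for an encrypted video stream.
# Assumption: the nominal video bitrate is known (e.g. from the selected
# quality level); only per-interval throughput is measured on the wire.

def potential_stalls(throughput_mbps, video_bitrate_mbps, interval_s=1.0):
    """Count intervals in which a virtual playout buffer would underrun."""
    buffer_mbit = 0.0
    stalls = 0
    for tp in throughput_mbps:
        # Buffer fills at the observed rate and drains at the playout rate.
        buffer_mbit += (tp - video_bitrate_mbps) * interval_s
        if buffer_mbit < 0.0:
            stalls += 1
            buffer_mbit = 0.0  # playback pauses until data arrives again
    return stalls

# Throughput samples (Mbit/s) against a nominal 10 Mbit/s stream:
# two consecutive deep dips exhaust the small surplus and flag stalls.
print(potential_stalls([12, 11, 4, 3, 12, 12], 10))
```

A network provider could run such an indicator per flow and use a rising stall count as a signal to allocate more bandwidth, without inspecting the encrypted payload.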

Place, publisher, year, edition, pages
Karlskrona: Blekinge Tekniska Högskola, 2021. p. 114
Series
Blekinge Institute of Technology Licentiate Dissertation Series, ISSN 1650-2140 ; 6
Keywords
Remote rendering, VR, Virtual reality
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:bth-21382 (URN)
978-91-7295-423-6 (ISBN)
Presentation
2021-06-22, Online, Karlskrona, 14:04 (English)
Funder
Knowledge Foundation, 20170056
Available from: 2021-05-06. Created: 2021-05-06. Last updated: 2021-07-01. Bibliographically approved
2. Evaluation and Reduction of Temporal Issues in Remote VR
2023 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

The aim of this thesis is to study and advance knowledge and technologies surrounding remote rendering of Virtual Reality (VR), in particular regarding temporal aspects such as latency and video stalling events. In remote rendering, rendered content is commonly streamed as video images in network packets from a server to a client. The main purpose is to make the processing power of stationary machines available to thin clients that are otherwise limited by weight and size due to their mobility requirements. Achieving this in real time with excellent quality is not trivial in interactive VR due to the requirements on low latency and high visual fidelity. The dissertation brings to light the main challenges of the field as well as a set of new proposals and knowledge on the topic.

As an introduction to the field, the dissertation begins with a study on 360-video streaming, which is a form of VR but less interactive. Moving on into real-time remote rendering, a commercial wireless VR adapter is studied, and a method for monitoring its data traffic is proposed and implemented. The monitoring is able to provide a baseline in terms of video stalling events in a commercial remote-VR product. Next, a prototype remote renderer for VR is implemented using a proposed architecture; it is then tested in various network conditions to determine under which conditions such remote rendering may be viable. Having constructed the remote renderer, a study is conducted that shows the effect of headset movements on the resulting video bitrate requirements in remote VR. Furthermore, a method that can reduce the codec image size in remote VR is proposed, and its viability is tested with the prototype. Finally, two works involving human participants are reported: one studying the subjective effects of video stalls in VR, and one studying the objective effects of hand-controller latency on aiming accuracy in VR.

Place, publisher, year, edition, pages
Karlskrona: Blekinge Tekniska Högskola, 2023. p. 191
Series
Blekinge Institute of Technology Doctoral Dissertation Series, ISSN 1653-2090 ; 4
Keywords
Remote rendering, VR, Network
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:bth-24388 (URN)
978-91-7295-453-3 (ISBN)
Public defence
2023-05-08, J1630, Valhallavägen 1, Karlskrona, 09:15 (English)
Funder
Knowledge Foundation, 20170056
Available from: 2023-03-21. Created: 2023-03-21. Last updated: 2023-04-18. Bibliographically approved

Open Access in DiVA

fulltext (8901 kB), 230 downloads
File information
File name: FULLTEXT01.pdf
File size: 8901 kB
Checksum SHA-512: 14eae6117c3b27951a2cc9368ebda8659f423db899db1d118ece2816eb6df63b9961ee51915e835d59f79766ef49ef2d59b58bf34592f37ca4c1ff4ccfda9562
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text, Scopus

Authority records

Kelkkanen, Viktor; Fiedler, Markus

