
IEEE 3333.1.3-2022


IEEE Standard for the Deep Learning-Based Assessment of Visual Experience Based on Human Factors

Published By: IEEE
Publication Date: 2022

New IEEE Standard – Active. Measuring quality of experience (QoE) aims to identify the factors that shape a user’s perceptual experience, including human, system, and context factors. Because QoE arises from human interaction with various devices, its estimation should start from the mechanisms of human visual perception, which makes measuring QoE a challenging task. In this standard, QoE assessment is divided into two subcategories: perceptual quality and virtual reality (VR) cybersickness. In addition, the standard covers deep learning models that account for human factors in various QoE assessments, along with a reliable subjective test methodology and a database construction procedure.
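As a generic illustration of the subjective test side of QoE assessment (not taken from the standard itself), per-subject ratings for a stimulus are commonly aggregated into a mean opinion score (MOS) with a confidence interval. The function names and the example ratings below are hypothetical:

```python
import statistics

def mean_opinion_score(ratings):
    """Aggregate per-subject ratings (e.g. on a 1-5 scale) into a MOS."""
    return statistics.mean(ratings)

def ci95_halfwidth(ratings):
    """95% confidence-interval half-width for the MOS (normal approximation)."""
    n = len(ratings)
    sd = statistics.stdev(ratings)  # sample standard deviation
    return 1.96 * sd / n ** 0.5

# Hypothetical ratings from five subjects for one stimulus
ratings = [4, 5, 3, 4, 4]
mos = mean_opinion_score(ratings)
print(f"MOS = {mos} ± {ci95_halfwidth(ratings):.2f}")
```

Databases of the kind described in Clause 7 typically store such per-stimulus MOS values alongside the source contents and, where collected, eye-tracking data.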

PDF Catalog

PDF Pages PDF Title
1 Front Cover
2 Title page
4 Important Notices and Disclaimers Concerning IEEE Standards Documents
8 Participants
9 Introduction
10 Contents
11 1. Overview
1.1 Scope
1.2 Word usage
12 2. Normative references
3. Definitions, acronyms, and abbreviations
3.1 Definitions
14 3.2 Acronyms and abbreviations
16 4. Synopsis of the standard
4.1 General
4.2 Quality of experience assessment for 2D, 3D, and VR/AR contents
4.3 A database of immersive contents
5. Quality assessment of visual contents
5.1 General
17 5.2 Quality assessment of panoramic 360 contents
5.2.1 Quality assessment of panoramic 360 video based on 3D-CNN
18 5.2.2 Quality assessment of panoramic 360 videos using saliency information
19 5.3 Quality assessment of 3D contents
5.3.1 General
5.3.2 Model-based approaches
5.3.2.1 Geometric Laplacian
20 5.3.2.2 Curvature-based feature
21 5.3.2.3 Roughness-based feature
5.3.2.4 Combining geometry, color, and attention-based feature
22 5.3.3 Image-based approaches
5.4 Saliency prediction
5.4.1 Saliency prediction on panoramic 360 contents
23 5.4.2 Saliency prediction on 3D contents
5.4.2.1 Geometry-based saliency models
24 5.4.2.2 Viewpoint based saliency models
25 5.4.3 Saliency prediction on immersive contents
5.4.3.1 Binocular information
5.4.3.2 Content information
26 5.4.3.3 Disparity information
27 5.4.3.4 Saliency prediction network
29 6. Cybersickness assessment of visual contents
6.1 General
6.2 Human factor mechanism of cybersickness
6.2.1 An integrated model of human motion perception
30 6.2.2 A theory on visually induced motion sickness
32 6.3 Cybersickness prediction on VR contents
6.3.1 General
6.3.2 Cybersickness predictor: analysis of visual-vestibular conflict
6.3.2.1 Perceptual motion features
33 6.3.2.2 Statistical content features
6.3.2.3 Temporal pooling
35 6.3.3 Cybersickness predictor: integrated analysis of sickness and presence
6.3.3.1 Cybersickness predictor-analysis of natural video statistics
36 6.3.3.2 Presence predictor-analysis of natural video statistics
37 6.3.4 Cybersickness predictor: Analysis of neurological mechanism
6.3.4.1 Neurological representation
38 6.3.4.2 Spatio-temporal representation
6.3.4.3 EEG data acquisition
39 6.3.4.4 Cognitive representation learning
6.3.4.5 Cybersickness learning
40 7. Database of immersive contents
7.1 General
7.2 Description of the proposed immersive contents database
7.2.1 Source contents: Stimuli
41 7.2.2 Projection of panoramic 360 contents
42 7.3 Subjective assessment
7.3.1 Subjective scoring method
7.3.1.1 Design
43 7.3.1.2 Subjects
7.3.1.3 Procedure
44 7.3.2 Eye-tracking method
7.3.2.1 Eye-tracking procedure
7.4 Results and discussion
47 Annex A (informative) Bibliography
51 Back Cover