TY - JOUR
T1 - Perceptual Quality Assessment of 360° Images Based on Generative Scanpath Representation
AU - Sui, Xiangjie
AU - Zhu, Hanwei
AU - Liu, Xuelin
AU - Fang, Yuming
AU - Wang, Shiqi
AU - Wang, Zhou
N1 - Publisher Copyright:
© 1992-2012 IEEE.
PY - 2025
Y1 - 2025
N2 - Despite substantial efforts dedicated to the design of heuristic models for omnidirectional (i.e., 360°) image quality assessment (OIQA), a conspicuous gap remains due to the lack of consideration for the diversity of viewing behaviors that leads to the varying perceptual quality of 360° images. Two critical aspects underlie this oversight: the neglect of viewing conditions that significantly sway user gaze patterns and the overreliance on a single viewport sequence from the 360° image for quality inference. To address these issues, we introduce a unique generative scanpath representation (GSR) for effective quality inference of 360° images, which aggregates varied perceptual experiences of multi-hypothesis users under a predefined viewing condition. More specifically, given a viewing condition characterized by the starting point of viewing and exploration time, a set of scanpaths consisting of dynamic visual fixations can be produced using an apt scanpath generator. Following this vein, we use the scanpaths to convert the 360° image into the unique GSR, which provides a global overview of gaze-focused contents derived from scanpaths. As such, the quality inference of the 360° image is swiftly transformed to that of GSR. We then propose an efficient OIQA computational framework by learning the quality maps of GSR. Comprehensive experimental results validate that the predictions of the proposed framework are highly consistent with human perception in the spatiotemporal domain, especially in the challenging context of locally distorted 360° images under varied viewing conditions.
AB - Despite substantial efforts dedicated to the design of heuristic models for omnidirectional (i.e., 360°) image quality assessment (OIQA), a conspicuous gap remains due to the lack of consideration for the diversity of viewing behaviors that leads to the varying perceptual quality of 360° images. Two critical aspects underlie this oversight: the neglect of viewing conditions that significantly sway user gaze patterns and the overreliance on a single viewport sequence from the 360° image for quality inference. To address these issues, we introduce a unique generative scanpath representation (GSR) for effective quality inference of 360° images, which aggregates varied perceptual experiences of multi-hypothesis users under a predefined viewing condition. More specifically, given a viewing condition characterized by the starting point of viewing and exploration time, a set of scanpaths consisting of dynamic visual fixations can be produced using an apt scanpath generator. Following this vein, we use the scanpaths to convert the 360° image into the unique GSR, which provides a global overview of gaze-focused contents derived from scanpaths. As such, the quality inference of the 360° image is swiftly transformed to that of GSR. We then propose an efficient OIQA computational framework by learning the quality maps of GSR. Comprehensive experimental results validate that the predictions of the proposed framework are highly consistent with human perception in the spatiotemporal domain, especially in the challenging context of locally distorted 360° images under varied viewing conditions.
KW - Omnidirectional images
KW - perceptual quality assessment
KW - virtual reality
UR - https://www.scopus.com/pages/publications/105010943803
U2 - 10.1109/TIP.2025.3583181
DO - 10.1109/TIP.2025.3583181
M3 - Article
C2 - 40663675
AN - SCOPUS:105010943803
SN - 1057-7149
VL - 34
SP - 4485
EP - 4499
JO - IEEE Transactions on Image Processing
JF - IEEE Transactions on Image Processing
ER -