Title: NRspttemVQA: Real-Time Video Quality Assessment Based on the User’s Visual Perception
Abstract: There is a strong need for non-reference video quality metrics for user-generated video content, to prevent the loss of video quality caused by distortion during recording, compression, and signal transmission. Here we contribute to advancing streaming-quality research by creating a large-scale dataset of videos with compression and transmission artefacts. Our final dataset consists of 4.1 million user perceptual thresholds of video quality. We also created the first non-reference video quality metric that incorporates the psychophysical features of the user's video experience, which provides stable prediction of the user's subjective rating of a video. Our experimental results show that the proposed video quality metric achieves the most stable performance across three independent video datasets. We believe our study will encourage further research into deep learning-based modelling of video quality metrics.
Publication Year: 2023
Publication Date: 2023-11-29
Language: en
Type: article
Indexed In: Crossref
Access and Citation
Cited By Count: 1