Comparison of crowdsourcing and laboratory settings for subjective assessment of video quality and acceptability & annoyance
Conference paper, 2024


Abstract

User satisfaction is significantly influenced by expectations of video quality. Even when users are presented with identical video stimuli, the Quality of Experience (QoE) can vary with the viewing context. The acceptability and annoyance paradigm serves as a tool to understand this relationship by measuring QoE as a function of user expectations and video quality. Traditionally, subjective experiments assessing QoE have been conducted in controlled laboratory settings. While the extension of traditional video quality experiments to crowdsourcing settings is well explored, the impact of crowdsourcing on QoE studies has not been thoroughly examined. This study explores the potential use of crowdsourcing platforms for acceptability & annoyance experiments. To this end, video quality and acceptability & annoyance experiments were conducted in both laboratory and crowdsourcing settings. The findings reveal a more linear relationship between video quality and QoE in crowdsourcing settings. Subjects in crowdsourcing settings tend to have higher expectations of video quality, resulting in a slight increase in acceptability & annoyance thresholds compared to laboratory experiments. The analyses suggest that extending acceptability & annoyance experiments to crowdsourcing is not as straightforward as extending traditional video quality experiments: in crowdsourcing settings, priming subject expectations with instructions is not as effective as it is under laboratory conditions.
Main file: ICIP_2024_AccAnn.pdf (7.47 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04615320, version 1 (18-06-2024)

Identifiers

  • HAL Id: hal-04615320, version 1

Cite

Ali Ak, Abhishek Gera, Denise Noyes, Hassene Tmar, Ioannis Katsavounidis, et al.. Comparison of crowdsourcing and laboratory settings for subjective assessment of video quality and acceptability & annoyance. 2024 IEEE International Conference on Image Processing (ICIP 2024), IEEE, Oct 2024, Abu Dhabi, United Arab Emirates. ⟨hal-04615320⟩