The rise of cloud gaming on handheld devices has created a pressing need for reliable tools to measure perceptual video quality. To meet this demand, the LIVE-Meta Mobile Cloud Gaming (LIVE-Meta MCG) database was built as a purpose-designed corpus for researchers. It bundles 600 gameplay clips in both portrait and landscape formats and pairs them with extensive human judgments gathered in a controlled laboratory environment. The collection supports the development and validation of objective video quality assessment (VQA) models tuned to the distinctive distortions and viewing conditions of mobile cloud gaming.
What the database contains
The database includes 600 processed sequences derived from 30 high-quality source videos. Each source was encoded at 20 different resolution–bitrate combinations spanning common mobile delivery points: spatial resolutions from 360p to 720p and bitrates from 250 kbps to 2 Mbps, reflecting real-world streaming constraints. The collection contains a balanced mix of portrait and landscape clips, mirroring modern handheld usage patterns. The 30 source videos serve as pristine references, and the compressed variants document the perceptual consequences of encoding choices common to cloud gaming.
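As a concrete illustration of this factorial design, the sketch below enumerates a hypothetical 30 × 20 stimulus grid. The specific resolution and bitrate rungs are assumptions chosen within the stated ranges, not the exact ladder used, which is documented in the paper.

```python
from itertools import product

# Illustrative sketch of how a 30 x 20 grid yields 600 sequences.
# The rungs below are hypothetical examples within the stated
# 360p-720p and 250 kbps-2 Mbps spans; see the paper for the real ladder.
sources = [f"game_{i:02d}" for i in range(1, 31)]   # 30 pristine captures
resolutions = ["360p", "480p", "540p", "720p"]
bitrates_kbps = [250, 500, 800, 1200, 2000]

# 4 resolutions x 5 bitrates = 20 encoding profiles per source
profiles = list(product(resolutions, bitrates_kbps))
assert len(profiles) == 20

stimuli = [(src, res, br) for src in sources for (res, br) in profiles]
assert len(stimuli) == 600   # 30 sources x 20 profiles
```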
Subjective study and ratings
To capture human impressions, an in-lab subjective study was conducted in which 72 participants provided a total of 14,400 independent quality ratings. The procedure was designed to measure perceived video quality across the diverse stimulus pool and to yield a reliable mean opinion score (MOS) for each clip. These aggregated scores let researchers correlate objective metrics with human perception and evaluate new algorithms against standardized ground truth. The dataset therefore includes not only the media files but also the subjective data needed for reproducible research.
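For readers who want to see what that aggregation step looks like, here is a minimal sketch that turns raw ratings into per-clip summary scores. The table layout and column names are hypothetical, and LIVE studies typically also apply subject rejection before averaging, so consult the paper for the exact protocol.

```python
import pandas as pd

# Minimal sketch: per-subject z-scoring followed by per-video averaging.
# Column names ("subject", "video", "rating") are hypothetical stand-ins
# for whatever layout the released subjective data actually uses.
ratings = pd.DataFrame({
    "subject": ["s01", "s01", "s02", "s02"],
    "video":   ["v001", "v002", "v001", "v002"],
    "rating":  [72.0, 35.0, 68.0, 41.0],
})

# Z-score each subject's ratings to remove individual scale/offset biases
ratings["z"] = ratings.groupby("subject")["rating"].transform(
    lambda r: (r - r.mean()) / r.std(ddof=1)
)

# Summary score (here, mean z-score) per video
mos = ratings.groupby("video")["z"].mean()
print(mos)
```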
Structure of the stimulus set
The 30 original game captures span a variety of visual content and motion characteristics typical of contemporary titles. Applying the 20 encoding profiles to each capture yields the 600 unique sequences. This systematic design permits examination of how combinations of resolution and bitrate affect perceived quality, and it facilitates training and testing of both full-reference (FR) and no-reference (NR) VQA approaches; a content-aware split, sketched below, is the usual way to partition such a set.
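A minimal sketch of such a content-aware split, assuming integer source IDs and an 80/20 ratio (neither is prescribed by the dataset). Keeping all 20 encodes of a source on the same side prevents models from being tested on content they saw during training.

```python
import random

# Content-aware train/test split: every encode of a given source stays
# on the same side of the partition. The 80/20 ratio is an assumption.
random.seed(0)
source_ids = list(range(30))
random.shuffle(source_ids)

train_sources = set(source_ids[:24])   # 24 sources -> 480 sequences
test_sources  = set(source_ids[24:])   # 6 sources  -> 120 sequences

def split_of(sequence_source_id: int) -> str:
    """Return which partition a sequence belongs to, by its source ID."""
    return "train" if sequence_source_id in train_sources else "test"
```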
Benchmarks, accessibility, and how to use the data
To illustrate the dataset’s utility, several state-of-the-art VQA algorithms were benchmarked on the LIVE-Meta MCG collection, with emphasis on how existing NR-VQA methods handle mobile-oriented distortions. The benchmark results help identify where current models falter and guide the development of more robust predictors. The full dataset is available free of charge to the research community; access requires completing a short form. Files are currently hosted and distributed via Globus, and a free Globus account (which can be created with a Google or GitHub identity) is needed to download the data.
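Benchmarks of this kind conventionally report Spearman (SROCC) and Pearson (PLCC) correlations between model predictions and the subjective scores. Here is a minimal sketch with stand-in arrays; real evaluations usually also fit a nonlinear logistic mapping before computing PLCC, which is omitted for brevity.

```python
import numpy as np
from scipy.stats import spearmanr, pearsonr

# Stand-in arrays: model outputs and the corresponding subjective scores.
predicted = np.array([0.61, 0.42, 0.88, 0.35, 0.70])
mos       = np.array([62.0, 45.0, 85.0, 30.0, 71.0])

srocc, _ = spearmanr(predicted, mos)   # monotonic agreement
plcc, _  = pearsonr(predicted, mos)    # linear agreement
print(f"SROCC = {srocc:.3f}, PLCC = {plcc:.3f}")
```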
How to cite and download
If you use the database, please cite the associated publication and website. The primary reference is A. Saha et al., “Study of Subjective and Objective Quality Assessment of Mobile Cloud Gaming Videos,” IEEE Transactions on Image Processing, 2023, together with the dataset page hosted online by the LIVE lab at the University of Texas at Austin. The project page contains the dataset link, access instructions, and the form required to obtain download credentials. Citing both ensures reproducibility and proper attribution in future work.
Team and licensing
The dataset was created through a collaboration between researchers and engineers from academic and industry labs. Key contributors include Avinab Saha and Yu-Chih Chen (Dept. of ECE, UT Austin); Chase Davis, Bo Qiu, Xiaoming Wang, Rahul Gowda, and Ioannis Katsavounidis (Meta Platforms Inc.); and Alan C. Bovik (Dept. of ECE, UT Austin). The resource is distributed under terms provided by the University of Texas at Austin, which grant permission to use, copy, and modify the files for research purposes while requiring acknowledgement and citation of the original source. Users should consult the copyright statement on the dataset page for full legal details.

