This repository contains the source code to reproduce the results of the paper *Feature-based No-Reference Video Quality Assessment using Extra Trees*.
The source code to extract features from video files is available in the ./feature_extraction folder. Please follow the instructions in ./feature_extraction/ReadMe.md to extract features from video files.
As mentioned in those instructions, after running the files you will eventually have the following folders containing all extracted features:
- CORNIA_features
- HOSA_features
- Laplace_and_sobel
- csv_files (final)
Note: In some of the functions (e.g., ./feature_extraction/Other_features/feature_extraction.py), the MATLAB engine is used. To use the MATLAB engine, you first need to install MATLAB (2017b or later) on Linux. Next, you should install its engine for Python. To do so, first find the path to the MATLAB folder: start MATLAB, type `matlabroot` in the command window, and copy the path it returns. Then, on Mac or Linux systems, run:

```bash
$ cd "matlabroot/extern/engines/python"
$ sudo python setup.py install
```
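Once the engine is installed, it can be started from Python as sketched below. This is only a minimal check that the engine works; the actual feature-extraction scripts call their own MATLAB functions.

```python
import matlab.engine

# Start a MATLAB session from Python (requires the installed engine).
eng = matlab.engine.start_matlab()

# Simple call to verify the engine is working; the extraction scripts
# invoke their own MATLAB functions instead of sqrt.
print(eng.sqrt(16.0))  # prints 4.0

eng.quit()
```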
To merge the features, please follow the instructions in ./merge_features. After running the code, you will have two files:
- tags.npy: indicates which group of features (as reported in the paper) each feature belongs to.
- all_selected_features.npy: includes all 611 features and the ground-truth MOS.
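As a quick sanity check, both files can be loaded with NumPy. A minimal sketch, assuming all_selected_features.npy stores one row per video with the MOS in the last column (verify the actual layout in ./merge_features):

```python
import numpy as np

# Group label for each of the 611 features, as reported in the paper.
tags = np.load("tags.npy")

# Merged feature matrix; assumed layout: one row per video,
# 611 feature columns followed by the ground-truth MOS.
data = np.load("all_selected_features.npy")

features = data[:, :-1]  # 611 features per video (assumed)
mos = data[:, -1]        # ground-truth MOS (assumed last column)

print(tags.shape, features.shape, mos.shape)
```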
Experiments are fully described in ./Experiments, including the following (a minimal Extra Trees training sketch follows the list):
- Different Regressors
- [Extra Trees] All Combinations of Features
- [Extra Trees] Performance of the model with up to two subsets of features
- [Extra Trees] Frequency of appearance in top models
- [Extra Trees] MDI Feature Importance
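For reference, the core regression step uses Extra Trees as available in scikit-learn. The sketch below fits an ExtraTreesRegressor on the merged features; the split, hyperparameters, and evaluation here are placeholders, not the exact protocol from the paper (see ./Experiments for that):

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import train_test_split

# Load merged features; assumed layout: 611 features, then MOS (see above).
data = np.load("all_selected_features.npy")
X, y = data[:, :-1], data[:, -1]

# Placeholder split and hyperparameters; the paper's protocol may differ.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = ExtraTreesRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# PLCC is a standard VQA metric; SROCC could be computed with spearmanr.
plcc, _ = pearsonr(y_test, model.predict(X_test))
print(f"PLCC on held-out set: {plcc:.3f}")

# Impurity-based (MDI) feature importances, as analyzed in the paper.
print(model.feature_importances_[:10])
```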
If you find this repository helpful, please don't forget to cite our paper:
```bibtex
@article{otroshi2022feature,
  title={Feature-based no-reference video quality assessment using Extra Trees},
  author={Otroshi-Shahreza, Hatef and Amini, Arash and Behroozi, Hamid},
  journal={IET Image Processing},
  volume={16},
  number={6},
  pages={1531--1543},
  year={2022}
}
```
In case of any questions, please feel free to contact Hatef Otroshi ([email protected]).