# Performance Benchmark and Calibration
## Performance Evaluation Protocol
We use official models for evaluation when they are available. Otherwise, we train and evaluate models with the following settings for simplicity and consistency:
| Metric Type | Train | Test |
| --- | --- | --- |
| FR | KADID-10k | CSIQ, LIVE, TID2008, TID2013 |
| NR | KonIQ-10k | LIVEC, KonIQ-10k (official split), TID2013, SPAQ |
| Aesthetic IQA | AVA | AVA (official split) |
Results are calculated with:

- **PLCC without any correction.** Although test-time value correction is common in IQA papers, we report the original values in our benchmark.
- **Full image as a single input.** We use multi-patch testing only when it is necessary for the model to work.
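The three reported correlations can be computed directly on raw model outputs, with no nonlinear (e.g. logistic) correction applied first. A minimal sketch with `scipy.stats` (the score arrays below are hypothetical, not taken from any dataset):

```python
from scipy import stats

pred = [0.82, 0.41, 0.67, 0.15, 0.93, 0.55]   # raw model predictions (made up)
mos  = [78.0, 42.5, 60.1, 20.3, 88.7, 51.0]   # subjective MOS (made up)

plcc, _ = stats.pearsonr(pred, mos)    # linear correlation on raw values
srcc, _ = stats.spearmanr(pred, mos)   # rank-order correlation
krcc, _ = stats.kendalltau(pred, mos)  # pairwise rank correlation

# Same PLCC/SRCC/KRCC format used in the tables below
print(f"{plcc:.4f}/{srcc:.4f}/{krcc:.4f}")
```

Because no monotonic fitting is done, PLCC can trail SRCC for models whose outputs are well ordered but not linearly related to MOS.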
In short, we train on the largest available datasets and compare models by cross-dataset evaluation performance for a fair comparison. The following models do not provide official weights and were retrained with our scripts:

- NR: `cnniqa`, `dbcnn`, `hyperiqa`
- Aesthetic IQA: `nima`, `nima-vgg16-ava`
## Performance on FR benchmarks
| Metric name | csiq (PLCC/SRCC/KRCC) | live (PLCC/SRCC/KRCC) | tid2008 (PLCC/SRCC/KRCC) | tid2013 (PLCC/SRCC/KRCC) |
| --- | --- | --- | --- | --- |
| psnr | 0.7857/0.8087/0.5989 | 0.7633/0.8013/0.5964 | 0.489/0.5245/0.3696 | 0.6601/0.6869/0.4958 |
| ssim | 0.765/0.8367/0.6323 | 0.7369/0.8509/0.6547 | 0.6003/0.6242/0.4521 | 0.6558/0.6269/0.455 |
| ms_ssim | 0.7717/0.9125/0.7372 | 0.679/0.9027/0.7227 | 0.7894/0.8531/0.6555 | 0.7814/0.7852/0.6033 |
| cw_ssim | 0.6078/0.7588/0.5562 | 0.5714/0.7681/0.5673 | 0.5965/0.6473/0.4625 | 0.5815/0.6533/0.4715 |
| fsim | 0.8207/0.9309/0.7683 | 0.7747/0.9204/0.7515 | 0.8341/0.884/0.6991 | 0.8322/0.8509/0.6665 |
| vif | 0.9219/0.9194/0.7532 | 0.9409/0.9526/0.8067 | 0.7769/0.7491/0.5861 | 0.7336/0.677/0.5148 |
| lpips | 0.9005/0.9233/0.7499 | 0.7672/0.869/0.6768 | 0.711/0.7151/0.5221 | 0.7529/0.7445/0.5477 |
| dists | 0.9324/0.9296/0.7644 | 0.8392/0.9051/0.7283 | 0.7032/0.6648/0.4861 | 0.7538/0.7077/0.5212 |
| pieapp | 0.838/0.8968/0.7109 | 0.8577/0.9182/0.7491 | 0.6443/0.7971/0.6089 | 0.7195/0.8438/0.6571 |
| ahiq | 0.8234/0.8273/0.6168 | 0.8039/0.8967/0.7066 | 0.6772/0.6807/0.4842 | 0.7379/0.7075/0.5127 |
| wadiqam_fr | 0.9087/0.922/0.7461 | 0.9163/0.9308/0.7584 | 0.8221/0.8222/0.6245 | 0.8424/0.8264/0.628 |
| topiq_fr | 0.9589/0.9674/0.8379 | 0.9542/0.9759/0.8617 | 0.9044/0.9226/0.7554 | 0.9158/0.9165/0.7441 |
## Performance on NR benchmarks
| Metric name | livec (PLCC/SRCC/KRCC) | koniq10k (PLCC/SRCC/KRCC) | tid2013 (PLCC/SRCC/KRCC) | flive (PLCC/SRCC/KRCC) | spaq (PLCC/SRCC/KRCC) | koniq10k-1024 (PLCC/SRCC/KRCC) |
| --- | --- | --- | --- | --- | --- | --- |
| brisque | 0.3509/0.3128/0.2113 | 0.2107/0.23/0.1548 | 0.4317/0.3672/0.2574 | 0.5333/0.5259/0.366 | | |
| niqe | 0.48/0.4505/0.3069 | 0.3155/0.3769/0.2555 | 0.3669/0.3121/0.2124 | 0.6692/0.6929/0.4928 | | |
| ilniqe | 0.4938/0.4381/0.2972 | 0.5232/0.551/0.3862 | 0.5156/0.487/0.3442 | 0.6596/0.719/0.5137 | | |
| nrqm | 0.4122/0.3012/0.2013 | 0.4756/0.3715/0.2517 | 0.4795/0.349/0.2413 | 0.6784/0.6507/0.45 | | |
| pi | 0.5201/0.4615/0.3139 | 0.4688/0.4573/0.3132 | 0.4627/0.3479/0.2398 | 0.7353/0.7307/0.5227 | | |
| nima | 0.4993/0.5071/0.348 | 0.7156/0.6662/0.4816 | 0.3324/0.321/0.2159 | 0.5153/0.5201/0.3558 | | |
| paq2piq | 0.7542/0.7188/0.5302 | 0.7062/0.643/0.4622 | 0.5776/0.4011/0.2838 | 0.775/0.8289/0.6207 | | |
| cnniqa | 0.6372/0.6089/0.4257 | 0.7934/0.7551/0.558 | 0.398/0.1769/0.117 | 0.7272/0.7397/0.5263 | | |
| wadiqam_nr | 0.6631/0.6675/0.4721 | 0.83/0.8046/0.6129 | 0.3517/0.1544/0.1002 | | | |
| dbcnn | 0.774/0.7562/0.5563 | 0.9197/0.9034/0.7338 | 0.5141/0.3855/0.2691 | 0.8549/0.8473/0.639 | | |
| musiq-ava | 0.6001/0.5954/0.4235 | 0.589/0.5273/0.3714 | | | | |
| musiq-koniq | 0.8295/0.7889/0.5986 | 0.8958/0.8654/0.6817 | 0.6814/0.575/0.4131 | 0.5128/0.4978/0.3437 | 0.8626/0.8676/0.6649 | |
| musiq-paq2piq | 0.8014/0.7672/0.5743 | 0.7655/0.7084/0.5146 | 0.8112/0.8436/0.6412 | | | |
| musiq-spaq | 0.8134/0.789/0.5937 | 0.7528/0.6799/0.4927 | 0.6039/0.5627/0.3941 | | | |
| maniqa | 0.8262/0.8399/0.6491 | 0.9133/0.8934/0.7198 | 0.457/0.4515/0.318 | 0.4416/0.4489/0.3052 | 0.814/0.8166/0.6036 | |
| clipiqa | 0.6883/0.6955/0.5065 | 0.7211/0.6572/0.4736 | 0.6471/0.5786/0.4107 | | | |
| clipiqa+ | 0.8312/0.8045/0.6109 | 0.8454/0.8026/0.6123 | 0.701/0.6318/0.4541 | 0.909/0.8954/0.7181 | | |
| clipiqa+_rn50_512 | 0.8181/0.818/0.6231 | 0.9012/0.8847/0.7033 | 0.6577/0.5949/0.4241 | | | |
| clipiqa+_vitL14_512 | 0.7679/0.7729/0.5733 | 0.8747/0.861/0.6721 | 0.6063/0.5259/0.3709 | | | |
| tres-koniq | 0.8118/0.7771/0.5808 | 0.513/0.4919/0.3391 | 0.8624/0.8619/0.66 | | | |
| tres-flive | 0.7213/0.7336/0.5373 | 0.7507/0.7068/0.516 | 0.6137/0.7269/0.533 | | | |
| hyperiqa | 0.7779/0.7546/0.5562 | 0.9233/0.904/0.7336 | 0.5627/0.4537/0.3177 | | | |
| topiq_nr | 0.8261/0.8106/0.6165 | 0.9436/0.9299/0.7727 | 0.5625/0.4452/0.3143 | 0.6289/0.5819/0.4105 | 0.8744/0.8704/0.6716 | |
| qalign | 0.8942/0.8814/0.6993 | 0.8529/0.8313/0.6368 | 0.9506/0.941/0.7924 | | | |
## Performance on image aesthetic benchmarks
| Metric name | ava (PLCC/SRCC/KRCC) |
| --- | --- |
| nima-vgg16-ava | 0.6624/0.657/0.4719 |
| nima | 0.7172/0.7126/0.5213 |
| clipiqa | 0.3576/0.3383/0.2301 |
| laion_aes | 0.666/0.6653/0.4788 |
| topiq_iaa_res50 | 0.737/0.7359/0.5423 |
| topiq_iaa | 0.7902/0.791/0.5969 |
| qalign | 0.8192/0.8223/0.6307 |
## Results Calibration
| Method | I03.bmp | I04.bmp | I06.bmp | I08.bmp | I19.bmp |
| --- | --- | --- | --- | --- | --- |
| brisque | 94.6421 | -0.1076 | 0.9929 | 5.3583 | 72.2617 |
| brisque(ours) | 94.6452 | -0.1082 | 1.0759 | 5.1423 | 66.8381 |
| ckdn | 0.2833 | 0.5767 | 0.6367 | 0.658 | 0.5999 |
| ckdn(ours) | 0.2841 | 0.5713 | 0.6257 | 0.6424 | 0.5956 |
| cw_ssim | 0.2763 | 0.9996 | 1.0 | 0.9068 | 0.8658 |
| cw_ssim(ours) | 0.2763 | 0.9996 | 1.0 | 0.9068 | 0.8658 |
| dists | 0.4742 | 0.1424 | 0.0683 | 0.0287 | 0.3123 |
| dists(ours) | 0.4742 | 0.1424 | 0.0682 | 0.0287 | 0.3123 |
| entropy | 6.9511 | 6.9661 | 7.5309 | 7.5566 | 5.7629 |
| entropy(ours) | 6.9511 | 6.9661 | 7.5309 | 7.5565 | 5.7629 |
| fsim | 0.689 | 0.9702 | 0.9927 | 0.9575 | 0.822 |
| fsim(ours) | 0.6891 | 0.9702 | 0.9927 | 0.9575 | 0.822 |
| gmsd | 0.2203 | 0.0005 | 0.0004 | 0.1346 | 0.205 |
| gmsd(ours) | 0.2203 | 0.0005 | 0.0004 | 0.1346 | 0.205 |
| ilniqe | 113.4801 | 23.9968 | 19.975 | 22.4493 | 56.6721 |
| ilniqe(ours) | 115.6107 | 24.0661 | 19.7494 | 22.3251 | 54.7701 |
| laion_aes | 3.6420 | 5.5836 | 5.0716 | 4.6458 | 3.0889 |
| laion_aes(ours) | 3.7204 | 5.5917 | 5.0756 | 4.6551 | 3.0973 |
| lpips | 0.7237 | 0.2572 | 0.0508 | 0.052 | 0.4253 |
| lpips(ours) | 0.7237 | 0.2572 | 0.0508 | 0.0521 | 0.4253 |
| mad | 195.2796 | 80.8379 | 30.3918 | 84.3542 | 202.2371 |
| mad(ours) | 195.2796 | 80.8379 | 30.3918 | 84.3542 | 202.237 |
| ms_ssim | 0.6733 | 0.9996 | 0.9998 | 0.9566 | 0.8462 |
| ms_ssim(ours) | 0.6707 | 0.9996 | 0.9998 | 0.9567 | 0.8418 |
| musiq | 12.494 | 75.332 | 73.429 | 75.188 | 36.938 |
| musiq(ours) | 12.4772 | 75.7764 | 73.7459 | 75.4604 | 38.0248 |
| musiq-ava | 3.398 | 5.648 | 4.635 | 5.186 | 4.128 |
| musiq-ava(ours) | 3.4084 | 5.6934 | 4.6968 | 5.1963 | 4.1955 |
| musiq-koniq | 12.494 | 75.332 | 73.429 | 75.188 | 36.938 |
| musiq-koniq(ours) | 12.4772 | 75.7764 | 73.7459 | 75.4604 | 38.0248 |
| musiq-paq2piq | 46.035 | 72.66 | 73.625 | 74.361 | 69.006 |
| musiq-paq2piq(ours) | 46.0187 | 72.6657 | 73.7655 | 74.388 | 69.7218 |
| musiq-spaq | 17.685 | 70.492 | 78.74 | 79.015 | 49.105 |
| musiq-spaq(ours) | 17.6804 | 70.6531 | 79.0364 | 79.3189 | 50.4526 |
| niqe | 15.7536 | 3.6549 | 3.2355 | 3.184 | 8.6352 |
| niqe(ours) | 15.6538 | 3.6549 | 3.2342 | 3.1921 | 9.0722 |
| nlpd | 0.5616 | 0.0195 | 0.0159 | 0.3028 | 0.4326 |
| nlpd(ours) | 0.5616 | 0.0139 | 0.011 | 0.3033 | 0.4335 |
| nrqm | 1.3894 | 8.9394 | 8.9735 | 6.829 | 6.312 |
| nrqm(ours) | 1.3932 | 8.9419 | 8.9721 | 6.8309 | 6.3031 |
| paq2piq | 44.134 | 73.6015 | 74.3297 | 76.8748 | 70.9153 |
| paq2piq(ours) | 44.1341 | 73.6015 | 74.3297 | 76.8748 | 70.9153 |
| pi | 11.9235 | 3.072 | 2.618 | 2.8074 | 6.7713 |
| pi(ours) | 11.9286 | 3.073 | 2.6357 | 2.7979 | 6.9546 |
| psnr | 21.11 | 20.99 | 27.01 | 23.3 | 21.62 |
| psnr(ours) | 21.1136 | 20.9872 | 27.0139 | 23.3002 | 21.6186 |
| ssim | 0.6993 | 0.9978 | 0.9989 | 0.9669 | 0.6519 |
| ssim(ours) | 0.6997 | 0.9978 | 0.9989 | 0.9671 | 0.6522 |
| vif | 0.0172 | 0.9891 | 0.9924 | 0.9103 | 0.1745 |
| vif(ours) | 0.0172 | 0.9891 | 0.9924 | 0.9104 | 0.175 |
| vsi | 0.9139 | 0.962 | 0.9922 | 0.9571 | 0.9262 |
| vsi(ours) | 0.9244 | 0.9497 | 0.9877 | 0.9541 | 0.9348 |
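The small gaps between reference values and our results typically come from differences in color conversion, resizing, and floating-point precision rather than algorithmic changes. As a minimal illustration of the simplest metric in the table, PSNR for 8-bit images can be checked with plain NumPy (the images below are synthetic, not the `I03.bmp` etc. test images):

```python
import numpy as np

def psnr(ref: np.ndarray, dist: np.ndarray, data_range: float = 255.0) -> float:
    """Peak signal-to-noise ratio between two same-shaped uint8 images.

    Assumes ref != dist (MSE of 0 would divide by zero).
    """
    mse = np.mean((ref.astype(np.float64) - dist.astype(np.float64)) ** 2)
    return float(10.0 * np.log10(data_range ** 2 / mse))

# Synthetic reference image plus Gaussian noise as the "distorted" version
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
dist = np.clip(ref + rng.normal(0.0, 5.0, size=ref.shape), 0, 255).astype(np.uint8)

# Rounded to 4 decimals, matching the precision of the "(ours)" rows above
print(round(psnr(ref, dist), 4))
```

Computing in `float64` before taking the MSE is what keeps our values within ~0.01 dB of the reference implementation; doing the subtraction in `uint8` would silently wrap around.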