This table lists the benchmark results for the low-resolution two-view scenario. The benchmark evaluates the Middlebury stereo metrics; for all metrics, smaller is better.

The mask determines whether a metric is evaluated for all pixels with ground truth, or only for pixels that are visible in both images (non-occluded).
The coverage selector allows limiting the table to results that cover all pixels (dense) or at least a given minimum fraction of pixels.
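As an illustration of how these settings interact, the sketch below computes Middlebury-style error statistics for a single image pair. It is a minimal sketch only: it assumes NumPy arrays for the estimated and ground-truth disparities plus an optional non-occlusion mask, and the function name, parameter names, and the 2.0 px bad-pixel threshold are placeholders rather than the benchmark's actual evaluation code.

import numpy as np

def stereo_metrics(d_est, d_gt, nonocc_mask=None, bad_thresh=2.0):
    # Illustrative Middlebury-style stereo metrics (smaller is better).
    # d_est, d_gt: float disparity maps; NaN marks a missing prediction or missing ground truth.
    # nonocc_mask: optional boolean map, True where a pixel is visible in both images.
    valid_gt = np.isfinite(d_gt)                   # pixels that have ground truth
    if nonocc_mask is not None:                    # the "non-occluded" mask setting
        valid_gt &= nonocc_mask
    predicted = valid_gt & np.isfinite(d_est)      # pixels the method actually covers

    err = np.abs(d_est[predicted] - d_gt[predicted])
    coverage = float(predicted.sum()) / max(int(valid_gt.sum()), 1)  # 1.0 for dense methods
    return {
        "coverage": coverage,
        "avg_err": float(err.mean()) if err.size else float("nan"),
        "bad_%.1f" % bad_thresh: float((err > bad_thresh).mean()) if err.size else float("nan"),
    }

Here the nonocc_mask argument plays the role of the mask setting described above, and the returned coverage is the fraction of evaluated pixels that the coverage selector thresholds on.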

Methods with the suffix _ROB may participate in the Robust Vision Challenge.

Click one or more dataset result cells or column headers to show visualizations. Most visualizations are only available for training datasets. The visualizations may not work with mobile browsers.




Method | Info | all | lakeside 1l | lakeside 1s | sand box 1l | sand box 1s | storage room 1l | storage room 1s | storage room 2l | storage room 2s | storage room 2 1l | storage room 2 1s | storage room 2 2l | storage room 2 2s | storage room 3l | storage room 3s | tunnel 1l | tunnel 1s | tunnel 2l | tunnel 2s | tunnel 3l | tunnel 3s
Each result cell lists the metric value, with the method's rank on that dataset on the line below it.
PMTNet | two views | 0.34
1
0.15
1
0.30
1
0.31
1
0.20
4
0.44
3
0.42
29
0.61
1
0.91
23
0.47
1
0.55
1
0.44
1
0.32
1
0.65
7
0.33
3
0.16
9
0.11
3
0.13
2
0.09
1
0.13
1
0.11
1
R-Stereo | two views | 0.39
2
0.18
2
0.42
11
0.42
20
0.26
18
0.40
1
0.32
6
0.72
2
0.34
1
0.89
29
1.28
61
0.57
4
0.35
3
0.42
1
0.32
1
0.15
3
0.12
5
0.14
5
0.12
6
0.14
2
0.13
4
DPM-Stereo | two views | 0.39
2
0.21
4
0.50
21
0.41
17
0.28
24
0.62
15
0.27
1
0.77
4
0.40
3
0.64
3
0.86
7
0.52
3
0.33
2
0.66
8
0.42
6
0.15
3
0.13
13
0.17
15
0.14
11
0.15
5
0.15
7
R-Stereo Train | two views | 0.39
2
0.18
2
0.42
11
0.42
20
0.26
18
0.40
1
0.32
6
0.72
2
0.34
1
0.89
29
1.28
61
0.57
4
0.35
3
0.42
1
0.32
1
0.15
3
0.12
5
0.14
5
0.12
6
0.14
2
0.13
4
HITNet | two views | 0.41
5
0.26
14
0.45
15
0.33
2
0.17
1
0.63
19
0.30
5
0.93
10
0.64
4
0.71
10
0.82
2
0.64
7
0.52
9
0.59
4
0.43
7
0.14
2
0.10
1
0.15
9
0.13
9
0.19
16
0.12
2
Vladimir Tankovich, Christian Häne, Yinda Zhang, Adarsh Kowdle, Sean Fanello, Sofien Bouaziz: HITNet: Hierarchical Iterative Tile Refinement Network for Real-time Stereo Matching. CVPR 2021
DN-CSS_ROB | two views | 0.43
6
0.26
14
0.62
33
0.40
14
0.29
26
0.44
3
0.27
1
0.77
4
0.75
9
0.72
11
1.09
39
0.48
2
0.43
5
0.69
15
0.37
4
0.15
3
0.12
5
0.21
19
0.19
20
0.18
12
0.16
11
BEATNet_4x | two views | 0.46
7
0.32
28
0.56
29
0.37
5
0.19
3
0.63
19
0.35
14
1.01
20
0.72
7
0.77
15
0.83
4
0.70
9
0.59
19
0.68
12
0.53
23
0.16
9
0.12
5
0.16
12
0.14
11
0.24
27
0.16
11
MLCV | two views | 0.47
8
0.24
10
0.61
32
0.37
5
0.17
1
0.61
11
0.33
10
0.80
6
0.87
14
0.94
39
1.13
46
0.72
10
0.43
5
0.89
37
0.51
18
0.13
1
0.10
1
0.13
2
0.11
4
0.14
2
0.12
2
CFNet_RVC | two views | 0.48
9
0.23
8
0.36
3
0.40
14
0.26
18
0.64
21
0.37
16
0.87
9
0.85
12
0.64
3
1.00
25
1.01
37
0.64
26
0.61
5
0.53
23
0.23
32
0.15
21
0.24
31
0.21
24
0.23
24
0.18
18
ccs | two views | 0.49
10
0.23
8
0.42
11
0.35
3
0.25
14
0.44
3
0.34
12
1.28
69
1.21
64
0.60
2
0.82
2
0.75
12
0.55
11
0.56
3
0.47
13
0.26
43
0.17
29
0.26
37
0.24
41
0.26
34
0.26
43
iResNet | two views | 0.49
10
0.29
21
0.79
62
0.42
20
0.21
5
0.61
11
0.40
22
0.82
7
0.99
36
0.87
26
1.12
44
0.69
8
0.53
10
0.78
24
0.51
18
0.15
3
0.11
3
0.14
5
0.12
6
0.17
9
0.15
7
AdaStereo | two views | 0.50
12
0.32
28
0.47
18
0.48
41
0.25
14
0.67
26
0.32
6
1.17
49
0.88
18
0.75
14
0.99
24
0.75
12
0.49
8
0.67
11
0.46
11
0.27
48
0.12
5
0.28
49
0.18
19
0.23
24
0.18
18
Xiao Song, Guorun Yang, Xinge Zhu, Hui Zhou, Zhe Wang, Jianping Shi: AdaStereo: A Simple and Efficient Approach for Adaptive Stereo Matching. IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2021.
ccs_rob | two views | 0.50
12
0.24
10
0.40
7
0.36
4
0.22
8
0.76
35
0.33
10
1.01
20
0.96
33
0.81
18
1.05
32
0.94
29
0.57
14
0.70
16
0.44
9
0.20
25
0.15
21
0.22
20
0.23
34
0.18
12
0.17
15
HSM | two views | 0.51
14
0.24
10
0.41
8
0.39
11
0.24
13
0.75
34
0.40
22
1.22
59
0.75
9
0.81
18
0.91
16
1.21
53
0.69
34
0.87
32
0.46
11
0.17
13
0.13
13
0.17
15
0.14
11
0.17
9
0.17
15
CFNet | two views | 0.51
14
0.29
21
0.46
16
0.41
17
0.27
21
0.70
29
0.32
6
1.13
41
1.00
39
0.72
11
1.07
36
0.85
21
0.60
20
0.76
20
0.52
21
0.19
20
0.14
18
0.23
26
0.23
34
0.21
19
0.16
11
iResNetv2_ROB | two views | 0.52
16
0.27
17
0.69
48
0.38
9
0.25
14
0.62
15
0.35
14
1.17
49
1.01
41
0.99
45
1.06
35
0.88
27
0.67
30
0.66
8
0.39
5
0.19
20
0.13
13
0.15
9
0.11
4
0.18
12
0.16
11
iResNet_ROB | two views | 0.52
16
0.21
4
0.53
24
0.37
5
0.21
5
0.61
11
0.29
4
1.51
90
1.32
76
0.90
32
1.02
27
0.80
16
0.62
23
0.76
20
0.43
7
0.16
9
0.12
5
0.13
2
0.09
1
0.15
5
0.18
18
DMCA | two views | 0.53
18
0.30
24
0.50
21
0.51
53
0.34
44
0.55
7
0.41
26
0.85
8
0.66
5
0.89
29
0.96
20
0.95
30
0.82
50
0.63
6
0.58
25
0.26
43
0.22
45
0.34
71
0.27
54
0.27
41
0.20
26
NLCA_NET_v2_RVC | two views | 0.53
18
0.32
28
0.64
36
0.44
28
0.30
28
0.66
24
0.44
32
1.04
30
0.88
18
0.91
36
0.91
16
0.99
36
0.60
20
0.68
12
0.60
27
0.21
27
0.17
29
0.22
20
0.22
29
0.21
19
0.21
28
Zhibo Rao, Mingyi He, Yuchao Dai, Zhidong Zhu, Bo Li, and Renjie He: NLCA-Net: A non-local context attention network for stereo matching.
CC-Net-ROB | two views | 0.53
18
0.32
28
0.64
36
0.44
28
0.30
28
0.65
22
0.44
32
1.02
25
0.88
18
0.92
37
0.92
18
0.98
34
0.61
22
0.68
12
0.61
28
0.21
27
0.17
29
0.22
20
0.21
24
0.21
19
0.21
28
HSM-Net_RVC (permissive) | two views | 0.53
18
0.22
6
0.35
2
0.37
5
0.21
5
0.85
45
0.37
16
1.23
62
0.93
26
0.90
32
1.15
48
1.26
57
0.63
24
0.72
18
0.48
15
0.17
13
0.13
13
0.16
12
0.15
14
0.17
9
0.15
7
Gengshan Yang, Joshua Manela, Michael Happold, and Deva Ramanan: Hierarchical Deep Stereo Matching on High-resolution Images. CVPR 2019
RASNet | two views | 0.53
18
0.22
6
0.52
23
0.38
9
0.22
8
0.68
28
0.38
19
0.97
14
0.88
18
0.73
13
1.04
30
1.03
39
0.92
61
0.90
38
0.86
48
0.16
9
0.14
18
0.14
5
0.17
18
0.15
5
0.13
4
DeepPruner_ROB | two views | 0.53
18
0.34
41
0.59
30
0.41
17
0.30
28
0.50
6
0.43
30
1.19
52
0.70
6
0.85
23
1.05
32
0.86
22
0.55
11
0.84
29
0.51
18
0.29
55
0.23
52
0.22
20
0.22
29
0.26
34
0.25
39
DMCA-RVC (copyleft) | two views | 0.54
24
0.30
24
0.59
30
0.46
37
0.29
26
0.62
15
0.41
26
0.95
11
0.72
7
0.88
27
0.98
23
1.06
42
0.66
28
0.66
8
0.67
34
0.31
62
0.23
52
0.32
64
0.28
57
0.25
30
0.20
26
STTStereo | two views | 0.55
25
0.37
48
0.76
57
0.44
28
0.31
36
0.60
10
0.44
32
0.96
12
0.86
13
0.99
45
0.90
14
0.96
31
0.58
16
0.71
17
0.61
28
0.25
37
0.23
52
0.26
37
0.32
73
0.24
27
0.23
35
FADNet_RVC | two views | 0.55
25
0.37
48
1.17
81
0.44
28
0.31
36
0.56
8
0.34
12
1.06
33
0.88
18
0.67
6
0.90
14
0.61
6
0.65
27
0.93
41
0.70
38
0.21
27
0.18
34
0.26
37
0.23
34
0.31
52
0.28
48
FADNet-RVC-Resample | two views | 0.56
27
0.33
37
1.16
80
0.49
44
0.34
44
0.71
30
0.43
30
1.01
20
0.93
26
0.82
21
0.87
9
0.74
11
0.67
30
0.73
19
0.45
10
0.18
17
0.18
34
0.26
37
0.27
54
0.26
34
0.27
46
TDLM | two views | 0.57
28
0.33
37
0.49
20
0.51
53
0.30
28
0.66
24
0.73
72
1.03
27
1.01
41
0.82
21
1.10
41
0.84
20
0.70
38
0.86
31
0.58
25
0.28
51
0.17
29
0.25
33
0.22
29
0.26
34
0.19
23
CVANet_RVC | two views | 0.57
28
0.33
37
0.44
14
0.47
40
0.32
40
0.67
26
0.61
56
1.05
31
0.93
26
0.85
23
1.04
30
1.01
37
0.73
41
0.83
28
0.52
21
0.29
55
0.19
38
0.27
45
0.25
45
0.31
52
0.21
28
AANet_RVC | two views | 0.57
28
0.32
28
0.54
25
0.45
35
0.27
21
0.57
9
0.48
37
1.00
17
1.28
73
0.86
25
1.33
63
0.87
23
0.58
16
1.07
54
0.75
40
0.18
17
0.12
5
0.15
9
0.13
9
0.18
12
0.18
18
NOSS_ROB | two views | 0.57
28
0.39
57
0.39
5
0.44
28
0.30
28
0.73
32
0.54
48
1.03
27
1.08
53
0.67
6
0.87
9
0.76
15
0.57
14
0.77
22
0.47
13
0.38
90
0.38
94
0.44
91
0.42
92
0.37
72
0.34
68
DLCB_ROB | two views | 0.58
32
0.28
19
0.47
18
0.49
44
0.32
40
0.77
36
0.50
41
0.99
16
0.92
24
1.04
50
1.14
47
1.21
53
0.69
34
0.88
34
0.69
36
0.20
25
0.20
42
0.22
20
0.22
29
0.21
19
0.18
18
StereoDRNet-Refined | two views | 0.58
32
0.32
28
0.54
25
0.46
37
0.30
28
0.96
53
0.40
22
1.00
17
0.94
30
1.12
59
1.08
37
0.96
31
0.67
30
0.90
38
0.89
59
0.17
13
0.13
13
0.22
20
0.21
24
0.20
18
0.19
23
Rohan Chabra, Julian Straub, Chris Sweeney, Richard Newcombe, Henry Fuchs: StereoDRNet. CVPR
CBMV_ROB | two views | 0.59
34
0.37
48
0.38
4
0.39
11
0.28
24
0.74
33
0.28
3
1.00
17
0.98
34
0.94
39
1.25
59
0.81
17
0.78
44
0.87
32
0.50
16
0.36
86
0.37
92
0.43
89
0.40
89
0.33
60
0.30
56
NVstereo2D | two views | 0.60
35
0.30
24
0.75
56
0.48
41
0.43
64
0.89
46
0.48
37
1.05
31
1.15
59
0.66
5
0.86
7
0.88
27
0.70
38
0.99
45
0.65
32
0.32
72
0.15
21
0.28
49
0.20
21
0.40
78
0.36
75
FADNet-RVC | two views | 0.60
35
0.53
87
1.14
79
0.46
37
0.39
50
0.61
11
0.39
20
1.12
40
0.87
14
0.70
8
0.84
5
1.04
40
0.55
11
0.94
44
0.69
36
0.26
43
0.25
60
0.28
49
0.28
57
0.31
52
0.31
58
FADNet | two views | 0.61
37
0.51
81
1.10
77
0.45
35
0.44
68
0.65
22
0.41
26
1.19
52
1.05
45
0.70
8
0.84
5
0.98
34
0.66
28
0.82
25
0.50
16
0.29
55
0.30
74
0.27
45
0.30
63
0.46
91
0.35
73
SGM-Forest | two views | 0.62
38
0.32
28
0.39
5
0.43
25
0.30
28
1.02
61
0.55
49
1.09
36
1.05
45
1.06
53
1.18
53
1.10
47
0.67
30
0.88
34
0.62
30
0.31
62
0.33
87
0.33
69
0.30
63
0.30
48
0.28
48
Johannes L. Schönberger, Sudipta Sinha, Marc Pollefeys: Learning to Fuse Proposals from Multiple Scanline Optimizations in Semi-Global Matching. ECCV 2018
PSMNet_ROB | two views | 0.64
39
0.37
48
0.65
40
0.56
67
0.42
61
0.97
54
0.83
78
1.21
57
0.92
24
0.79
16
1.02
27
1.08
45
0.63
24
0.82
25
0.91
62
0.29
55
0.20
42
0.31
60
0.31
69
0.26
34
0.23
35
StereoDRNet | two views | 0.64
39
0.33
37
0.68
44
0.51
53
0.43
64
1.03
63
0.53
46
1.47
86
0.99
36
1.10
57
0.87
9
1.07
44
0.71
40
0.82
25
0.88
52
0.23
32
0.19
38
0.25
33
0.23
34
0.27
41
0.19
23
CBMV (permissive) | two views | 0.64
39
0.35
42
0.41
8
0.42
20
0.25
14
1.01
59
0.77
76
1.13
41
1.09
54
1.17
63
1.22
56
1.04
40
0.75
42
0.88
34
0.63
31
0.28
51
0.30
74
0.34
71
0.30
63
0.26
34
0.26
43
Konstantinos Batsos, Changjiang Cai, Philippos Mordohai: CBMV: A Coalesced Bidirectional Matching Volume for Disparity Estimation. Computer Vision and Pattern Recognition (CVPR) 2018
DRN-Test | two views | 0.66
42
0.32
28
0.66
42
0.50
48
0.39
50
1.15
72
0.61
56
1.49
87
1.01
41
1.12
59
1.08
37
1.06
42
0.69
34
0.77
22
0.94
63
0.23
32
0.17
29
0.26
37
0.25
45
0.25
30
0.21
28
PA-Net | two views | 0.67
43
0.43
64
0.83
64
0.50
48
0.53
83
0.94
50
0.68
64
1.10
37
1.14
58
0.80
17
0.88
13
0.87
23
0.77
43
1.02
48
0.88
52
0.24
35
0.41
99
0.29
53
0.44
93
0.26
34
0.34
68
Zhibo Rao, Mingyi He, Yuchao Dai, Zhelun Shen: Patch Attention Network with Generative Adversarial Model for Semi-Supervised Binocular Disparity Prediction.
NaN_ROB | two views | 0.69
44
0.44
68
0.69
48
0.51
53
0.33
43
0.97
54
1.06
94
1.22
59
1.19
61
1.24
70
0.96
20
1.08
45
0.78
44
1.20
65
0.67
34
0.24
35
0.30
74
0.23
26
0.24
41
0.23
24
0.25
39
GANet | two views | 0.70
45
0.37
48
0.55
28
0.51
53
0.34
44
0.94
50
0.96
88
1.08
35
0.94
30
1.04
50
1.98
89
1.13
51
0.88
57
1.12
59
0.66
33
0.25
37
0.26
64
0.27
45
0.22
29
0.31
52
0.22
34
DISCO | two views | 0.70
45
0.26
14
0.68
44
0.44
28
0.31
36
1.14
70
0.51
43
1.23
62
1.23
67
0.81
18
1.05
32
1.50
83
0.85
53
1.82
94
1.03
72
0.19
20
0.15
21
0.18
18
0.16
16
0.24
27
0.23
35
ETE_ROB | two views | 0.70
45
0.49
75
0.73
53
0.56
67
0.36
48
0.79
38
0.87
82
1.14
44
0.93
26
1.19
65
1.18
53
1.39
73
0.82
50
0.99
45
0.80
43
0.26
43
0.21
44
0.31
60
0.26
53
0.33
60
0.34
68
XPNet_ROB | two views | 0.70
45
0.35
42
0.68
44
0.52
58
0.40
53
0.79
38
0.85
81
1.16
47
0.94
30
1.33
75
1.15
48
1.38
72
0.83
52
1.03
49
0.90
60
0.32
72
0.25
60
0.26
37
0.24
41
0.31
52
0.28
48
DANet | two views | 0.71
49
0.45
69
0.91
68
0.59
71
0.40
53
0.81
42
0.40
22
0.97
14
0.84
11
1.20
67
1.22
56
1.33
66
0.88
57
1.16
62
1.24
81
0.27
48
0.23
52
0.31
60
0.25
45
0.35
69
0.32
61
NCCL2 | two views | 0.72
50
0.41
61
0.65
40
0.64
82
0.44
68
0.98
57
1.20
97
1.11
38
1.00
39
0.98
44
0.96
20
1.49
82
0.87
56
0.90
38
0.79
42
0.31
62
0.26
64
0.39
85
0.39
87
0.32
57
0.32
61
RYNet | two views | 0.72
50
0.35
42
0.64
36
0.53
61
0.67
99
1.31
80
0.67
61
1.20
55
1.18
60
1.05
52
0.87
9
1.27
58
0.93
64
0.93
41
1.27
84
0.22
31
0.16
26
0.26
37
0.23
34
0.40
78
0.39
84
LALA_ROB | two views | 0.74
52
0.43
64
0.69
48
0.54
64
0.40
53
1.01
59
0.95
87
1.21
57
1.07
51
1.22
68
1.12
44
1.56
87
0.79
46
1.04
50
0.86
48
0.33
76
0.22
45
0.35
74
0.30
63
0.35
69
0.31
58
Anonymous Stereo | two views | 0.75
53
0.54
88
1.78
96
0.57
69
0.56
90
0.62
15
1.25
99
0.96
12
0.99
36
0.96
41
0.95
19
0.83
19
0.58
16
1.27
70
1.21
80
0.31
62
0.30
74
0.31
60
0.33
76
0.33
60
0.36
75
edge stereo | two views | 0.76
54
0.38
54
0.74
55
0.52
58
0.43
64
0.80
40
0.53
46
1.28
69
1.06
50
1.28
71
1.38
67
1.81
90
1.11
74
1.08
55
0.85
47
0.33
76
0.32
83
0.41
88
0.28
57
0.32
57
0.36
75
RP | two views | 0.76
54
0.36
46
0.63
35
0.66
90
0.58
96
0.80
40
0.55
49
1.14
44
1.22
66
0.97
43
1.65
79
1.43
77
1.32
86
1.01
47
0.84
45
0.33
76
0.30
74
0.36
78
0.30
63
0.34
65
0.34
68
Nwc_Net | two views | 0.78
56
0.38
54
0.73
53
0.60
76
0.55
87
1.05
65
0.55
49
1.67
100
1.05
45
0.88
27
1.86
86
1.40
75
1.09
72
1.10
57
0.87
51
0.30
60
0.25
60
0.36
78
0.38
84
0.29
44
0.29
53
Abc-Net | two views | 0.78
56
0.40
59
0.76
57
0.61
78
0.67
99
0.81
42
0.69
66
1.54
93
1.11
55
0.90
32
1.36
64
1.33
66
1.16
76
1.19
63
0.88
52
0.31
62
0.30
74
0.37
81
0.48
97
0.33
60
0.33
65
NCC-stereo | two views | 0.78
56
0.40
59
0.76
57
0.61
78
0.67
99
0.81
42
0.69
66
1.54
93
1.11
55
0.90
32
1.36
64
1.33
66
1.16
76
1.19
63
0.88
52
0.31
62
0.30
74
0.37
81
0.48
97
0.33
60
0.33
65
RGC | two views | 0.79
59
0.59
90
0.71
51
0.65
87
0.65
98
0.92
49
0.58
53
1.29
71
1.05
45
1.07
55
1.45
73
1.47
80
1.03
69
1.21
67
0.88
52
0.32
72
0.30
74
0.48
98
0.40
89
0.37
72
0.37
78
GANetREF_RVC (permissive) | two views | 0.80
60
0.75
100
0.86
67
0.66
90
0.42
61
1.15
72
0.98
89
1.14
44
1.19
61
1.01
48
1.02
27
1.10
47
0.90
59
1.15
61
0.76
41
0.48
100
0.39
96
0.57
105
0.38
84
0.63
104
0.46
95
Zhang, Feihu and Prisacariu, Victor and Yang, Ruigang and Torr, Philip HS: GA-Net: Guided Aggregation Net for End-to-end Stereo Matching. CVPR 2019
ADCRef | two views | 0.80
60
0.43
64
1.32
90
0.50
48
0.45
72
1.07
66
0.51
43
1.07
34
0.98
34
1.17
63
1.42
68
0.87
23
0.91
60
0.85
30
3.19
99
0.18
17
0.16
26
0.23
26
0.23
34
0.22
23
0.21
28
stereogan | two views | 0.80
60
0.31
27
0.71
51
0.59
71
0.50
77
1.72
96
0.48
37
1.35
76
1.37
79
1.00
47
1.46
74
1.56
87
0.95
65
1.29
72
0.84
45
0.28
51
0.29
71
0.29
53
0.25
45
0.38
76
0.38
83
AF-Net | two views | 0.81
63
0.39
57
0.68
44
0.64
82
0.54
85
0.91
47
0.51
43
1.61
97
1.31
75
0.96
41
2.02
90
1.46
78
1.26
85
1.11
58
0.99
69
0.32
72
0.24
56
0.39
85
0.27
54
0.30
48
0.25
39
CSAN | two views | 0.83
64
0.49
75
0.77
61
0.60
76
0.41
58
1.14
70
1.06
94
1.19
52
1.46
87
1.32
73
1.43
71
1.37
71
1.19
79
1.27
70
0.86
48
0.39
92
0.31
82
0.36
78
0.37
81
0.35
69
0.34
68
PWC_ROB (binary) | two views | 0.87
65
0.52
84
1.28
87
0.50
48
0.32
40
0.91
47
0.37
16
1.36
77
1.46
87
1.61
84
2.59
99
1.16
52
1.01
68
1.34
75
1.40
89
0.25
37
0.18
34
0.28
49
0.20
21
0.29
44
0.28
48
RTSC | two views | 0.87
65
0.48
72
1.23
85
0.52
58
0.37
49
1.13
69
0.50
41
1.34
74
1.79
96
1.51
79
1.24
58
1.12
50
0.81
48
1.62
88
2.16
97
0.25
37
0.19
38
0.23
26
0.24
41
0.30
48
0.31
58
FBW_ROB | two views | 0.88
67
0.50
79
0.85
65
0.55
66
0.40
53
1.17
74
0.78
77
1.50
88
1.37
79
1.33
75
1.27
60
1.48
81
1.00
67
2.57
106
0.98
68
0.25
37
0.32
83
0.45
92
0.30
63
0.34
65
0.28
48
XQC | two views | 0.89
68
0.65
97
1.31
89
0.67
93
0.52
82
1.22
75
0.68
64
1.26
68
1.52
89
1.14
61
1.11
42
1.28
59
1.04
70
1.40
81
1.77
93
0.38
90
0.24
56
0.37
81
0.31
69
0.53
100
0.47
97
PWCDC_ROB (binary) | two views | 0.89
68
0.49
75
0.80
63
0.74
102
0.40
53
1.03
63
0.39
20
1.33
73
2.35
105
1.02
49
3.77
105
0.81
17
0.92
61
1.20
65
0.90
60
0.34
82
0.22
45
0.24
31
0.20
21
0.34
65
0.29
53
DeepPrunerF | two views | 0.89
68
0.52
84
3.43
106
0.68
96
0.81
103
0.72
31
0.72
71
1.22
59
1.99
99
0.93
38
1.01
26
0.87
23
0.86
54
1.08
55
0.95
64
0.34
82
0.28
68
0.43
89
0.39
87
0.30
48
0.32
61
PASM | two views | 0.90
71
0.64
96
1.69
94
0.69
98
0.69
102
0.78
37
0.84
79
1.03
27
1.26
72
1.19
65
1.15
48
1.31
62
0.98
66
1.24
68
1.03
72
0.52
104
0.56
107
0.59
107
0.66
106
0.56
103
0.54
104
PDISCO_ROB | two views | 0.91
72
0.48
72
1.06
74
0.99
107
0.97
109
1.67
95
0.63
58
1.64
98
1.59
93
1.06
53
1.18
53
1.31
62
0.79
46
1.55
83
0.97
67
0.49
102
0.22
45
0.45
92
0.40
89
0.45
88
0.37
78
MDST_ROB | two views | 0.91
72
0.27
17
0.85
65
0.63
81
0.41
58
2.53
103
0.71
70
1.60
96
1.13
57
2.83
104
1.73
82
0.97
33
0.69
34
1.61
87
0.73
39
0.25
37
0.22
45
0.32
64
0.28
57
0.25
30
0.24
38
SHD | two views | 0.93
74
0.50
79
1.18
82
0.59
71
0.46
73
0.97
54
0.48
37
1.70
101
2.15
101
1.58
81
1.37
66
1.42
76
1.19
79
1.31
73
1.60
90
0.31
62
0.26
64
0.32
64
0.33
76
0.40
78
0.42
87
STTStereo_v2 | two views | 0.95
75
0.46
70
1.04
72
0.64
82
0.48
74
2.12
101
0.64
59
1.01
20
0.87
14
1.75
90
2.82
101
1.30
60
1.64
97
1.06
51
0.96
65
0.36
86
0.29
71
0.35
74
0.25
45
0.45
88
0.42
87
G-Net | two views | 0.95
75
0.46
70
1.04
72
0.64
82
0.48
74
2.12
101
0.64
59
1.01
20
0.87
14
1.75
90
2.82
101
1.30
60
1.64
97
1.06
51
0.96
65
0.36
86
0.29
71
0.35
74
0.25
45
0.45
88
0.42
87
SGM_RVC (binary) | two views | 0.97
77
0.28
19
0.41
8
0.39
11
0.22
8
1.55
88
0.73
72
1.51
90
1.25
71
2.25
100
1.92
87
2.58
102
1.56
95
2.02
101
1.19
79
0.27
48
0.27
67
0.27
45
0.25
45
0.25
30
0.25
39
Heiko Hirschmueller: Stereo processing by semiglobal matching and mutual information. TPAMI 2008, Volume 30(2), pp. 328-341
DPSNet | two views | 0.99
78
0.42
63
1.88
97
0.53
61
0.41
58
1.59
91
1.26
100
2.21
109
1.43
83
1.10
57
1.11
42
1.46
78
1.95
104
1.34
75
1.26
82
0.34
82
0.25
60
0.25
33
0.21
24
0.46
91
0.37
78
MFN_U_SF_DS_RVC | two views | 1.02
79
0.57
89
1.51
92
0.67
93
0.50
77
2.57
104
1.66
108
1.20
55
1.43
83
1.38
77
1.57
78
1.33
66
0.92
61
1.64
91
0.88
52
0.37
89
0.39
96
0.46
95
0.60
105
0.41
83
0.40
85
ADCPNet | two views | 1.03
80
0.48
72
2.70
101
0.54
64
0.42
61
1.40
81
0.75
74
1.13
41
1.07
51
1.22
68
1.42
68
1.68
89
1.09
72
1.36
78
3.45
100
0.30
60
0.28
68
0.29
53
0.32
73
0.37
72
0.35
73
ADCL | two views | 1.04
81
0.38
54
1.35
91
0.50
48
0.39
50
1.61
92
0.89
83
1.29
71
1.39
82
1.29
72
1.50
75
1.24
56
1.24
82
1.37
79
4.87
106
0.21
27
0.19
38
0.25
33
0.25
45
0.29
44
0.27
46
SuperB | two views | 1.04
81
0.37
48
4.81
108
0.43
25
0.31
36
0.99
58
0.46
35
1.02
25
2.09
100
1.09
56
1.09
39
0.75
12
0.81
48
1.06
51
1.76
92
0.19
20
0.15
21
0.17
15
0.15
14
2.82
114
0.26
43
ADCP+ | two views | 1.06
83
0.35
42
1.51
92
0.49
44
0.51
80
1.45
85
0.55
49
1.25
67
1.03
44
1.16
62
1.15
48
1.10
47
1.07
71
1.56
85
6.75
112
0.19
20
0.16
26
0.23
26
0.21
24
0.28
43
0.21
28
SANet | two views | 1.07
84
0.41
61
1.06
74
0.48
41
0.30
28
1.41
82
0.94
86
1.42
82
3.36
110
1.58
81
2.43
95
2.33
100
1.70
99
1.34
75
1.07
75
0.28
51
0.24
56
0.26
37
0.23
34
0.34
65
0.30
56
SAMSARA | two views | 1.07
84
0.65
97
1.19
83
1.06
110
0.90
105
1.76
97
1.76
109
1.46
85
1.44
85
1.51
79
1.16
52
2.00
97
1.14
75
1.48
82
1.31
86
0.39
92
0.44
103
0.35
74
0.38
84
0.49
95
0.49
101
MFN_U_SF_RVC | two views | 1.09
86
0.59
90
1.97
98
0.67
93
0.44
68
1.93
98
0.60
54
1.85
103
1.36
77
1.61
84
1.54
77
1.95
96
1.37
88
1.64
91
1.73
91
0.40
95
0.37
92
0.45
92
0.48
97
0.44
85
0.44
92
MFMNet_re | two views | 1.14
87
0.97
109
1.19
83
0.99
107
0.95
108
1.25
78
1.22
98
1.39
79
1.55
92
1.69
89
1.67
81
1.21
53
1.53
93
1.12
59
0.99
69
0.87
110
0.75
111
0.79
109
0.75
108
0.90
108
0.94
109
WCMA_ROB | two views | 1.14
87
0.36
46
0.76
57
0.49
44
0.44
68
1.23
76
0.67
61
1.11
38
1.23
67
2.84
105
3.95
109
3.28
106
1.83
102
1.25
69
0.99
69
0.44
97
0.34
89
0.34
71
0.37
81
0.41
83
0.40
85
ADCMid | two views | 1.14
87
0.52
84
2.23
99
0.53
61
0.43
64
1.07
66
0.75
74
1.23
62
1.05
45
2.15
98
1.44
72
1.50
83
1.43
92
1.56
85
5.01
107
0.26
43
0.24
56
0.29
53
0.31
69
0.37
72
0.33
65
AnyNet_C32 | two views | 1.16
90
0.62
95
2.40
100
0.57
69
0.55
87
1.41
82
1.34
101
1.18
51
1.24
69
1.49
78
1.42
68
1.39
73
1.20
81
1.55
83
4.82
105
0.29
55
0.28
68
0.29
53
0.34
78
0.38
76
0.42
87
MSMD_ROB | two views | 1.21
91
0.49
75
0.62
33
0.59
71
0.50
77
1.62
93
0.67
61
1.23
62
1.24
69
2.45
102
3.77
105
3.09
105
2.98
108
1.32
74
0.81
44
0.45
98
0.41
99
0.49
100
0.51
102
0.47
93
0.44
92
pmcnn | two views | 1.23
92
0.29
21
0.96
70
0.40
14
0.23
11
0.95
52
0.69
66
1.16
47
1.28
73
1.67
88
2.33
92
11.15
115
0.86
54
0.93
41
0.88
52
0.15
3
0.12
5
0.12
1
0.10
3
0.15
5
0.15
7
NVStereoNet_ROB | two views | 1.24
93
0.90
104
1.13
78
0.86
104
0.88
104
1.11
68
0.99
90
1.42
82
1.53
90
1.59
83
2.48
96
2.24
99
1.53
93
1.96
100
1.30
85
0.76
109
0.70
109
0.78
108
1.00
113
0.71
106
0.88
107
Nikolai Smolyanskiy, Alexey Kamenev, Stan Birchfield: On the Importance of Stereo for Accurate Depth Estimation: An Efficient Semi-Supervised Deep Neural Network Approach. arXiv
SPS-STEREO (copyleft) | two views | 1.25
94
0.93
105
1.06
74
1.04
109
1.03
111
1.41
82
0.91
85
1.51
90
1.19
61
2.09
97
1.83
84
1.94
95
1.39
89
1.62
88
1.39
88
0.97
111
0.94
112
0.90
110
0.89
110
0.96
110
0.96
110
K. Yamaguchi, D. McAllester, R. Urtasun: Efficient Joint Segmentation, Occlusion Labeling, Stereo and Flow Estimation. ECCV 2014
PVD | two views | 1.28
95
0.67
99
1.28
87
0.74
102
0.62
97
1.50
86
0.84
79
1.99
106
2.96
108
2.18
99
2.36
94
1.85
93
1.95
104
1.92
97
1.94
94
0.39
92
0.36
91
0.49
100
0.37
81
0.51
97
0.64
105
MeshStereo (permissive) | two views | 1.34
96
0.43
64
0.54
25
0.44
28
0.34
44
1.66
94
0.60
54
1.94
104
1.37
79
4.47
109
3.25
103
4.71
109
1.94
103
1.92
97
1.17
78
0.35
85
0.35
90
0.37
81
0.31
69
0.32
57
0.32
61
C. Zhang, Z. Li, Y. Cheng, R. Cai, H. Chao, Y. Rui: MeshStereo: A Global Stereo Model with Mesh Alignment Regularization for View Interpolation. ICCV 2015
MSC_U_SF_DS_RVC | two views | 1.34
96
0.87
103
1.76
95
0.86
104
0.56
90
4.29
111
1.79
110
1.50
88
1.98
98
1.61
84
1.85
85
1.84
91
1.41
90
1.89
96
1.36
87
0.48
100
0.40
98
0.56
104
0.57
104
0.69
105
0.48
98
SGM+DAISY | two views | 1.35
98
0.94
106
1.25
86
0.96
106
1.00
110
1.52
87
1.02
93
1.34
74
1.21
64
2.64
103
2.65
100
2.44
101
1.56
95
1.64
91
1.26
82
0.97
111
0.95
113
0.90
110
0.90
111
0.94
109
0.96
110
ADCS | two views | 1.51
99
0.61
92
3.42
105
0.59
71
0.51
80
1.27
79
1.08
96
1.59
95
1.81
97
1.91
95
1.66
80
1.51
85
1.42
91
1.94
99
8.60
113
0.33
76
0.32
83
0.33
69
0.36
80
0.44
85
0.42
87
AnyNet_C01 | two views | 1.56
100
1.05
110
7.45
109
0.64
82
0.54
85
1.99
99
1.38
102
1.43
84
1.36
77
1.63
87
1.92
87
1.84
91
1.33
87
2.40
104
4.03
101
0.33
76
0.32
83
0.32
64
0.34
78
0.44
85
0.45
94
FC-DCNN (copyleft) | two views | 1.59
101
0.51
81
0.66
42
0.61
78
0.49
76
1.56
89
0.70
69
1.66
99
1.53
90
4.64
110
4.62
110
5.85
111
3.34
109
1.84
95
1.11
77
0.46
99
0.42
101
0.47
96
0.46
94
0.48
94
0.46
95
DispFullNet | two views | 1.60
102
2.63
114
2.75
102
2.56
114
1.79
112
1.24
77
0.47
36
1.37
78
1.45
86
1.87
92
1.50
75
1.55
86
2.46
106
2.04
102
1.04
74
0.59
107
0.18
34
2.32
113
0.68
107
2.29
112
1.14
112
ELAS_RVC (copyleft) | two views | 1.60
102
0.61
92
1.01
71
0.70
99
0.56
90
2.00
100
1.89
111
1.96
105
2.65
107
3.33
106
3.26
104
3.03
104
3.41
110
2.49
105
2.09
96
0.52
104
0.45
104
0.50
102
0.48
97
0.54
101
0.52
102
A. Geiger, M. Roser, R. Urtasun: Efficient large-scale stereo matching. ACCV 2010
ELAS (copyleft) | two views | 1.63
104
0.61
92
0.95
69
0.68
96
0.55
87
3.01
110
1.55
107
2.39
110
2.33
104
3.64
107
3.89
108
2.75
103
2.90
107
2.15
103
2.27
98
0.52
104
0.45
104
0.50
102
0.48
97
0.54
101
0.52
102
A. Geiger, M. Roser, R. Urtasun: Efficient large-scale stereo matching. ACCV 2010
PWCK | two views | 1.92
105
1.66
111
3.12
104
1.65
112
0.91
106
2.84
108
2.26
112
2.10
107
2.41
106
2.36
101
2.31
91
2.22
98
1.82
101
3.43
107
2.03
95
1.61
113
0.71
110
1.51
112
0.82
109
1.69
111
0.91
108
RTS | two views | 1.98
106
0.96
107
11.36
113
0.65
87
0.57
94
2.77
106
1.40
103
1.39
79
2.26
102
1.89
93
2.56
97
1.32
64
1.25
83
5.20
109
4.05
102
0.31
62
0.22
45
0.29
53
0.29
61
0.40
78
0.37
78
RTSA | two views | 1.98
106
0.96
107
11.36
113
0.65
87
0.57
94
2.77
106
1.40
103
1.39
79
2.26
102
1.89
93
2.56
97
1.32
64
1.25
83
5.20
109
4.05
102
0.31
62
0.22
45
0.29
53
0.29
61
0.40
78
0.37
78
MADNet+ | two views | 2.23
108
2.43
112
10.85
112
1.22
111
0.91
106
2.76
105
1.40
103
2.12
108
1.67
95
1.32
73
1.74
83
1.86
94
1.74
100
6.25
111
4.41
104
0.65
108
0.64
108
0.58
106
0.54
103
0.80
107
0.76
106
DPSimNet_ROB | two views | 3.58
109
2.45
113
8.57
110
2.53
113
2.37
113
2.88
109
3.77
114
3.14
111
3.23
109
4.71
111
3.86
107
4.68
108
4.87
112
4.07
108
5.62
108
2.33
114
2.33
114
2.47
114
2.52
114
2.51
113
2.64
113
MADNet++ | two views | 4.03
110
2.70
115
3.44
107
3.72
115
3.44
114
4.44
112
3.52
113
3.95
113
3.61
111
4.46
108
4.98
112
4.10
107
4.58
111
7.00
112
5.89
109
3.74
115
3.67
115
3.27
115
2.71
115
3.70
115
3.66
114
SGM-ForestM | two views | 4.04
111
0.32
28
0.46
16
0.43
25
0.27
21
7.50
114
1.45
106
3.84
112
4.24
112
8.19
115
6.76
113
19.00
116
9.08
113
10.74
116
6.62
111
0.31
62
0.33
87
0.32
64
0.32
73
0.29
44
0.29
53
MANE | two views | 4.13
112
0.51
81
0.64
36
0.66
90
0.56
90
6.89
113
0.99
90
7.55
114
12.04
113
6.68
114
8.57
114
10.69
114
9.45
114
7.59
113
6.39
110
0.49
102
0.43
102
0.48
98
0.98
112
0.49
95
0.48
98
LE_ROB | two views | 4.42
113
0.24
10
2.75
102
0.42
20
0.23
11
1.57
90
0.90
84
1.71
102
17.88
119
16.56
118
4.88
111
5.17
110
18.85
119
1.63
90
14.62
114
0.17
13
0.14
18
0.16
12
0.16
16
0.19
16
0.17
15
LSM | two views | 4.74
114
0.77
101
10.16
111
0.70
99
56.56
121
1.02
61
1.00
92
1.24
66
1.62
94
2.04
96
2.34
93
1.33
66
1.17
78
1.38
80
1.09
76
0.33
76
0.49
106
0.39
85
0.46
94
0.51
97
10.09
115
DGTPSM_ROB | two views | 12.07
115
8.00
116
20.99
115
8.93
116
16.78
115
10.67
115
30.87
118
9.01
115
15.16
115
6.50
112
15.41
117
9.04
112
14.85
115
9.25
114
21.67
116
4.57
116
8.57
116
5.08
116
9.28
116
5.90
116
10.95
116
DPSMNet_ROB | two views | 12.08
116
8.00
116
21.00
116
8.97
117
16.79
116
10.67
115
30.89
119
9.02
116
15.17
116
6.51
113
15.41
117
9.04
112
14.85
115
9.27
115
21.67
116
4.58
117
8.57
116
5.09
117
9.29
117
5.90
116
10.95
116
BEATNet-Init1 | two views | 15.13
117
0.82
102
50.86
122
0.72
101
0.53
83
38.10
121
7.03
115
11.83
117
14.25
114
29.78
120
30.17
120
48.21
122
22.64
120
26.78
119
18.11
115
0.42
96
0.38
94
0.47
96
0.46
94
0.51
97
0.48
98
DPSM | two views | 17.78
118
18.63
118
24.31
117
20.90
118
19.47
117
25.97
117
36.21
120
21.23
118
16.24
117
13.38
116
14.06
115
19.18
117
16.30
117
20.78
117
25.67
118
9.38
118
9.23
118
9.49
118
9.82
118
13.51
118
11.91
118
DPSM_ROB | two views | 17.78
118
18.63
118
24.31
117
20.90
118
19.47
117
25.97
117
36.21
120
21.23
118
16.24
117
13.38
116
14.06
115
19.18
117
16.30
117
20.78
117
25.67
118
9.38
118
9.23
118
9.49
118
9.82
118
13.51
118
11.91
118
MEDIAN_ROB | two views | 37.38
120
40.85
122
40.60
120
32.31
120
31.85
119
27.20
119
24.55
116
29.10
120
33.19
121
42.09
122
41.76
122
33.35
119
34.39
122
39.78
121
37.07
120
43.64
122
44.10
122
44.12
122
43.40
122
41.51
122
42.76
122
AVERAGE_ROB | two views | 37.58
121
40.13
121
40.15
119
34.81
121
33.82
120
28.55
120
25.10
117
32.02
121
34.06
122
41.28
121
40.66
121
34.46
120
34.81
123
39.18
120
37.57
121
42.78
121
43.44
121
43.82
121
43.03
121
40.47
121
41.51
121
LSM0 | two views | 38.95
122
37.60
120
48.58
121
43.57
122
91.06
122
52.57
122
72.57
122
43.17
122
32.63
120
26.97
119
27.97
119
38.65
121
32.94
121
41.86
122
51.78
122
18.89
120
18.41
120
19.13
120
19.63
120
27.29
120
33.77
120
MSMDNet | two views | 0.43
5