This table lists the benchmark results for the low-res two-view scenario. The benchmark evaluates the Middlebury stereo metrics.

The mask determines whether the metric is evaluated over all pixels with ground truth, or only over pixels that are visible in both images (non-occluded).
The coverage selector limits the table to results for all pixels (dense) or to results covering at least a given fraction of pixels.
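As a rough illustration of how a masked bad-pixel percentage and a coverage fraction of this kind can be computed offline, here is a minimal sketch. It is not the benchmark's reference implementation; the array names, the 2.0 error threshold, and the use of NaN for missing estimates are assumptions made for the example.

```python
import numpy as np

def bad_pixel_percentage(disp_est, disp_gt, valid_gt, non_occ=None, threshold=2.0):
    """Percentage of evaluated pixels whose absolute disparity error exceeds `threshold`.

    disp_est, disp_gt : float arrays of the same shape (estimated / ground-truth disparity)
    valid_gt          : boolean mask of pixels that have ground truth
    non_occ           : optional boolean mask of pixels visible in both images;
                        pass it to restrict the metric to non-occluded pixels
    Assumes disp_est holds a finite estimate at every evaluated pixel.
    """
    mask = valid_gt if non_occ is None else (valid_gt & non_occ)
    if not mask.any():
        return float("nan")  # nothing to evaluate
    error = np.abs(disp_est[mask] - disp_gt[mask])
    return 100.0 * float(np.mean(error > threshold))

def coverage_fraction(disp_est, valid_gt):
    """Fraction of ground-truth pixels for which a (finite) estimate was produced."""
    return float(np.mean(np.isfinite(disp_est[valid_gt])))
```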

Click one or more dataset result cells or column headers to show visualizations. Most visualizations are only available for training datasets. The visualizations may not work with mobile browsers.




All methods listed below were evaluated in the two-view setting. Each cell shows a method's score on one test dataset, with the method's per-column rank in parentheses (rank 1 is the lowest score in that column). The "all" column is the average over the individual test datasets. Where the original page shows a license tag for a method (permissive, copyleft, or binary), it is given in the License column.

| Method | License | all | lakeside_1l | lakeside_1s | sand_box_1l | sand_box_1s | storage_room_1l | storage_room_1s | storage_room_2l | storage_room_2s | storage_room_2_1l | storage_room_2_1s | storage_room_2_2l | storage_room_2_2s | storage_room_3l | storage_room_3s | tunnel_1l | tunnel_1s | tunnel_2l | tunnel_2s | tunnel_3l | tunnel_3s |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| DN-CSS_ROB | | 8.17 (1) | 2.59 (3) | 14.14 (5) | 6.11 (3) | 3.22 (4) | 6.31 (1) | 3.53 (1) | 22.92 (1) | 20.59 (1) | 16.66 (1) | 30.05 (1) | 9.31 (1) | 7.95 (1) | 13.49 (1) | 4.78 (1) | 0.72 (5) | 0.00 (1) | 0.45 (2) | 0.00 (1) | 0.54 (2) | 0.09 (1) |
| SGM-Forest | | 12.92 (2) | 1.87 (1) | 6.61 (1) | 5.68 (1) | 2.05 (2) | 23.19 (5) | 11.30 (4) | 34.24 (6) | 30.77 (3) | 26.54 (4) | 31.99 (2) | 29.98 (3) | 15.74 (2) | 21.17 (5) | 13.36 (2) | 1.05 (7) | 0.33 (5) | 0.84 (5) | 0.01 (3) | 0.71 (3) | 0.97 (5) |
| CBMV | permissive | 13.69 (3) | 3.63 (4) | 8.02 (3) | 5.93 (2) | 2.69 (3) | 22.57 (4) | 12.44 (5) | 29.81 (3) | 31.57 (4) | 31.20 (6) | 33.79 (4) | 31.04 (4) | 17.41 (5) | 25.28 (6) | 14.11 (3) | 0.71 (4) | 0.60 (9) | 0.60 (4) | 0.11 (4) | 1.11 (5) | 1.13 (6) |
| pmcnn | | 14.59 (4) | 3.68 (5) | 19.78 (10) | 6.83 (5) | 4.51 (5) | 21.27 (3) | 14.67 (8) | 27.37 (2) | 30.54 (2) | 27.09 (5) | 40.19 (5) | 38.71 (7) | 18.53 (6) | 17.70 (3) | 19.27 (4) | 0.84 (6) | 0.09 (3) | 0.00 (1) | 0.00 (1) | 0.48 (1) | 0.31 (2) |
| PWCDC_ROB | binary | 16.82 (5) | 10.19 (11) | 21.71 (11) | 18.36 (12) | 7.59 (11) | 20.29 (2) | 8.69 (3) | 34.26 (7) | 48.58 (12) | 24.48 (3) | 41.66 (7) | 14.84 (2) | 17.24 (4) | 20.73 (4) | 20.78 (5) | 4.90 (10) | 0.37 (7) | 5.72 (11) | 0.50 (7) | 10.59 (12) | 4.97 (11) |
| PSMNet_ROB | | 16.90 (6) | 5.45 (7) | 14.16 (6) | 13.77 (10) | 7.25 (10) | 27.65 (7) | 32.74 (12) | 43.21 (12) | 37.88 (10) | 24.00 (2) | 32.97 (3) | 34.55 (5) | 16.69 (3) | 17.41 (2) | 23.96 (7) | 0.38 (1) | 0.22 (4) | 0.93 (6) | 2.05 (10) | 1.83 (6) | 0.96 (4) |
| PWC_ROB | binary | 18.37 (7) | 10.55 (12) | 25.19 (12) | 10.21 (8) | 6.29 (8) | 23.75 (6) | 4.79 (2) | 35.93 (8) | 48.75 (13) | 32.56 (7) | 44.46 (9) | 34.68 (6) | 24.78 (8) | 30.99 (8) | 25.72 (10) | 1.38 (8) | 0.08 (2) | 1.57 (8) | 0.26 (5) | 2.18 (7) | 3.17 (10) |
| LaLa_ROB | | 19.19 (8) | 7.15 (9) | 15.93 (8) | 12.96 (9) | 5.30 (7) | 28.30 (8) | 23.21 (11) | 42.51 (11) | 36.69 (7) | 33.45 (8) | 40.81 (6) | 50.96 (10) | 24.53 (7) | 27.92 (7) | 25.66 (9) | 0.60 (2) | 0.44 (8) | 2.04 (9) | 1.22 (9) | 2.54 (8) | 1.47 (7) |
| SGM_ROB | binary | 19.52 (9) | 2.12 (2) | 6.79 (2) | 6.59 (4) | 1.33 (1) | 38.20 (12) | 15.74 (9) | 38.13 (9) | 34.07 (6) | 45.71 (11) | 41.91 (8) | 51.41 (11) | 38.10 (10) | 38.77 (12) | 27.80 (12) | 0.61 (3) | 0.36 (6) | 0.51 (3) | 0.33 (6) | 1.03 (4) | 0.90 (3) |
| WCMA_ROB | | 20.02 (10) | 4.30 (6) | 19.72 (9) | 9.47 (7) | 7.00 (9) | 32.71 (9) | 13.99 (7) | 31.97 (4) | 32.48 (5) | 41.51 (10) | 52.00 (10) | 44.09 (8) | 36.14 (9) | 32.43 (10) | 24.29 (8) | 6.19 (11) | 2.79 (11) | 1.09 (7) | 1.10 (8) | 3.95 (10) | 3.13 (9) |
| MeshStereo | permissive | 22.27 (11) | 6.87 (8) | 11.15 (4) | 7.69 (6) | 4.87 (6) | 37.70 (11) | 13.06 (6) | 41.64 (10) | 36.95 (8) | 50.92 (12) | 53.41 (11) | 58.12 (12) | 41.93 (12) | 37.73 (11) | 27.62 (11) | 2.52 (9) | 2.37 (10) | 2.99 (10) | 2.07 (11) | 3.09 (9) | 2.60 (8) |
| MSMD_ROB | | 22.90 (12) | 9.64 (10) | 14.76 (7) | 17.46 (11) | 9.88 (12) | 34.94 (10) | 18.83 (10) | 33.67 (5) | 37.47 (9) | 40.97 (9) | 59.80 (12) | 45.93 (9) | 38.34 (11) | 31.74 (9) | 23.24 (6) | 6.22 (12) | 3.58 (12) | 8.78 (12) | 11.03 (14) | 6.76 (11) | 5.06 (12) |
| ELAS | copyleft | 33.68 (13) | 18.95 (13) | 36.71 (14) | 20.64 (13) | 13.61 (13) | 56.43 (14) | 35.14 (13) | 50.03 (14) | 49.90 (14) | 60.58 (13) | 63.83 (16) | 58.64 (14) | 53.36 (13) | 50.73 (13) | 44.44 (14) | 11.03 (13) | 6.58 (13) | 10.02 (13) | 7.43 (13) | 13.58 (14) | 11.94 (14) |
| ELAS_ROB | copyleft | 33.79 (14) | 19.15 (14) | 35.97 (13) | 21.28 (14) | 13.79 (14) | 52.58 (13) | 36.11 (14) | 48.96 (13) | 55.66 (16) | 61.47 (15) | 62.91 (14) | 58.35 (13) | 53.55 (14) | 53.37 (14) | 42.23 (13) | 11.25 (14) | 6.64 (14) | 10.32 (14) | 7.05 (12) | 13.56 (13) | 11.58 (13) |
| SGM+DAISY | | 54.67 (15) | 56.61 (15) | 62.73 (15) | 40.21 (15) | 53.22 (16) | 61.95 (16) | 48.59 (17) | 55.05 (15) | 44.62 (11) | 61.03 (14) | 63.68 (15) | 60.46 (16) | 55.62 (15) | 60.68 (16) | 55.34 (15) | 56.34 (15) | 48.56 (17) | 50.16 (15) | 49.63 (17) | 52.25 (15) | 56.76 (17) |
| SPS-STEREO | copyleft | 55.62 (16) | 59.14 (16) | 64.16 (16) | 45.05 (16) | 53.84 (17) | 59.88 (15) | 44.00 (16) | 59.53 (16) | 49.94 (15) | 62.33 (16) | 61.09 (13) | 59.80 (15) | 56.82 (16) | 60.06 (15) | 57.85 (16) | 58.41 (16) | 48.34 (16) | 51.07 (16) | 49.21 (16) | 55.41 (16) | 56.48 (16) |
| PWCK | | 65.90 (17) | 88.63 (17) | 79.40 (17) | 80.32 (17) | 29.96 (15) | 63.85 (17) | 39.45 (15) | 72.07 (17) | 71.26 (17) | 78.74 (17) | 68.99 (17) | 77.82 (17) | 61.77 (17) | 82.81 (17) | 62.64 (17) | 80.17 (17) | 39.88 (15) | 81.90 (17) | 47.37 (15) | 73.54 (17) | 37.51 (15) |
| MEDIAN_ROB | | 99.19 (18) | 99.84 (18) | 99.62 (19) | 98.49 (18) | 98.51 (18) | 98.58 (18) | 97.81 (19) | 98.80 (18) | 98.56 (18) | 99.36 (18) | 99.49 (18) | 99.56 (18) | 99.06 (18) | 98.35 (18) | 98.31 (18) | 99.99 (18) | 99.63 (18) | 100.00 (18) | 100.00 (18) | 99.81 (18) | 99.95 (18) |
| AVERAGE_ROB | | 99.80 (19) | 99.99 (19) | 99.40 (18) | 100.00 (19) | 100.00 (19) | 98.99 (19) | 97.72 (18) | 100.00 (19) | 100.00 (19) | 100.00 (19) | 100.00 (19) | 100.00 (19) | 100.00 (19) | 100.00 (19) | 100.00 (19) | 100.00 (19) | 100.00 (19) | 100.00 (18) | 100.00 (18) | 100.00 (19) | 100.00 (19) |

References for published methods:

- CBMV: Konstantinos Batsos, Changjiang Cai, Philippos Mordohai: CBMV: A Coalesced Bidirectional Matching Volume for Disparity Estimation. Computer Vision and Pattern Recognition (CVPR) 2018.
- SGM_ROB: Heiko Hirschmueller: Stereo Processing by Semiglobal Matching and Mutual Information. TPAMI 2008, Volume 30(2), pp. 328-341.
- MeshStereo: C. Zhang, Z. Li, Y. Cheng, R. Cai, H. Chao, Y. Rui: MeshStereo: A Global Stereo Model with Mesh Alignment Regularization for View Interpolation. ICCV 2015.
- ELAS / ELAS_ROB: A. Geiger, M. Roser, R. Urtasun: Efficient Large-Scale Stereo Matching. ACCV 2010.
- SPS-STEREO: K. Yamaguchi, D. McAllester, R. Urtasun: Efficient Joint Segmentation, Occlusion Labeling, Stereo and Flow Estimation. ECCV 2014.