This table lists the benchmark results for the low-resolution two-view scenario. The benchmark evaluates the Middlebury stereo metrics.

The mask determines whether a metric is evaluated over all pixels with ground truth, or only over pixels that are visible in both images (non-occluded).
The coverage selector limits the table to results that cover all pixels (dense), or at least a given minimum fraction of pixels.
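To make the mask/coverage interaction concrete, here is a minimal sketch (hypothetical helper names, not the benchmark's actual evaluation code) of a Middlebury-style bad-pixel metric computed under a validity mask, together with the coverage, i.e. the fraction of masked pixels for which a method produced any prediction at all:

```python
import numpy as np

def bad_pixel_error(pred, gt, mask, threshold=2.0):
    """Percent of masked, predicted pixels whose absolute disparity error
    exceeds `threshold`. `mask` is True for pixels with ground truth
    (optionally restricted to pixels visible in both views, i.e. non-occluded)."""
    sel = mask & np.isfinite(pred)          # masked pixels the method predicted
    if not sel.any():
        return 100.0
    return 100.0 * np.mean(np.abs(pred[sel] - gt[sel]) > threshold)

def coverage(pred, mask):
    """Fraction of masked pixels for which the method produced a value."""
    return np.count_nonzero(mask & np.isfinite(pred)) / np.count_nonzero(mask)

# Tiny worked example: four ground-truth pixels, one left unpredicted (NaN).
pred = np.array([1.0, 5.0, np.nan, 2.0])
gt   = np.ones(4)
mask = np.ones(4, dtype=bool)
err = bad_pixel_error(pred, gt, mask)   # 1 of 3 predicted pixels is bad -> 33.33
cov = coverage(pred, mask)              # 3 of 4 masked pixels predicted -> 0.75
```

A method that predicts only confident pixels can score a low error at low coverage, which is why the coverage selector is needed to compare results fairly.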

Click one or more dataset result cells or column headers to show visualizations. Most visualizations are only available for training datasets. The visualizations may not work with mobile browsers.




Each cell shows the metric value with the method's per-column rank in parentheses; license tags (binary, permissive, copyleft) from the original listing follow the method name.

| Method | all | deliv. area 1l | deliv. area 1s | deliv. area 2l | deliv. area 2s | deliv. area 3l | deliv. area 3s | electro 1l | electro 1s | electro 2l | electro 2s | electro 3l | electro 3s | facade 1s | forest 1s | forest 2s | playg. 1l | playg. 1s | playg. 2l | playg. 2s | playg. 3l | playg. 3s | terrace 1s | terrace 2s | terrains 1l | terrains 1s | terrains 2l | terrains 2s |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| DN-CSS_ROB | 1.45 (1) | 0.52 (1) | 1.11 (3) | 0.81 (1) | 0.69 (1) | 0.00 (1) | 0.03 (2) | 2.86 (5) | 0.35 (1) | 1.50 (1) | 3.76 (3) | 0.73 (1) | 0.36 (1) | 0.07 (1) | 2.89 (1) | 1.44 (1) | 1.08 (2) | 0.79 (2) | 8.61 (4) | 0.23 (1) | 3.25 (2) | 1.97 (1) | 0.02 (1) | 0.66 (3) | 0.10 (1) | 0.52 (1) | 3.12 (1) | 1.56 (1) |
| SGM-Forest | 2.81 (2) | 2.03 (4) | 0.36 (1) | 6.32 (7) | 1.79 (4) | 2.66 (6) | 0.02 (1) | 1.42 (1) | 0.77 (2) | 2.52 (3) | 0.96 (1) | 6.57 (3) | 5.74 (4) | 0.31 (2) | 4.36 (2) | 2.94 (4) | 0.61 (1) | 0.33 (1) | 3.08 (1) | 0.47 (2) | 2.46 (1) | 2.73 (3) | 0.84 (6) | 0.45 (1) | 3.66 (5) | 3.74 (5) | 11.30 (5) | 7.44 (4) |
| PWCDC_ROB (binary) | 3.18 (3) | 0.72 (2) | 0.67 (2) | 1.25 (4) | 1.09 (2) | 0.74 (2) | 0.72 (4) | 1.64 (2) | 1.16 (3) | 1.71 (2) | 4.97 (4) | 8.15 (4) | 6.28 (5) | 1.00 (9) | 7.57 (7) | 6.04 (9) | 4.52 (7) | 1.44 (4) | 8.53 (3) | 2.41 (6) | 4.97 (5) | 6.26 (7) | 0.69 (5) | 0.59 (2) | 0.55 (2) | 3.17 (3) | 4.05 (2) | 5.07 (2) |
| LaLa_ROB | 4.04 (4) | 4.73 (7) | 1.42 (5) | 1.16 (3) | 1.93 (5) | 2.17 (5) | 1.04 (5) | 1.80 (3) | 2.48 (5) | 2.74 (4) | 5.87 (5) | 33.34 (9) | 0.69 (2) | 0.78 (7) | 6.02 (4) | 2.33 (3) | 3.21 (3) | 1.82 (5) | 3.66 (2) | 1.88 (4) | 3.40 (3) | 4.99 (5) | 0.34 (2) | 1.03 (4) | 1.91 (3) | 3.61 (4) | 8.50 (4) | 6.20 (3) |
| CBMV (permissive) | 5.52 (5) | 3.65 (6) | 1.32 (4) | 3.46 (5) | 2.86 (6) | 2.04 (4) | 0.41 (3) | 3.64 (6) | 2.63 (6) | 4.09 (7) | 2.36 (2) | 14.13 (5) | 3.70 (3) | 0.57 (4) | 7.16 (6) | 6.19 (10) | 4.26 (6) | 2.36 (6) | 8.76 (5) | 1.44 (3) | 6.43 (6) | 4.53 (4) | 0.54 (4) | 5.62 (8) | 4.85 (6) | 6.08 (7) | 19.41 (7) | 26.62 (9) |
| PWC_ROB (binary) | 7.15 (6) | 1.85 (3) | 5.57 (9) | 0.85 (2) | 1.17 (3) | 1.95 (3) | 4.77 (7) | 2.56 (4) | 17.27 (12) | 3.83 (5) | 7.66 (7) | 4.97 (2) | 39.20 (12) | 0.81 (8) | 8.35 (8) | 4.96 (8) | 6.51 (9) | 4.27 (8) | 15.99 (6) | 4.39 (8) | 7.48 (7) | 7.59 (9) | 0.53 (3) | 4.79 (7) | 2.74 (4) | 2.63 (2) | 7.01 (3) | 23.26 (6) |
| pmcnn | 8.05 (7) | 2.96 (5) | 2.25 (6) | 6.23 (6) | 5.03 (7) | 6.98 (7) | 4.33 (6) | 4.70 (7) | 2.02 (4) | 4.19 (8) | 9.20 (9) | 19.11 (6) | 20.39 (7) | 0.73 (6) | 7.05 (5) | 6.73 (11) | 6.29 (8) | 4.51 (9) | 22.54 (11) | 4.66 (9) | 9.62 (8) | 5.08 (6) | 4.46 (9) | 4.23 (6) | 5.99 (7) | 5.22 (6) | 17.26 (6) | 25.55 (8) |
| SGM_ROB (binary) | 10.08 (8) | 10.50 (9) | 3.60 (7) | 16.68 (8) | 8.80 (8) | 14.87 (8) | 7.11 (8) | 9.19 (8) | 6.25 (7) | 4.06 (6) | 7.33 (6) | 24.89 (7) | 18.29 (6) | 0.57 (4) | 4.57 (3) | 1.98 (2) | 3.30 (4) | 1.17 (3) | 20.75 (9) | 2.20 (5) | 4.80 (4) | 2.23 (2) | 2.29 (8) | 3.65 (5) | 17.68 (10) | 13.31 (9) | 33.30 (12) | 28.67 (11) |
| MeshStereo (permissive) | 13.21 (9) | 10.28 (8) | 5.53 (8) | 24.29 (11) | 14.71 (9) | 17.26 (9) | 7.30 (9) | 12.03 (9) | 7.47 (8) | 4.66 (9) | 9.13 (8) | 38.56 (11) | 29.80 (8) | 0.45 (3) | 9.43 (11) | 7.37 (12) | 3.76 (5) | 2.44 (7) | 18.85 (7) | 2.41 (6) | 12.93 (12) | 7.18 (8) | 1.21 (7) | 5.79 (9) | 22.04 (11) | 16.38 (11) | 33.15 (11) | 32.34 (12) |
| ELAS_ROB (copyleft) | 19.26 (10) | 19.85 (12) | 14.31 (12) | 33.92 (12) | 22.40 (12) | 33.39 (13) | 19.24 (12) | 15.07 (10) | 10.89 (9) | 8.20 (11) | 17.30 (11) | 54.84 (14) | 32.79 (10) | 3.23 (11) | 8.72 (9) | 3.46 (6) | 11.25 (11) | 6.19 (12) | 19.97 (8) | 7.86 (11) | 12.63 (11) | 12.02 (12) | 9.45 (11) | 10.70 (11) | 30.22 (13) | 22.17 (13) | 41.23 (15) | 38.75 (14) |
| ELAS (copyleft) | 19.48 (11) | 22.90 (13) | 14.50 (13) | 35.26 (13) | 24.25 (13) | 31.48 (12) | 19.80 (13) | 16.72 (12) | 15.91 (11) | 7.71 (10) | 15.62 (10) | 48.88 (13) | 30.47 (9) | 3.20 (10) | 9.19 (10) | 3.39 (5) | 10.97 (10) | 5.79 (10) | 21.49 (10) | 6.53 (10) | 11.69 (10) | 11.90 (11) | 9.30 (10) | 9.63 (10) | 32.86 (14) | 24.17 (14) | 41.19 (14) | 41.31 (15) |
| SPS-STEREO (copyleft) | 20.13 (12) | 14.92 (10) | 11.88 (10) | 19.84 (9) | 20.39 (10) | 18.24 (10) | 9.27 (10) | 16.06 (11) | 19.74 (13) | 19.91 (13) | 20.06 (13) | 33.66 (10) | 37.68 (11) | 7.23 (12) | 13.17 (13) | 16.66 (15) | 26.75 (14) | 16.01 (15) | 32.30 (12) | 23.65 (14) | 30.25 (15) | 22.84 (15) | 13.67 (12) | 13.47 (13) | 16.56 (8) | 15.26 (10) | 29.55 (9) | 24.45 (7) |
| SGM+DAISY | 20.91 (13) | 16.10 (11) | 12.43 (11) | 22.81 (10) | 21.80 (11) | 24.45 (11) | 18.63 (11) | 17.89 (13) | 12.77 (10) | 18.42 (12) | 19.55 (12) | 30.39 (8) | 40.48 (13) | 11.67 (13) | 12.56 (12) | 15.10 (14) | 25.73 (13) | 12.69 (13) | 34.93 (13) | 21.56 (13) | 26.12 (14) | 22.15 (14) | 21.16 (13) | 12.73 (12) | 17.44 (9) | 13.30 (8) | 33.35 (13) | 28.26 (10) |
| PSMNet_ROB | 31.36 (14) | 48.75 (14) | 16.32 (14) | 64.48 (14) | 46.97 (15) | 60.77 (14) | 61.30 (15) | 19.02 (14) | 20.70 (14) | 27.57 (15) | 24.90 (14) | 41.43 (12) | 42.18 (14) | 83.15 (15) | 13.95 (14) | 4.06 (7) | 17.90 (12) | 5.94 (11) | 42.51 (14) | 15.37 (12) | 11.21 (9) | 9.54 (10) | 57.95 (15) | 14.27 (14) | 29.64 (12) | 19.74 (12) | 29.38 (8) | 17.81 (5) |
| PWCK | 38.25 (15) | 63.47 (15) | 23.98 (15) | 73.60 (15) | 41.23 (14) | 60.98 (15) | 51.31 (14) | 54.00 (15) | 35.86 (15) | 26.51 (14) | 39.23 (15) | 62.00 (15) | 46.09 (15) | 26.43 (14) | 14.46 (15) | 9.53 (13) | 30.80 (15) | 12.71 (14) | 83.96 (15) | 28.46 (15) | 23.42 (13) | 13.60 (13) | 47.98 (14) | 30.51 (15) | 34.48 (15) | 27.91 (15) | 31.53 (10) | 38.61 (13) |
| MEDIAN_ROB | 97.56 (16) | 99.08 (16) | 98.39 (16) | 100.00 (16) | 99.99 (16) | 100.00 (16) | 100.00 (16) | 99.49 (16) | 98.31 (16) | 100.00 (16) | 99.95 (16) | 97.97 (16) | 96.05 (16) | 99.97 (16) | 99.41 (16) | 98.43 (16) | 98.53 (16) | 98.66 (16) | 99.88 (16) | 97.95 (16) | 93.38 (16) | 96.45 (16) | 100.00 (16) | 97.23 (16) | 86.96 (16) | 92.33 (16) | 89.80 (16) | 95.90 (17) |
| AVERAGE_ROB | 98.95 (17) | 100.00 (17) | 99.79 (17) | 100.00 (16) | 100.00 (17) | 100.00 (16) | 100.00 (16) | 100.00 (17) | 100.00 (17) | 100.00 (16) | 100.00 (17) | 100.00 (17) | 100.00 (17) | 100.00 (17) | 99.77 (17) | 99.85 (17) | 100.00 (17) | 100.00 (17) | 100.00 (17) | 100.00 (17) | 98.12 (17) | 99.51 (17) | 100.00 (16) | 100.00 (17) | 93.43 (17) | 93.75 (17) | 94.35 (17) | 93.16 (16) |

References:
- CBMV: Konstantinos Batsos, Changjiang Cai, Philippos Mordohai: CBMV: A Coalesced Bidirectional Matching Volume for Disparity Estimation. CVPR 2018.
- SGM_ROB: Heiko Hirschmueller: Stereo processing by semiglobal matching and mutual information. TPAMI 2008, Volume 30(2), pp. 328-341.
- MeshStereo: C. Zhang, Z. Li, Y. Cheng, R. Cai, H. Chao, Y. Rui: MeshStereo: A Global Stereo Model with Mesh Alignment Regularization for View Interpolation. ICCV 2015.
- ELAS_ROB, ELAS: A. Geiger, M. Roser, R. Urtasun: Efficient large-scale stereo matching. ACCV 2010.
- SPS-STEREO: K. Yamaguchi, D. McAllester, R. Urtasun: Efficient Joint Segmentation, Occlusion Labeling, Stereo and Flow Estimation. ECCV 2014.
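The per-dataset ranks shown alongside each value come from ranking each column independently, with lower error being better. A minimal sketch of such a per-column ranking (hypothetical helper; the benchmark's actual tie handling may differ):

```python
def column_ranks(scores):
    """Rank the values of one result column: 1 = best (lowest error).
    Ties receive distinct ranks in order of appearance (stable sort)."""
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    ranks = [0] * len(scores)
    for rank, i in enumerate(order, start=1):
        ranks[i] = rank
    return ranks

# Example using the first four entries of the "all" column above
# (DN-CSS_ROB, SGM-Forest, PWCDC_ROB, LaLa_ROB):
print(column_ranks([1.45, 2.81, 3.18, 4.04]))  # -> [1, 2, 3, 4]
```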