This table lists the benchmark results for the low-res two-view scenario. The benchmark evaluates the Middlebury stereo metrics (for all metrics, smaller is better). In the table below, each cell gives the metric value for one dataset, followed in parentheses by the method's rank within that column.

The mask determines whether the metric is evaluated for all pixels with ground truth, or only for pixels that are visible in both images (non-occluded).
The coverage selector allows limiting the table to results for all pixels (dense) or to results covering a given minimum fraction of pixels.
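
As a rough illustration of how the mask and coverage settings apply when scoring a single two-view result, the sketch below computes a Middlebury-style bad-pixel rate together with the coverage. It is not the official evaluation code: the function name, the use of NaN to mark missing values, and the 2-pixel error threshold are assumptions made for the example.

```python
import numpy as np

def bad_pixel_rate(disp_est, disp_gt, nocc_mask=None, threshold=2.0):
    """Return (bad-pixel percentage, coverage) for one disparity estimate.

    Illustrative sketch only; assumes NaN marks missing values in both
    the estimate and the ground truth, and that errors are scored only
    where an estimate exists.
    """
    has_gt = np.isfinite(disp_gt)          # pixels with ground truth ("all" mask)
    if nocc_mask is not None:              # non-occ mask: visible in both images
        has_gt &= nocc_mask
    has_est = np.isfinite(disp_est)
    scored = has_gt & has_est
    error = np.abs(disp_est[scored] - disp_gt[scored])
    coverage = float(has_est[has_gt].mean())   # fraction of masked pixels that were estimated
    return 100.0 * float(np.mean(error > threshold)), coverage
```

A dense method reports a coverage of 1.0, which corresponds to the dense option of the coverage selector; sparser results fall under the minimum-fraction options.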

Methods with the suffix _ROB may participate in the Robust Vision Challenge.

Click one or more dataset result cells or column headers to show visualizations. Most visualizations are only available for training datasets. The visualizations may not work with mobile browsers.




Method | Setting | all | lakeside 1l | lakeside 1s | sand box 1l | sand box 1s | storage room 1l | storage room 1s | storage room 2l | storage room 2s | storage room 2 1l | storage room 2 1s | storage room 2 2l | storage room 2 2s | storage room 3l | storage room 3s | tunnel 1l | tunnel 1s | tunnel 2l | tunnel 2s | tunnel 3l | tunnel 3s
StereoDRNet-Filtered | two views | 1.77 (1) | 0.81 (7) | 2.22 (2) | 1.21 (9) | 0.76 (3) | 3.26 (5) | 0.71 (3) | 3.82 (8) | 2.05 (2) | 2.29 (6) | 2.05 (11) | 2.98 (15) | 1.44 (4) | 7.15 (10) | 2.27 (11) | 0.30 (1) | 0.23 (5) | 0.42 (13) | 0.33 (7) | 0.45 (9) | 0.64 (25)
iResNet_ROB | two views | 1.89 (2) | 1.02 (14) | 8.60 (10) | 1.39 (20) | 0.98 (14) | 3.72 (16) | 0.66 (2) | 3.84 (9) | 2.65 (15) | 1.88 (2) | 2.05 (11) | 1.98 (3) | 1.27 (3) | 4.52 (8) | 1.67 (3) | 0.30 (1) | 0.20 (2) | 0.28 (2) | 0.18 (1) | 0.31 (1) | 0.40 (4)
DN-CSS_ROB | two views | 1.91 (3) | 1.55 (32) | 11.13 (14) | 1.82 (30) | 0.88 (8) | 3.99 (24) | 0.61 (1) | 4.01 (13) | 1.67 (1) | 1.64 (1) | 1.89 (5) | 1.39 (2) | 1.06 (1) | 3.23 (2) | 1.50 (1) | 0.31 (3) | 0.19 (1) | 0.44 (14) | 0.29 (5) | 0.37 (4) | 0.31 (2)
LALA_ROB | two views | 1.95 (4) | 1.48 (29) | 3.11 (6) | 1.05 (1) | 0.98 (14) | 3.99 (24) | 1.82 (20) | 4.44 (28) | 2.28 (8) | 3.31 (20) | 1.85 (4) | 2.76 (12) | 1.67 (9) | 4.04 (6) | 3.23 (18) | 0.46 (18) | 0.41 (19) | 0.55 (26) | 0.52 (25) | 0.60 (22) | 0.57 (17)
DLCB_ROB | two views | 2.05 (5) | 0.95 (11) | 2.37 (3) | 1.24 (12) | 1.02 (17) | 2.92 (1) | 1.30 (11) | 3.68 (6) | 2.30 (9) | 2.91 (15) | 2.17 (16) | 3.32 (16) | 1.52 (6) | 10.37 (19) | 2.50 (13) | 0.40 (11) | 0.33 (9) | 0.40 (11) | 0.36 (10) | 0.41 (5) | 0.47 (10)
ETE_ROB | two views | 2.07 (6) | 1.40 (26) | 3.01 (5) | 1.09 (2) | 0.92 (11) | 3.94 (22) | 1.53 (15) | 4.25 (19) | 2.23 (4) | 3.31 (20) | 2.06 (13) | 2.35 (6) | 1.52 (6) | 8.74 (13) | 2.07 (7) | 0.44 (15) | 0.36 (13) | 0.51 (23) | 0.48 (22) | 0.54 (15) | 0.62 (22)
DISCO | two views | 2.25 (7) | 0.73 (5) | 10.06 (12) | 1.76 (29) | 1.12 (19) | 3.47 (10) | 1.25 (10) | 3.24 (1) | 2.26 (7) | 2.65 (10) | 1.96 (7) | 4.53 (26) | 1.92 (13) | 4.17 (7) | 3.61 (20) | 0.32 (4) | 0.25 (7) | 0.31 (5) | 0.29 (5) | 0.66 (25) | 0.41 (6)
XPNet_ROB | two views | 2.27 (8) | 1.11 (17) | 4.10 (8) | 1.36 (19) | 0.97 (12) | 3.94 (22) | 1.63 (16) | 4.07 (15) | 2.14 (3) | 2.82 (13) | 2.00 (9) | 2.32 (5) | 1.73 (10) | 10.81 (23) | 3.39 (19) | 0.51 (21) | 0.47 (25) | 0.48 (17) | 0.40 (18) | 0.51 (12) | 0.56 (16)
NOSS_ROB | two views | 2.27 (8) | 0.73 (5) | 3.00 (4) | 1.71 (27) | 1.00 (16) | 3.90 (21) | 0.97 (6) | 4.29 (21) | 2.81 (20) | 2.94 (16) | 2.19 (17) | 3.88 (20) | 1.19 (2) | 11.77 (30) | 1.62 (2) | 0.59 (26) | 0.54 (26) | 0.56 (27) | 0.52 (25) | 0.56 (18) | 0.54 (15)
HSM | two views | 2.35 (10) | 0.81 (7) | 2.21 (1) | 1.14 (5) | 1.55 (25) | 3.44 (7) | 1.03 (7) | 3.99 (12) | 2.25 (6) | 2.29 (6) | 2.11 (14) | 8.04 (33) | 3.05 (22) | 11.60 (27) | 1.71 (4) | 0.33 (5) | 0.22 (4) | 0.28 (2) | 0.23 (4) | 0.32 (2) | 0.35 (3)
iResNet | two views | 2.52 (11) | 0.94 (10) | 21.17 (34) | 1.84 (31) | 0.72 (2) | 3.75 (17) | 0.81 (4) | 4.14 (17) | 2.55 (14) | 2.33 (8) | 2.03 (10) | 1.35 (1) | 1.44 (4) | 3.27 (3) | 2.16 (10) | 0.35 (8) | 0.23 (5) | 0.28 (2) | 0.22 (3) | 0.44 (7) | 0.45 (8)
NCCL2 | two views | 2.53 (12) | 1.35 (22) | 10.90 (13) | 1.15 (6) | 0.97 (12) | 3.53 (13) | 2.56 (28) | 3.65 (5) | 2.23 (4) | 2.76 (12) | 1.80 (2) | 2.41 (7) | 1.66 (8) | 10.35 (18) | 2.07 (7) | 0.44 (15) | 0.40 (18) | 0.57 (28) | 0.53 (28) | 0.59 (19) | 0.65 (26)
PWC_ROB (binary) | two views | 2.69 (13) | 2.01 (37) | 8.36 (9) | 1.33 (15) | 1.90 (34) | 3.23 (3) | 1.18 (9) | 4.33 (25) | 3.25 (25) | 3.11 (17) | 3.41 (25) | 2.71 (9) | 2.18 (15) | 9.15 (15) | 4.48 (22) | 0.56 (24) | 0.31 (8) | 0.61 (30) | 0.36 (10) | 0.69 (27) | 0.74 (28)
StereoDRNet | two views | 2.84 (14) | 1.49 (30) | 14.48 (18) | 1.33 (15) | 1.57 (28) | 3.44 (7) | 2.06 (23) | 3.69 (7) | 2.36 (11) | 3.26 (19) | 1.84 (3) | 2.79 (13) | 1.80 (12) | 10.88 (24) | 3.16 (17) | 0.41 (12) | 0.33 (9) | 0.39 (8) | 0.36 (10) | 0.59 (19) | 0.58 (20)
PDISCO_ROB | two views | 2.86 (15) | 1.53 (31) | 14.27 (17) | 1.98 (34) | 2.19 (37) | 4.52 (33) | 1.35 (12) | 4.65 (34) | 3.19 (24) | 2.09 (5) | 2.16 (15) | 3.53 (18) | 2.45 (18) | 5.32 (9) | 2.12 (9) | 1.30 (37) | 0.78 (33) | 0.72 (32) | 0.59 (30) | 1.24 (36) | 1.28 (33)
MFMNet_re | two views | 2.91 (16) | 1.91 (36) | 18.25 (26) | 3.06 (39) | 1.89 (33) | 3.13 (2) | 1.67 (18) | 3.52 (2) | 2.97 (21) | 3.14 (18) | 2.87 (21) | 2.26 (4) | 2.41 (16) | 2.42 (1) | 2.00 (6) | 1.10 (34) | 1.05 (34) | 0.94 (35) | 0.90 (34) | 1.28 (37) | 1.46 (36)
PSMNet_ROB | two views | 2.95 (17) | 1.38 (24) | 18.46 (27) | 1.19 (7) | 1.20 (21) | 3.80 (19) | 1.65 (17) | 3.54 (3) | 2.35 (10) | 1.98 (3) | 1.71 (1) | 2.75 (10) | 1.79 (11) | 11.43 (26) | 2.78 (15) | 0.42 (13) | 0.36 (13) | 0.50 (21) | 0.60 (31) | 0.59 (19) | 0.49 (12)
PWCDC_ROB (binary) | two views | 2.95 (17) | 3.26 (38) | 14.06 (16) | 1.59 (25) | 1.56 (26) | 4.19 (29) | 0.86 (5) | 4.30 (23) | 4.08 (32) | 2.06 (4) | 7.37 (28) | 2.75 (10) | 2.42 (17) | 3.59 (4) | 2.67 (14) | 1.34 (39) | 0.36 (13) | 0.39 (8) | 0.34 (8) | 1.13 (32) | 0.62 (22)
SGM-Forest | two views | 2.96 (19) | 0.62 (2) | 3.49 (7) | 1.09 (2) | 0.80 (4) | 4.06 (26) | 3.30 (30) | 4.37 (26) | 3.09 (23) | 4.18 (28) | 3.27 (23) | 4.67 (27) | 3.86 (24) | 11.69 (29) | 7.91 (30) | 0.51 (21) | 0.46 (24) | 0.49 (19) | 0.39 (16) | 0.48 (10) | 0.49 (12)
  Reference: Johannes L. Schönberger, Sudipta Sinha, Marc Pollefeys: Learning to Fuse Proposals from Multiple Scanline Optimizations in Semi-Global Matching. ECCV 2018
DRN-Test | two views | 3.03 (20) | 0.99 (13) | 18.18 (25) | 1.35 (18) | 1.60 (29) | 3.59 (14) | 1.48 (13) | 4.13 (16) | 2.45 (12) | 3.47 (23) | 1.92 (6) | 2.58 (8) | 2.53 (19) | 10.88 (24) | 2.80 (16) | 0.39 (10) | 0.36 (13) | 0.44 (14) | 0.41 (19) | 0.55 (16) | 0.57 (17)
FBW_ROB | two views | 3.24 (21) | 1.02 (14) | 9.21 (11) | 1.49 (23) | 1.07 (18) | 3.78 (18) | 1.48 (13) | 4.31 (24) | 3.08 (22) | 2.46 (9) | 2.32 (19) | 4.20 (24) | 2.08 (14) | 15.19 (38) | 7.16 (29) | 0.84 (30) | 0.62 (31) | 1.33 (38) | 0.92 (35) | 0.99 (30) | 1.31 (34)
CBMV (permissive) | two views | 3.32 (22) | 0.96 (12) | 21.16 (33) | 1.32 (14) | 0.83 (5) | 3.81 (20) | 3.91 (36) | 4.47 (29) | 2.53 (13) | 3.34 (22) | 2.24 (18) | 4.45 (25) | 2.58 (20) | 10.07 (17) | 1.96 (5) | 0.45 (17) | 0.43 (21) | 0.48 (17) | 0.38 (15) | 0.52 (14) | 0.53 (14)
  Reference: Konstantinos Batsos, Changjiang Cai, Philippos Mordohai: CBMV: A Coalesced Bidirectional Matching Volume for Disparity Estimation. CVPR 2018
testNet | two views | 3.43 (23) | 1.37 (23) | 12.19 (15) | 1.39 (20) | 1.24 (22) | 4.15 (28) | 2.06 (23) | 4.51 (30) | 3.34 (28) | 2.69 (11) | 1.96 (7) | 3.52 (17) | 5.56 (30) | 10.68 (21) | 9.68 (34) | 0.93 (31) | 0.73 (32) | 0.40 (11) | 0.37 (14) | 1.06 (31) | 0.86 (31)
CBMV_ROB | two views | 3.62 (24) | 0.70 (3) | 21.42 (37) | 1.21 (9) | 0.89 (10) | 4.43 (32) | 1.76 (19) | 4.39 (27) | 2.79 (18) | 5.88 (31) | 3.38 (24) | 4.18 (23) | 5.12 (28) | 10.75 (22) | 2.32 (12) | 0.56 (24) | 0.55 (27) | 0.53 (25) | 0.51 (24) | 0.55 (16) | 0.46 (9)
SPS-STEREO (copyleft) | two views | 3.97 (25) | 1.44 (28) | 20.85 (31) | 1.72 (28) | 1.71 (30) | 3.24 (4) | 2.30 (27) | 3.55 (4) | 2.68 (17) | 4.79 (30) | 3.66 (26) | 5.44 (29) | 2.97 (21) | 9.11 (14) | 9.35 (33) | 1.15 (35) | 1.10 (35) | 1.06 (36) | 1.02 (37) | 1.16 (34) | 1.18 (32)
  Reference: K. Yamaguchi, D. McAllester, R. Urtasun: Efficient Joint Segmentation, Occlusion Labeling, Stereo and Flow Estimation. ECCV 2014
NaN_ROB | two views | 4.06 (26) | 1.13 (18) | 16.64 (21) | 1.12 (4) | 1.25 (23) | 3.50 (12) | 3.05 (29) | 4.29 (21) | 3.26 (26) | 3.64 (24) | 2.56 (20) | 2.97 (14) | 4.79 (27) | 13.33 (34) | 16.91 (38) | 0.37 (9) | 0.41 (19) | 0.38 (7) | 0.39 (16) | 0.48 (10) | 0.81 (29)
SGM_ROB (binary) | two views | 4.57 (27) | 0.72 (4) | 17.46 (24) | 1.84 (31) | 0.58 (1) | 4.23 (30) | 3.45 (31) | 4.66 (35) | 3.66 (30) | 7.99 (34) | 8.82 (35) | 7.56 (32) | 8.74 (36) | 12.30 (33) | 6.84 (27) | 0.43 (14) | 0.38 (17) | 0.39 (8) | 0.35 (9) | 0.51 (12) | 0.48 (11)
  Reference: Heiko Hirschmueller: Stereo processing by semiglobal matching and mutual information. TPAMI 2008, Volume 30(2), pp. 328-341
DispFullNet | two views | 4.64 (28) | 3.48 (39) | 23.95 (39) | 3.99 (40) | 3.03 (40) | 3.35 (6) | 1.06 (8) | 4.18 (18) | 2.80 (19) | 6.33 (32) | 2.94 (22) | 4.04 (21) | 7.81 (34) | 3.92 (5) | 5.16 (25) | 2.75 (40) | 1.61 (41) | 3.10 (41) | 2.68 (41) | 3.32 (40) | 3.31 (41)
CSAN | two views | 4.71 (29) | 1.39 (25) | 20.23 (28) | 1.19 (7) | 0.87 (7) | 3.65 (15) | 3.50 (33) | 4.57 (32) | 3.87 (31) | 3.89 (27) | 7.82 (30) | 3.54 (19) | 6.16 (32) | 23.47 (45) | 6.82 (26) | 0.52 (23) | 0.45 (23) | 0.49 (19) | 0.52 (25) | 0.63 (23) | 0.66 (27)
NVStereoNet_ROB | two views | 4.91 (30) | 1.40 (26) | 16.46 (20) | 1.47 (22) | 1.72 (31) | 3.46 (9) | 1.96 (21) | 3.89 (10) | 3.61 (29) | 2.90 (14) | 24.61 (45) | 4.81 (28) | 4.49 (26) | 13.57 (35) | 4.87 (24) | 1.33 (38) | 1.29 (39) | 1.70 (39) | 1.68 (40) | 1.41 (39) | 1.66 (38)
  Reference: Nikolai Smolyanskiy, Alexey Kamenev, Stan Birchfield: On the Importance of Stereo for Accurate Depth Estimation: An Efficient Semi-Supervised Deep Neural Network Approach. arXiv
WCMA_ROB | two views | 5.12 (31) | 0.90 (9) | 21.33 (35) | 1.50 (24) | 1.33 (24) | 4.09 (27) | 3.49 (32) | 4.03 (14) | 4.59 (34) | 6.57 (33) | 10.95 (36) | 12.92 (38) | 8.41 (35) | 9.86 (16) | 8.78 (32) | 0.75 (29) | 0.58 (28) | 0.51 (23) | 0.50 (23) | 0.68 (26) | 0.57 (17)
MeshStereo (permissive) | two views | 5.35 (32) | 1.32 (21) | 21.34 (36) | 1.34 (17) | 1.15 (20) | 4.40 (31) | 3.90 (35) | 5.18 (38) | 4.55 (33) | 10.14 (40) | 11.82 (37) | 11.47 (36) | 5.84 (31) | 13.84 (36) | 7.03 (28) | 0.61 (27) | 0.58 (28) | 0.60 (29) | 0.58 (29) | 0.72 (28) | 0.59 (21)
  Reference: C. Zhang, Z. Li, Y. Cheng, R. Cai, H. Chao, Y. Rui: MeshStereo: A Global Stereo Model with Mesh Alignment Regularization for View Interpolation. ICCV 2015
pmcnn | two views | 5.45 (33) | 1.22 (20) | 17.40 (23) | 1.60 (26) | 2.41 (38) | 3.48 (11) | 2.12 (26) | 4.52 (31) | 3.26 (26) | 3.67 (25) | 13.15 (38) | 22.10 (41) | 3.46 (23) | 11.92 (32) | 17.16 (39) | 0.34 (6) | 0.21 (3) | 0.23 (1) | 0.19 (2) | 0.36 (3) | 0.28 (1)
MDST_ROB | two views | 5.53 (34) | 0.59 (1) | 23.11 (38) | 2.55 (37) | 2.69 (39) | 21.20 (43) | 1.98 (22) | 4.77 (37) | 2.65 (15) | 9.36 (37) | 4.25 (27) | 4.14 (22) | 4.45 (25) | 22.57 (41) | 3.68 (21) | 0.48 (19) | 0.33 (9) | 0.47 (16) | 0.44 (21) | 0.44 (7) | 0.44 (7)
MSMD_ROB | two views | 5.61 (35) | 1.08 (16) | 16.68 (22) | 1.21 (9) | 0.83 (5) | 10.14 (38) | 2.07 (25) | 3.93 (11) | 6.22 (37) | 10.37 (41) | 8.03 (32) | 22.22 (42) | 9.45 (37) | 11.66 (28) | 4.52 (23) | 0.66 (28) | 0.58 (28) | 0.66 (31) | 0.68 (32) | 0.65 (24) | 0.63 (24)
SANet | two views | 5.65 (36) | 1.72 (35) | 20.94 (32) | 1.26 (13) | 0.88 (8) | 7.20 (36) | 3.81 (34) | 4.60 (33) | 11.23 (39) | 4.56 (29) | 8.78 (34) | 7.49 (31) | 6.95 (33) | 11.85 (31) | 18.11 (40) | 0.49 (20) | 0.44 (22) | 0.50 (21) | 0.42 (20) | 0.90 (29) | 0.82 (30)
ELAS_ROB (copyleft) | two views | 5.72 (37) | 1.64 (34) | 20.40 (30) | 2.09 (36) | 2.01 (35) | 4.95 (34) | 11.47 (39) | 4.66 (35) | 6.31 (38) | 8.09 (35) | 8.01 (31) | 8.06 (35) | 11.65 (39) | 8.36 (12) | 10.22 (35) | 0.99 (33) | 1.17 (36) | 0.81 (34) | 0.76 (33) | 1.14 (33) | 1.58 (37)
  Reference: A. Geiger, M. Roser, R. Urtasun: Efficient large-scale stereo matching. ACCV 2010
PWCK | two views | 5.85 (38) | 9.95 (43) | 15.22 (19) | 2.90 (38) | 1.56 (26) | 5.66 (35) | 7.69 (37) | 5.27 (40) | 4.74 (35) | 3.88 (26) | 7.51 (29) | 8.04 (33) | 5.28 (29) | 15.18 (37) | 8.40 (31) | 3.08 (41) | 1.51 (40) | 2.24 (40) | 1.37 (39) | 5.55 (41) | 1.96 (40)
ELAS (copyleft) | two views | 5.99 (39) | 1.62 (33) | 20.29 (29) | 1.96 (33) | 2.01 (35) | 15.09 (42) | 8.38 (38) | 5.23 (39) | 5.54 (36) | 9.02 (36) | 8.61 (33) | 6.74 (30) | 9.75 (38) | 7.48 (11) | 11.40 (36) | 0.96 (32) | 1.18 (37) | 0.72 (32) | 0.93 (36) | 1.30 (38) | 1.67 (39)
  Reference: A. Geiger, M. Roser, R. Urtasun: Efficient large-scale stereo matching. ACCV 2010
SGM+DAISY | two views | 8.36 (40) | 3.82 (40) | 24.83 (40) | 2.02 (35) | 1.75 (32) | 14.80 (41) | 17.22 (40) | 4.27 (20) | 11.79 (40) | 13.88 (42) | 14.38 (39) | 11.74 (37) | 13.83 (40) | 10.56 (20) | 15.12 (37) | 1.21 (36) | 1.19 (38) | 1.14 (37) | 1.13 (38) | 1.17 (35) | 1.37 (35)
DGTPSM_ROB | two views | 15.31 (41) | 8.85 (41) | 28.14 (43) | 9.90 (42) | 18.89 (41) | 12.44 (39) | 33.21 (43) | 13.48 (42) | 17.86 (43) | 9.50 (38) | 22.89 (43) | 13.78 (39) | 19.67 (41) | 21.26 (39) | 23.68 (41) | 5.08 (42) | 8.84 (42) | 5.62 (42) | 11.50 (42) | 6.72 (42) | 14.87 (42)
DPSMNet_ROB | two views | 15.52 (42) | 8.89 (42) | 31.44 (44) | 10.07 (43) | 18.91 (42) | 12.44 (39) | 33.22 (44) | 13.51 (43) | 17.94 (44) | 9.58 (39) | 22.91 (44) | 13.79 (40) | 19.67 (41) | 21.43 (40) | 23.73 (42) | 5.17 (43) | 8.84 (42) | 5.63 (43) | 11.52 (45) | 6.79 (43) | 14.90 (43)
LE_ROB | two views | 18.58 (43) | 1.21 (19) | 48.38 (46) | 8.60 (41) | 23.00 (45) | 7.32 (37) | 19.74 (41) | 9.94 (41) | 38.69 (45) | 43.53 (45) | 21.91 (42) | 28.40 (45) | 40.92 (46) | 23.18 (42) | 54.71 (47) | 0.34 (6) | 0.33 (9) | 0.32 (6) | 0.36 (10) | 0.42 (6) | 0.40 (4)
DPSM | two views | 20.78 (44) | 20.03 (44) | 26.57 (41) | 23.62 (44) | 22.46 (43) | 31.10 (44) | 39.97 (46) | 33.52 (45) | 17.41 (41) | 15.25 (43) | 15.80 (40) | 24.06 (43) | 20.71 (43) | 23.30 (43) | 27.62 (43) | 10.78 (44) | 9.41 (44) | 11.69 (44) | 11.50 (42) | 14.49 (44) | 16.36 (44)
DPSM_ROB | two views | 20.78 (44) | 20.03 (44) | 26.57 (41) | 23.62 (44) | 22.46 (43) | 31.10 (44) | 39.97 (46) | 33.52 (45) | 17.41 (41) | 15.25 (43) | 15.80 (40) | 24.06 (43) | 20.71 (43) | 23.30 (43) | 27.62 (43) | 10.78 (44) | 9.41 (44) | 11.69 (44) | 11.50 (42) | 14.49 (44) | 16.36 (44)
AVERAGE_ROB | two views | 44.46 (46) | 45.91 (46) | 46.59 (45) | 40.59 (47) | 38.66 (46) | 32.99 (46) | 30.52 (42) | 46.19 (47) | 43.26 (46) | 49.59 (46) | 48.97 (46) | 44.62 (46) | 39.89 (45) | 45.04 (46) | 43.94 (45) | 48.74 (46) | 50.69 (46) | 50.80 (46) | 49.79 (46) | 45.54 (46) | 46.97 (46)
MEDIAN_ROB | two views | 47.37 (47) | 49.08 (47) | 49.92 (47) | 40.32 (46) | 39.97 (47) | 34.95 (47) | 33.43 (45) | 49.17 (48) | 46.39 (47) | 53.06 (47) | 52.65 (47) | 47.65 (47) | 42.23 (47) | 48.35 (47) | 47.29 (46) | 52.10 (47) | 54.04 (47) | 54.28 (47) | 53.48 (47) | 48.76 (47) | 50.33 (47)
DPSimNet_ROB | two views | 102.30 (48) | 119.72 (48) | 143.59 (48) | 109.42 (48) | 94.10 (48) | 127.22 (48) | 127.15 (48) | 26.62 (44) | 102.36 (48) | 149.58 (48) | 83.99 (48) | 139.34 (48) | 137.54 (48) | 64.62 (48) | 82.36 (48) | 105.18 (48) | 66.15 (48) | 120.15 (48) | 69.61 (48) | 99.58 (48) | 77.72 (48)