This table lists the benchmark results for the low-res two-view scenario, evaluated with the Middlebury stereo metrics (for all metrics, smaller is better).

The mask determines whether the metric is evaluated for all pixels with ground truth, or only for pixels that are visible in both images (non-occluded).
The coverage selector allows limiting the table to results for all pixels (dense) or for a given minimum fraction of pixels.
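
To make the mask and coverage semantics concrete, here is a minimal NumPy sketch of how a masked bad-pixel style metric and a coverage fraction could be computed. The function names, the error threshold, and the NaN convention for missing values are illustrative assumptions, not the benchmark's official definitions.

```python
import numpy as np

def bad_pixel_rate(disp_est, disp_gt, nonocc_mask=None, threshold=2.0):
    """Fraction of evaluated pixels whose absolute disparity error exceeds `threshold`.

    NaN in `disp_gt` marks pixels without ground truth, NaN in `disp_est` marks
    pixels without an estimate (conventions assumed here for illustration).
    If `nonocc_mask` is given (True = visible in both images), the evaluation is
    restricted to non-occluded pixels; otherwise all ground-truth pixels are used.
    """
    evaluated = np.isfinite(disp_gt) & np.isfinite(disp_est)
    if nonocc_mask is not None:
        evaluated &= nonocc_mask
    if not evaluated.any():
        return float("nan")
    error = np.abs(disp_est - disp_gt)[evaluated]
    return float(np.mean(error > threshold))

def coverage(disp_est, disp_gt):
    """Fraction of ground-truth pixels for which the method produced an estimate."""
    has_gt = np.isfinite(disp_gt)
    if not has_gt.any():
        return float("nan")
    return float(np.count_nonzero(np.isfinite(disp_est) & has_gt) / np.count_nonzero(has_gt))
```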

Methods with the suffix _ROB may participate in the Robust Vision Challenge.

Click one or more dataset result cells or column headers to show visualizations. Most visualizations are only available for training datasets. The visualizations may not work with mobile browsers.




All methods below were evaluated in the two-view category. Each cell shows the metric value for the corresponding dataset, with the method's rank in parentheses; license tags shown in the original listing (permissive, copyleft, binary) are kept in parentheses after the method name. Publication references from the listing's info column are collected after the table.

| Method | all | lakeside 1l | lakeside 1s | sand box 1l | sand box 1s | storage room 1l | storage room 1s | storage room 2l | storage room 2s | storage room 2 1l | storage room 2 1s | storage room 2 2l | storage room 2 2s | storage room 3l | storage room 3s | tunnel 1l | tunnel 1s | tunnel 2l | tunnel 2s | tunnel 3l | tunnel 3s |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| R-Stereo | 0.39 (1) | 0.18 (1) | 0.42 (11) | 0.42 (16) | 0.26 (15) | 0.40 (1) | 0.32 (5) | 0.72 (1) | 0.34 (1) | 0.89 (19) | 1.28 (49) | 0.57 (2) | 0.35 (1) | 0.42 (1) | 0.32 (1) | 0.15 (3) | 0.12 (4) | 0.14 (4) | 0.12 (5) | 0.14 (1) | 0.13 (3) |
| R-Stereo Train | 0.39 (1) | 0.18 (1) | 0.42 (11) | 0.42 (16) | 0.26 (15) | 0.40 (1) | 0.32 (5) | 0.72 (1) | 0.34 (1) | 0.89 (19) | 1.28 (49) | 0.57 (2) | 0.35 (1) | 0.42 (1) | 0.32 (1) | 0.15 (3) | 0.12 (4) | 0.14 (4) | 0.12 (5) | 0.14 (1) | 0.13 (3) |
| HITNet | 0.41 (3) | 0.26 (11) | 0.45 (14) | 0.33 (1) | 0.17 (1) | 0.63 (11) | 0.30 (4) | 0.93 (8) | 0.64 (3) | 0.71 (4) | 0.82 (1) | 0.64 (4) | 0.52 (7) | 0.59 (3) | 0.43 (6) | 0.14 (2) | 0.10 (1) | 0.15 (7) | 0.13 (8) | 0.19 (13) | 0.12 (1) |
| DN-CSS_ROB | 0.43 (4) | 0.26 (11) | 0.62 (27) | 0.40 (11) | 0.29 (22) | 0.44 (3) | 0.27 (1) | 0.77 (3) | 0.75 (5) | 0.72 (5) | 1.09 (28) | 0.48 (1) | 0.43 (3) | 0.69 (9) | 0.37 (3) | 0.15 (3) | 0.12 (4) | 0.21 (14) | 0.19 (15) | 0.18 (9) | 0.16 (8) |
| MLCV | 0.47 (5) | 0.24 (6) | 0.61 (26) | 0.37 (4) | 0.17 (1) | 0.61 (6) | 0.33 (9) | 0.80 (4) | 0.87 (9) | 0.94 (26) | 1.13 (34) | 0.72 (6) | 0.43 (3) | 0.89 (29) | 0.51 (14) | 0.13 (1) | 0.10 (1) | 0.13 (2) | 0.11 (3) | 0.14 (1) | 0.12 (1) |
| ccs | 0.48 (6) | 0.24 (6) | 0.39 (4) | 0.36 (2) | 0.22 (6) | 0.69 (18) | 0.37 (12) | 0.92 (7) | 0.88 (10) | 0.77 (8) | 1.04 (19) | 0.91 (22) | 0.57 (10) | 0.71 (11) | 0.41 (5) | 0.19 (15) | 0.16 (21) | 0.22 (15) | 0.22 (24) | 0.19 (13) | 0.17 (11) |
| CFNet_RVC | 0.48 (6) | 0.23 (5) | 0.36 (2) | 0.40 (11) | 0.26 (15) | 0.64 (12) | 0.37 (12) | 0.87 (6) | 0.85 (8) | 0.64 (1) | 1.00 (14) | 1.01 (28) | 0.64 (21) | 0.61 (4) | 0.53 (19) | 0.23 (26) | 0.15 (17) | 0.24 (27) | 0.21 (19) | 0.23 (22) | 0.18 (15) |
| iResNet | 0.49 (8) | 0.29 (18) | 0.79 (48) | 0.42 (16) | 0.21 (3) | 0.61 (6) | 0.40 (17) | 0.82 (5) | 0.99 (25) | 0.87 (18) | 1.12 (32) | 0.69 (5) | 0.53 (8) | 0.78 (17) | 0.51 (14) | 0.15 (3) | 0.11 (3) | 0.14 (4) | 0.12 (5) | 0.17 (6) | 0.15 (5) |
| AdaStereo | 0.50 (9) | 0.32 (23) | 0.47 (17) | 0.48 (31) | 0.25 (12) | 0.67 (16) | 0.32 (5) | 1.17 (36) | 0.88 (10) | 0.75 (7) | 0.99 (13) | 0.75 (7) | 0.49 (6) | 0.67 (6) | 0.46 (9) | 0.27 (38) | 0.12 (4) | 0.28 (40) | 0.18 (14) | 0.23 (22) | 0.18 (15) |
| ccs_rob | 0.50 (9) | 0.24 (6) | 0.40 (7) | 0.36 (2) | 0.22 (6) | 0.76 (24) | 0.33 (9) | 1.01 (15) | 0.96 (22) | 0.81 (11) | 1.05 (21) | 0.94 (23) | 0.57 (10) | 0.70 (10) | 0.44 (8) | 0.20 (20) | 0.15 (17) | 0.22 (15) | 0.23 (31) | 0.18 (9) | 0.17 (11) |
| CFNet | 0.51 (11) | 0.29 (18) | 0.46 (15) | 0.41 (14) | 0.27 (18) | 0.70 (19) | 0.32 (5) | 1.13 (29) | 1.00 (28) | 0.72 (5) | 1.07 (25) | 0.85 (14) | 0.60 (15) | 0.76 (13) | 0.52 (17) | 0.19 (15) | 0.14 (15) | 0.23 (22) | 0.23 (31) | 0.21 (17) | 0.16 (8) |
| HSM | 0.51 (11) | 0.24 (6) | 0.41 (8) | 0.39 (8) | 0.24 (11) | 0.75 (23) | 0.40 (17) | 1.22 (44) | 0.75 (5) | 0.81 (11) | 0.91 (7) | 1.21 (41) | 0.69 (25) | 0.87 (24) | 0.46 (9) | 0.17 (9) | 0.13 (11) | 0.17 (12) | 0.14 (10) | 0.17 (6) | 0.17 (11) |
| iResNetv2_ROB | 0.52 (13) | 0.27 (14) | 0.69 (40) | 0.38 (7) | 0.25 (12) | 0.62 (9) | 0.35 (11) | 1.17 (36) | 1.01 (30) | 0.99 (30) | 1.06 (24) | 0.88 (20) | 0.67 (22) | 0.66 (5) | 0.39 (4) | 0.19 (15) | 0.13 (11) | 0.15 (7) | 0.11 (3) | 0.18 (9) | 0.16 (8) |
| iResNet_ROB | 0.52 (13) | 0.21 (3) | 0.53 (20) | 0.37 (4) | 0.21 (3) | 0.61 (6) | 0.29 (3) | 1.51 (71) | 1.32 (56) | 0.90 (21) | 1.02 (16) | 0.80 (9) | 0.62 (18) | 0.76 (13) | 0.43 (6) | 0.16 (8) | 0.12 (4) | 0.13 (2) | 0.09 (1) | 0.15 (4) | 0.18 (15) |
| NLCA_NET_v2_RVC | 0.53 (15) | 0.32 (23) | 0.64 (29) | 0.44 (23) | 0.30 (23) | 0.66 (14) | 0.44 (22) | 1.04 (20) | 0.88 (10) | 0.91 (23) | 0.91 (7) | 0.99 (27) | 0.60 (15) | 0.68 (7) | 0.60 (21) | 0.21 (22) | 0.17 (25) | 0.22 (15) | 0.22 (24) | 0.21 (17) | 0.21 (23) |
| CC-Net-ROB | 0.53 (15) | 0.32 (23) | 0.64 (29) | 0.44 (23) | 0.30 (23) | 0.65 (13) | 0.44 (22) | 1.02 (16) | 0.88 (10) | 0.92 (24) | 0.92 (9) | 0.98 (26) | 0.61 (17) | 0.68 (7) | 0.61 (22) | 0.21 (22) | 0.17 (25) | 0.22 (15) | 0.21 (19) | 0.21 (17) | 0.21 (23) |
| HSM-Net_RVC (permissive) | 0.53 (15) | 0.22 (4) | 0.35 (1) | 0.37 (4) | 0.21 (3) | 0.85 (30) | 0.37 (12) | 1.23 (47) | 0.93 (16) | 0.90 (21) | 1.15 (36) | 1.26 (45) | 0.63 (19) | 0.72 (12) | 0.48 (12) | 0.17 (9) | 0.13 (11) | 0.16 (10) | 0.15 (11) | 0.17 (6) | 0.15 (5) |
| DeepPruner_ROB | 0.53 (15) | 0.34 (34) | 0.59 (25) | 0.41 (14) | 0.30 (23) | 0.50 (4) | 0.43 (21) | 1.19 (39) | 0.70 (4) | 0.85 (15) | 1.05 (21) | 0.86 (15) | 0.55 (9) | 0.84 (21) | 0.51 (14) | 0.29 (45) | 0.23 (44) | 0.22 (15) | 0.22 (24) | 0.26 (29) | 0.25 (33) |
| NOSS_ROB | 0.57 (19) | 0.39 (45) | 0.39 (4) | 0.44 (23) | 0.30 (23) | 0.73 (21) | 0.54 (34) | 1.03 (17) | 1.08 (38) | 0.67 (3) | 0.87 (3) | 0.76 (8) | 0.57 (10) | 0.77 (15) | 0.47 (11) | 0.38 (69) | 0.38 (74) | 0.44 (73) | 0.42 (75) | 0.37 (57) | 0.34 (53) |
| CVANet_RVC | 0.57 (19) | 0.33 (31) | 0.44 (13) | 0.47 (30) | 0.32 (32) | 0.67 (16) | 0.61 (38) | 1.05 (21) | 0.93 (16) | 0.85 (15) | 1.04 (19) | 1.01 (28) | 0.73 (32) | 0.83 (20) | 0.52 (17) | 0.29 (45) | 0.19 (31) | 0.27 (37) | 0.25 (40) | 0.31 (42) | 0.21 (23) |
| AANet_RVC | 0.57 (19) | 0.32 (23) | 0.54 (21) | 0.45 (28) | 0.27 (18) | 0.57 (5) | 0.48 (25) | 1.00 (12) | 1.28 (54) | 0.86 (17) | 1.33 (51) | 0.87 (16) | 0.58 (13) | 1.07 (39) | 0.75 (30) | 0.18 (13) | 0.12 (4) | 0.15 (7) | 0.13 (8) | 0.18 (9) | 0.18 (15) |
| TDLM | 0.57 (19) | 0.33 (31) | 0.49 (19) | 0.51 (42) | 0.30 (23) | 0.66 (14) | 0.73 (50) | 1.03 (17) | 1.01 (30) | 0.82 (14) | 1.10 (29) | 0.84 (13) | 0.70 (29) | 0.86 (23) | 0.58 (20) | 0.28 (41) | 0.17 (25) | 0.25 (29) | 0.22 (24) | 0.26 (29) | 0.19 (20) |
| StereoDRNet-Refined | 0.58 (23) | 0.32 (23) | 0.54 (21) | 0.46 (29) | 0.30 (23) | 0.96 (36) | 0.40 (17) | 1.00 (12) | 0.94 (19) | 1.12 (41) | 1.08 (26) | 0.96 (24) | 0.67 (22) | 0.90 (30) | 0.89 (41) | 0.17 (9) | 0.13 (11) | 0.22 (15) | 0.21 (19) | 0.20 (16) | 0.19 (20) |
| DLCB_ROB | 0.58 (23) | 0.28 (16) | 0.47 (17) | 0.49 (34) | 0.32 (32) | 0.77 (25) | 0.50 (29) | 0.99 (11) | 0.92 (14) | 1.04 (34) | 1.14 (35) | 1.21 (41) | 0.69 (25) | 0.88 (26) | 0.69 (28) | 0.20 (20) | 0.20 (35) | 0.22 (15) | 0.22 (24) | 0.21 (17) | 0.18 (15) |
| CBMV_ROB | 0.59 (25) | 0.37 (40) | 0.38 (3) | 0.39 (8) | 0.28 (21) | 0.74 (22) | 0.28 (2) | 1.00 (12) | 0.98 (23) | 0.94 (26) | 1.25 (47) | 0.81 (10) | 0.78 (35) | 0.87 (24) | 0.50 (13) | 0.36 (68) | 0.37 (73) | 0.43 (71) | 0.40 (73) | 0.33 (48) | 0.30 (44) |
| NVstereo2D | 0.60 (26) | 0.30 (21) | 0.75 (45) | 0.48 (31) | 0.43 (53) | 0.89 (31) | 0.48 (25) | 1.05 (21) | 1.15 (42) | 0.66 (2) | 0.86 (2) | 0.88 (20) | 0.70 (29) | 0.99 (34) | 0.65 (25) | 0.32 (57) | 0.15 (17) | 0.28 (40) | 0.20 (16) | 0.40 (62) | 0.36 (59) |
| SGM-Forest | 0.62 (27) | 0.32 (23) | 0.39 (4) | 0.43 (21) | 0.30 (23) | 1.02 (43) | 0.55 (35) | 1.09 (25) | 1.05 (34) | 1.06 (37) | 1.18 (41) | 1.10 (35) | 0.67 (22) | 0.88 (26) | 0.62 (23) | 0.31 (50) | 0.33 (68) | 0.33 (59) | 0.30 (50) | 0.30 (39) | 0.28 (38) |
| PSMNet_ROB | 0.64 (28) | 0.37 (40) | 0.65 (33) | 0.56 (54) | 0.42 (50) | 0.97 (37) | 0.83 (56) | 1.21 (42) | 0.92 (14) | 0.79 (9) | 1.02 (16) | 1.08 (33) | 0.63 (19) | 0.82 (18) | 0.91 (44) | 0.29 (45) | 0.20 (35) | 0.31 (51) | 0.31 (54) | 0.26 (29) | 0.23 (30) |
| CBMV (permissive) | 0.64 (28) | 0.35 (35) | 0.41 (8) | 0.42 (16) | 0.25 (12) | 1.01 (41) | 0.77 (54) | 1.13 (29) | 1.09 (39) | 1.17 (45) | 1.22 (44) | 1.04 (30) | 0.75 (33) | 0.88 (26) | 0.63 (24) | 0.28 (41) | 0.30 (60) | 0.34 (61) | 0.30 (50) | 0.26 (29) | 0.26 (36) |
| StereoDRNet | 0.64 (28) | 0.33 (31) | 0.68 (37) | 0.51 (42) | 0.43 (53) | 1.03 (45) | 0.53 (33) | 1.47 (68) | 0.99 (25) | 1.10 (39) | 0.87 (3) | 1.07 (32) | 0.71 (31) | 0.82 (18) | 0.88 (38) | 0.23 (26) | 0.19 (31) | 0.25 (29) | 0.23 (31) | 0.27 (34) | 0.19 (20) |
| DRN-Test | 0.66 (31) | 0.32 (23) | 0.66 (35) | 0.50 (37) | 0.39 (40) | 1.15 (53) | 0.61 (38) | 1.49 (69) | 1.01 (30) | 1.12 (41) | 1.08 (26) | 1.06 (31) | 0.69 (25) | 0.77 (15) | 0.94 (45) | 0.23 (26) | 0.17 (25) | 0.26 (33) | 0.25 (40) | 0.25 (26) | 0.21 (23) |
| PA-Net | 0.67 (32) | 0.43 (49) | 0.83 (50) | 0.50 (37) | 0.53 (66) | 0.94 (33) | 0.68 (44) | 1.10 (26) | 1.14 (41) | 0.80 (10) | 0.88 (6) | 0.87 (16) | 0.77 (34) | 1.02 (36) | 0.88 (38) | 0.24 (29) | 0.41 (76) | 0.29 (44) | 0.44 (76) | 0.26 (29) | 0.34 (53) |
| NaN_ROB | 0.69 (33) | 0.44 (53) | 0.69 (40) | 0.51 (42) | 0.33 (35) | 0.97 (37) | 1.06 (72) | 1.22 (44) | 1.19 (44) | 1.24 (52) | 0.96 (11) | 1.08 (33) | 0.78 (35) | 1.20 (45) | 0.67 (27) | 0.24 (29) | 0.30 (60) | 0.23 (22) | 0.24 (37) | 0.23 (22) | 0.25 (33) |
| ETE_ROB | 0.70 (34) | 0.49 (58) | 0.73 (44) | 0.56 (54) | 0.36 (38) | 0.79 (27) | 0.87 (60) | 1.14 (32) | 0.93 (16) | 1.19 (47) | 1.18 (41) | 1.39 (56) | 0.82 (40) | 0.99 (34) | 0.80 (33) | 0.26 (36) | 0.21 (37) | 0.31 (51) | 0.26 (46) | 0.33 (48) | 0.34 (53) |
| DISCO | 0.70 (34) | 0.26 (11) | 0.68 (37) | 0.44 (23) | 0.31 (31) | 1.14 (51) | 0.51 (31) | 1.23 (47) | 1.23 (48) | 0.81 (11) | 1.05 (21) | 1.50 (62) | 0.85 (42) | 1.82 (71) | 1.03 (51) | 0.19 (15) | 0.15 (17) | 0.18 (13) | 0.16 (12) | 0.24 (25) | 0.23 (30) |
| XPNet_ROB | 0.70 (34) | 0.35 (35) | 0.68 (37) | 0.52 (46) | 0.40 (42) | 0.79 (27) | 0.85 (59) | 1.16 (34) | 0.94 (19) | 1.33 (57) | 1.15 (36) | 1.38 (55) | 0.83 (41) | 1.03 (37) | 0.90 (42) | 0.32 (57) | 0.25 (49) | 0.26 (33) | 0.24 (37) | 0.31 (42) | 0.28 (38) |
| GANet | 0.70 (34) | 0.37 (40) | 0.55 (24) | 0.51 (42) | 0.34 (36) | 0.94 (33) | 0.96 (66) | 1.08 (24) | 0.94 (19) | 1.04 (34) | 1.98 (68) | 1.13 (39) | 0.88 (46) | 1.12 (41) | 0.66 (26) | 0.25 (31) | 0.26 (52) | 0.27 (37) | 0.22 (24) | 0.31 (42) | 0.22 (29) |
| DANet | 0.71 (38) | 0.45 (54) | 0.91 (54) | 0.59 (58) | 0.40 (42) | 0.81 (29) | 0.40 (17) | 0.97 (10) | 0.84 (7) | 1.20 (49) | 1.22 (44) | 1.33 (52) | 0.88 (46) | 1.16 (44) | 1.24 (60) | 0.27 (38) | 0.23 (44) | 0.31 (51) | 0.25 (40) | 0.35 (54) | 0.32 (48) |
| NCCL2 | 0.72 (39) | 0.41 (46) | 0.65 (33) | 0.64 (66) | 0.44 (56) | 0.98 (40) | 1.20 (75) | 1.11 (27) | 1.00 (28) | 0.98 (29) | 0.96 (11) | 1.49 (61) | 0.87 (45) | 0.90 (30) | 0.79 (32) | 0.31 (50) | 0.26 (52) | 0.39 (68) | 0.39 (71) | 0.32 (45) | 0.32 (48) |
| RYNet | 0.72 (39) | 0.35 (35) | 0.64 (29) | 0.53 (48) | 0.67 (78) | 1.31 (61) | 0.67 (41) | 1.20 (41) | 1.18 (43) | 1.05 (36) | 0.87 (3) | 1.27 (46) | 0.93 (51) | 0.93 (32) | 1.27 (63) | 0.22 (25) | 0.16 (21) | 0.26 (33) | 0.23 (31) | 0.40 (62) | 0.39 (66) |
| LALA_ROB | 0.74 (41) | 0.43 (49) | 0.69 (40) | 0.54 (51) | 0.40 (42) | 1.01 (41) | 0.95 (65) | 1.21 (42) | 1.07 (36) | 1.22 (50) | 1.12 (32) | 1.56 (66) | 0.79 (37) | 1.04 (38) | 0.86 (36) | 0.33 (59) | 0.22 (38) | 0.35 (63) | 0.30 (50) | 0.35 (54) | 0.31 (46) |
| Anonymous Stereo | 0.75 (42) | 0.54 (69) | 1.78 (75) | 0.57 (56) | 0.56 (70) | 0.62 (9) | 1.25 (77) | 0.96 (9) | 0.99 (25) | 0.96 (28) | 0.95 (10) | 0.83 (12) | 0.58 (13) | 1.27 (49) | 1.21 (59) | 0.31 (50) | 0.30 (60) | 0.31 (51) | 0.33 (60) | 0.33 (48) | 0.36 (59) |
| stereogan | 0.80 (43) | 0.31 (22) | 0.71 (43) | 0.59 (58) | 0.50 (61) | 1.72 (77) | 0.48 (25) | 1.35 (58) | 1.37 (58) | 1.00 (31) | 1.46 (58) | 1.56 (66) | 0.95 (52) | 1.29 (51) | 0.84 (35) | 0.28 (41) | 0.29 (59) | 0.29 (44) | 0.25 (40) | 0.38 (60) | 0.38 (65) |
| ADCRef | 0.80 (43) | 0.43 (49) | 1.32 (70) | 0.50 (37) | 0.45 (58) | 1.07 (47) | 0.51 (31) | 1.07 (23) | 0.98 (23) | 1.17 (45) | 1.42 (53) | 0.87 (16) | 0.91 (49) | 0.85 (22) | 3.19 (76) | 0.18 (13) | 0.16 (21) | 0.23 (22) | 0.23 (31) | 0.22 (21) | 0.21 (23) |
| GANetREF_RVC (permissive) | 0.80 (43) | 0.75 (79) | 0.86 (53) | 0.66 (70) | 0.42 (50) | 1.15 (53) | 0.98 (67) | 1.14 (32) | 1.19 (44) | 1.01 (32) | 1.02 (16) | 1.10 (35) | 0.90 (48) | 1.15 (43) | 0.76 (31) | 0.48 (78) | 0.39 (75) | 0.57 (82) | 0.38 (68) | 0.63 (83) | 0.46 (74) |
| CSAN | 0.83 (46) | 0.49 (58) | 0.77 (47) | 0.60 (63) | 0.41 (47) | 1.14 (51) | 1.06 (72) | 1.19 (39) | 1.46 (65) | 1.32 (55) | 1.43 (56) | 1.37 (54) | 1.19 (61) | 1.27 (49) | 0.86 (36) | 0.39 (71) | 0.31 (64) | 0.36 (65) | 0.37 (65) | 0.35 (54) | 0.34 (53) |
| PWC_ROB (binary) | 0.87 (47) | 0.52 (66) | 1.28 (67) | 0.50 (37) | 0.32 (32) | 0.91 (32) | 0.37 (12) | 1.36 (59) | 1.46 (65) | 1.61 (65) | 2.59 (79) | 1.16 (40) | 1.01 (55) | 1.34 (54) | 1.40 (67) | 0.25 (31) | 0.18 (29) | 0.28 (40) | 0.20 (16) | 0.29 (36) | 0.28 (38) |
| RTSC | 0.87 (47) | 0.48 (55) | 1.23 (65) | 0.52 (46) | 0.37 (39) | 1.13 (50) | 0.50 (29) | 1.34 (56) | 1.79 (76) | 1.51 (60) | 1.24 (46) | 1.12 (38) | 0.81 (39) | 1.62 (67) | 2.16 (74) | 0.25 (31) | 0.19 (31) | 0.23 (22) | 0.24 (37) | 0.30 (39) | 0.31 (46) |
| FBW_ROB | 0.88 (49) | 0.50 (62) | 0.85 (51) | 0.55 (53) | 0.40 (42) | 1.17 (55) | 0.78 (55) | 1.50 (70) | 1.37 (58) | 1.33 (57) | 1.27 (48) | 1.48 (60) | 1.00 (54) | 2.57 (83) | 0.98 (48) | 0.25 (31) | 0.32 (65) | 0.45 (74) | 0.30 (50) | 0.34 (51) | 0.28 (38) |
| PWCDC_ROB (binary) | 0.89 (50) | 0.49 (58) | 0.80 (49) | 0.74 (78) | 0.40 (42) | 1.03 (45) | 0.39 (16) | 1.33 (55) | 2.35 (83) | 1.02 (33) | 3.77 (83) | 0.81 (10) | 0.92 (50) | 1.20 (45) | 0.90 (42) | 0.34 (63) | 0.22 (38) | 0.24 (27) | 0.20 (16) | 0.34 (51) | 0.29 (42) |
| XQC | 0.89 (50) | 0.65 (76) | 1.31 (69) | 0.67 (72) | 0.52 (65) | 1.22 (56) | 0.68 (44) | 1.26 (53) | 1.52 (68) | 1.14 (43) | 1.11 (30) | 1.28 (47) | 1.04 (56) | 1.40 (60) | 1.77 (70) | 0.38 (69) | 0.24 (46) | 0.37 (66) | 0.31 (54) | 0.53 (79) | 0.47 (76) |
| DeepPrunerF | 0.89 (50) | 0.52 (66) | 3.43 (85) | 0.68 (73) | 0.81 (80) | 0.72 (20) | 0.72 (49) | 1.22 (44) | 1.99 (78) | 0.93 (25) | 1.01 (15) | 0.87 (16) | 0.86 (43) | 1.08 (40) | 0.95 (46) | 0.34 (63) | 0.28 (56) | 0.43 (71) | 0.39 (71) | 0.30 (39) | 0.32 (48) |
| PASM | 0.90 (53) | 0.64 (75) | 1.69 (73) | 0.69 (75) | 0.69 (79) | 0.78 (26) | 0.84 (57) | 1.03 (17) | 1.26 (53) | 1.19 (47) | 1.15 (36) | 1.31 (48) | 0.98 (53) | 1.24 (47) | 1.03 (51) | 0.52 (81) | 0.56 (84) | 0.59 (84) | 0.66 (83) | 0.56 (82) | 0.54 (81) |
| MDST_ROB | 0.91 (54) | 0.27 (14) | 0.85 (51) | 0.63 (65) | 0.41 (47) | 2.53 (83) | 0.71 (48) | 1.60 (75) | 1.13 (40) | 2.83 (83) | 1.73 (63) | 0.97 (25) | 0.69 (25) | 1.61 (66) | 0.73 (29) | 0.25 (31) | 0.22 (38) | 0.32 (55) | 0.28 (47) | 0.25 (26) | 0.24 (32) |
| PDISCO_ROB | 0.91 (54) | 0.48 (55) | 1.06 (58) | 0.99 (84) | 0.97 (86) | 1.67 (76) | 0.63 (40) | 1.64 (76) | 1.59 (73) | 1.06 (37) | 1.18 (41) | 1.31 (48) | 0.79 (37) | 1.55 (62) | 0.97 (47) | 0.49 (79) | 0.22 (38) | 0.45 (74) | 0.40 (73) | 0.45 (71) | 0.37 (61) |
| SHD | 0.93 (56) | 0.50 (62) | 1.18 (62) | 0.59 (58) | 0.46 (59) | 0.97 (37) | 0.48 (25) | 1.70 (78) | 2.15 (79) | 1.58 (62) | 1.37 (52) | 1.42 (58) | 1.19 (61) | 1.31 (52) | 1.60 (69) | 0.31 (50) | 0.26 (52) | 0.32 (55) | 0.33 (60) | 0.40 (62) | 0.42 (69) |
| SGM_RVC (binary) | 0.97 (57) | 0.28 (16) | 0.41 (8) | 0.39 (8) | 0.22 (6) | 1.55 (69) | 0.73 (50) | 1.51 (71) | 1.25 (52) | 2.25 (78) | 1.92 (66) | 2.58 (80) | 1.56 (73) | 2.02 (78) | 1.19 (58) | 0.27 (38) | 0.27 (55) | 0.27 (37) | 0.25 (40) | 0.25 (26) | 0.25 (33) |
| DPSNet | 0.99 (58) | 0.42 (48) | 1.88 (76) | 0.53 (48) | 0.41 (47) | 1.59 (72) | 1.26 (78) | 2.21 (85) | 1.43 (62) | 1.10 (39) | 1.11 (30) | 1.46 (59) | 1.95 (80) | 1.34 (54) | 1.26 (61) | 0.34 (63) | 0.25 (49) | 0.25 (29) | 0.21 (19) | 0.46 (72) | 0.37 (61) |
| ADCPNet | 1.03 (59) | 0.48 (55) | 2.70 (80) | 0.54 (51) | 0.42 (50) | 1.40 (62) | 0.75 (52) | 1.13 (29) | 1.07 (36) | 1.22 (50) | 1.42 (53) | 1.68 (68) | 1.09 (58) | 1.36 (57) | 3.45 (77) | 0.30 (49) | 0.28 (56) | 0.29 (44) | 0.32 (58) | 0.37 (57) | 0.35 (58) |
| ADCL | 1.04 (60) | 0.38 (43) | 1.35 (71) | 0.50 (37) | 0.39 (40) | 1.61 (73) | 0.89 (61) | 1.29 (54) | 1.39 (61) | 1.29 (53) | 1.50 (59) | 1.24 (44) | 1.24 (64) | 1.37 (58) | 4.87 (84) | 0.21 (22) | 0.19 (31) | 0.25 (29) | 0.25 (40) | 0.29 (36) | 0.27 (37) |
| ADCP+ | 1.06 (61) | 0.35 (35) | 1.51 (72) | 0.49 (34) | 0.51 (63) | 1.45 (66) | 0.55 (35) | 1.25 (52) | 1.03 (33) | 1.16 (44) | 1.15 (36) | 1.10 (35) | 1.07 (57) | 1.56 (64) | 6.75 (90) | 0.19 (15) | 0.16 (21) | 0.23 (22) | 0.21 (19) | 0.28 (35) | 0.21 (23) |
| SANet | 1.07 (62) | 0.41 (46) | 1.06 (58) | 0.48 (31) | 0.30 (23) | 1.41 (63) | 0.94 (64) | 1.42 (64) | 3.36 (89) | 1.58 (62) | 2.43 (74) | 2.33 (78) | 1.70 (75) | 1.34 (54) | 1.07 (54) | 0.28 (41) | 0.24 (46) | 0.26 (33) | 0.23 (31) | 0.34 (51) | 0.30 (44) |
| SAMSARA | 1.07 (62) | 0.65 (76) | 1.19 (63) | 1.06 (87) | 0.90 (82) | 1.76 (78) | 1.76 (88) | 1.46 (67) | 1.44 (63) | 1.51 (60) | 1.16 (40) | 2.00 (73) | 1.14 (59) | 1.48 (61) | 1.31 (65) | 0.39 (71) | 0.44 (80) | 0.35 (63) | 0.38 (68) | 0.49 (75) | 0.49 (78) |
| ADCMid | 1.14 (64) | 0.52 (66) | 2.23 (78) | 0.53 (48) | 0.43 (53) | 1.07 (47) | 0.75 (52) | 1.23 (47) | 1.05 (34) | 2.15 (76) | 1.44 (57) | 1.50 (62) | 1.43 (70) | 1.56 (64) | 5.01 (85) | 0.26 (36) | 0.24 (46) | 0.29 (44) | 0.31 (54) | 0.37 (57) | 0.33 (52) |
| MFMNet_re | 1.14 (64) | 0.97 (86) | 1.19 (63) | 0.99 (84) | 0.95 (85) | 1.25 (59) | 1.22 (76) | 1.39 (61) | 1.55 (71) | 1.69 (69) | 1.67 (62) | 1.21 (41) | 1.53 (71) | 1.12 (41) | 0.99 (49) | 0.87 (87) | 0.75 (88) | 0.79 (86) | 0.75 (85) | 0.90 (86) | 0.94 (86) |
| WCMA_ROB | 1.14 (64) | 0.36 (39) | 0.76 (46) | 0.49 (34) | 0.44 (56) | 1.23 (57) | 0.67 (41) | 1.11 (27) | 1.23 (48) | 2.84 (84) | 3.95 (87) | 3.28 (85) | 1.83 (78) | 1.25 (48) | 0.99 (49) | 0.44 (74) | 0.34 (70) | 0.34 (61) | 0.37 (65) | 0.41 (67) | 0.40 (67) |
| AnyNet_C32 | 1.16 (67) | 0.62 (74) | 2.40 (79) | 0.57 (56) | 0.55 (68) | 1.41 (63) | 1.34 (80) | 1.18 (38) | 1.24 (50) | 1.49 (59) | 1.42 (53) | 1.39 (56) | 1.20 (63) | 1.55 (62) | 4.82 (83) | 0.29 (45) | 0.28 (56) | 0.29 (44) | 0.34 (62) | 0.38 (60) | 0.42 (69) |
| MSMD_ROB | 1.21 (68) | 0.49 (58) | 0.62 (27) | 0.59 (58) | 0.50 (61) | 1.62 (74) | 0.67 (41) | 1.23 (47) | 1.24 (50) | 2.45 (80) | 3.77 (83) | 3.09 (84) | 2.98 (86) | 1.32 (53) | 0.81 (34) | 0.45 (76) | 0.41 (76) | 0.49 (78) | 0.51 (81) | 0.47 (73) | 0.44 (72) |
| pmcnn | 1.23 (69) | 0.29 (18) | 0.96 (56) | 0.40 (11) | 0.23 (9) | 0.95 (35) | 0.69 (46) | 1.16 (34) | 1.28 (54) | 1.67 (68) | 2.33 (70) | 11.15 (94) | 0.86 (43) | 0.93 (32) | 0.88 (38) | 0.15 (3) | 0.12 (4) | 0.12 (1) | 0.10 (2) | 0.15 (4) | 0.15 (5) |
| NVStereoNet_ROB | 1.24 (70) | 0.90 (81) | 1.13 (61) | 0.86 (82) | 0.88 (81) | 1.11 (49) | 0.99 (68) | 1.42 (64) | 1.53 (69) | 1.59 (64) | 2.48 (76) | 2.24 (77) | 1.53 (71) | 1.96 (77) | 1.30 (64) | 0.76 (86) | 0.70 (86) | 0.78 (85) | 1.00 (90) | 0.71 (84) | 0.88 (84) |
| SPS-STEREO (copyleft) | 1.25 (71) | 0.93 (82) | 1.06 (58) | 1.04 (86) | 1.03 (88) | 1.41 (63) | 0.91 (63) | 1.51 (71) | 1.19 (44) | 2.09 (75) | 1.83 (65) | 1.94 (72) | 1.39 (68) | 1.62 (67) | 1.39 (66) | 0.97 (88) | 0.94 (89) | 0.90 (87) | 0.89 (87) | 0.96 (88) | 0.96 (87) |
| PVD | 1.28 (72) | 0.67 (78) | 1.28 (67) | 0.74 (78) | 0.62 (76) | 1.50 (67) | 0.84 (57) | 1.99 (82) | 2.96 (86) | 2.18 (77) | 2.36 (72) | 1.85 (70) | 1.95 (80) | 1.92 (73) | 1.94 (71) | 0.39 (71) | 0.36 (72) | 0.49 (78) | 0.37 (65) | 0.51 (77) | 0.64 (82) |
| MeshStereo (permissive) | 1.34 (73) | 0.43 (49) | 0.54 (21) | 0.44 (23) | 0.34 (36) | 1.66 (75) | 0.60 (37) | 1.94 (80) | 1.37 (58) | 4.47 (88) | 3.25 (81) | 4.71 (88) | 1.94 (79) | 1.92 (73) | 1.17 (57) | 0.35 (67) | 0.35 (71) | 0.37 (66) | 0.31 (54) | 0.32 (45) | 0.32 (48) |
| SGM+DAISY | 1.35 (74) | 0.94 (83) | 1.25 (66) | 0.96 (83) | 1.00 (87) | 1.52 (68) | 1.02 (71) | 1.34 (56) | 1.21 (47) | 2.64 (82) | 2.65 (80) | 2.44 (79) | 1.56 (73) | 1.64 (70) | 1.26 (61) | 0.97 (88) | 0.95 (90) | 0.90 (87) | 0.90 (88) | 0.94 (87) | 0.96 (87) |
| Abc-Net | 1.35 (74) | 0.61 (70) | 1.88 (76) | 0.78 (81) | 0.62 (76) | 2.49 (82) | 1.30 (79) | 2.99 (87) | 1.57 (72) | 1.61 (65) | 2.37 (73) | 2.14 (75) | 2.89 (84) | 1.95 (76) | 1.41 (68) | 0.44 (74) | 0.30 (60) | 0.40 (70) | 0.38 (68) | 0.42 (68) | 0.41 (68) |
| ADCS | 1.51 (76) | 0.61 (70) | 3.42 (84) | 0.59 (58) | 0.51 (63) | 1.27 (60) | 1.08 (74) | 1.59 (74) | 1.81 (77) | 1.91 (73) | 1.66 (61) | 1.51 (64) | 1.42 (69) | 1.94 (75) | 8.60 (91) | 0.33 (59) | 0.32 (65) | 0.33 (59) | 0.36 (64) | 0.44 (69) | 0.42 (69) |
| AnyNet_C01 | 1.56 (77) | 1.05 (87) | 7.45 (88) | 0.64 (66) | 0.54 (67) | 1.99 (79) | 1.38 (81) | 1.43 (66) | 1.36 (57) | 1.63 (67) | 1.92 (66) | 1.84 (69) | 1.33 (67) | 2.40 (81) | 4.03 (79) | 0.33 (59) | 0.32 (65) | 0.32 (55) | 0.34 (62) | 0.44 (69) | 0.45 (73) |
| FC-DCNN (copyleft) | 1.59 (78) | 0.51 (64) | 0.66 (35) | 0.61 (64) | 0.49 (60) | 1.56 (70) | 0.70 (47) | 1.66 (77) | 1.53 (69) | 4.64 (89) | 4.62 (88) | 5.85 (90) | 3.34 (87) | 1.84 (72) | 1.11 (56) | 0.46 (77) | 0.42 (78) | 0.47 (76) | 0.46 (77) | 0.48 (74) | 0.46 (74) |
| ELAS_RVC (copyleft) | 1.60 (79) | 0.61 (70) | 1.01 (57) | 0.70 (76) | 0.56 (70) | 2.00 (80) | 1.89 (89) | 1.96 (81) | 2.65 (85) | 3.33 (85) | 3.26 (82) | 3.03 (83) | 3.41 (88) | 2.49 (82) | 2.09 (73) | 0.52 (81) | 0.45 (81) | 0.50 (80) | 0.48 (79) | 0.54 (80) | 0.52 (79) |
| DispFullNet | 1.60 (79) | 2.63 (92) | 2.75 (81) | 2.56 (92) | 1.79 (90) | 1.24 (58) | 0.47 (24) | 1.37 (60) | 1.45 (64) | 1.87 (70) | 1.50 (59) | 1.55 (65) | 2.46 (83) | 2.04 (79) | 1.04 (53) | 0.59 (84) | 0.18 (29) | 2.32 (91) | 0.68 (84) | 2.29 (91) | 1.14 (89) |
| ELAS (copyleft) | 1.63 (81) | 0.61 (70) | 0.95 (55) | 0.68 (73) | 0.55 (68) | 3.01 (89) | 1.55 (86) | 2.39 (86) | 2.33 (82) | 3.64 (86) | 3.89 (86) | 2.75 (81) | 2.90 (85) | 2.15 (80) | 2.27 (75) | 0.52 (81) | 0.45 (81) | 0.50 (80) | 0.48 (79) | 0.54 (80) | 0.52 (79) |
| Nwc_Net | 1.71 (82) | 0.38 (43) | 1.70 (74) | 0.74 (78) | 0.61 (75) | 2.04 (81) | 1.61 (87) | 3.36 (89) | 1.51 (67) | 1.29 (53) | 2.44 (75) | 2.11 (74) | 2.11 (82) | 3.35 (85) | 9.16 (92) | 0.34 (63) | 0.25 (49) | 0.28 (40) | 0.22 (24) | 0.32 (45) | 0.34 (53) |
| PWCK | 1.92 (83) | 1.66 (88) | 3.12 (83) | 1.65 (89) | 0.91 (83) | 2.84 (87) | 2.26 (90) | 2.10 (83) | 2.41 (84) | 2.36 (79) | 2.31 (69) | 2.22 (76) | 1.82 (77) | 3.43 (86) | 2.03 (72) | 1.61 (91) | 0.71 (87) | 1.51 (90) | 0.82 (86) | 1.69 (89) | 0.91 (85) |
| RTS | 1.98 (84) | 0.96 (84) | 11.36 (92) | 0.65 (68) | 0.57 (73) | 2.77 (85) | 1.40 (82) | 1.39 (61) | 2.26 (80) | 1.89 (71) | 2.56 (77) | 1.32 (50) | 1.25 (65) | 5.20 (88) | 4.05 (80) | 0.31 (50) | 0.22 (38) | 0.29 (44) | 0.29 (48) | 0.40 (62) | 0.37 (61) |
| RTSA | 1.98 (84) | 0.96 (84) | 11.36 (92) | 0.65 (68) | 0.57 (73) | 2.77 (85) | 1.40 (82) | 1.39 (61) | 2.26 (80) | 1.89 (71) | 2.56 (77) | 1.32 (50) | 1.25 (65) | 5.20 (88) | 4.05 (80) | 0.31 (50) | 0.22 (38) | 0.29 (44) | 0.29 (48) | 0.40 (62) | 0.37 (61) |
| MADNet+ | 2.23 (86) | 2.43 (90) | 10.85 (91) | 1.22 (88) | 0.91 (83) | 2.76 (84) | 1.40 (82) | 2.12 (84) | 1.67 (75) | 1.32 (55) | 1.74 (64) | 1.86 (71) | 1.74 (76) | 6.25 (90) | 4.41 (82) | 0.65 (85) | 0.64 (85) | 0.58 (83) | 0.54 (82) | 0.80 (85) | 0.76 (83) |
| edge stereo | 3.57 (87) | 1.98 (89) | 5.99 (87) | 1.80 (90) | 1.78 (89) | 9.25 (93) | 10.12 (93) | 5.03 (92) | 3.09 (87) | 2.60 (81) | 4.88 (89) | 3.01 (82) | 5.43 (91) | 3.27 (84) | 3.55 (78) | 1.34 (90) | 1.39 (91) | 1.32 (89) | 1.32 (91) | 2.19 (90) | 2.00 (90) |
| DPSimNet_ROB | 3.58 (88) | 2.45 (91) | 8.57 (89) | 2.53 (91) | 2.37 (91) | 2.88 (88) | 3.77 (92) | 3.14 (88) | 3.23 (88) | 4.71 (90) | 3.86 (85) | 4.68 (87) | 4.87 (90) | 4.07 (87) | 5.62 (86) | 2.33 (92) | 2.33 (92) | 2.47 (92) | 2.52 (92) | 2.51 (92) | 2.64 (91) |
| MADNet++ | 4.03 (89) | 2.70 (93) | 3.44 (86) | 3.72 (93) | 3.44 (92) | 4.44 (90) | 3.52 (91) | 3.95 (91) | 3.61 (90) | 4.46 (87) | 4.98 (91) | 4.10 (86) | 4.58 (89) | 7.00 (91) | 5.89 (87) | 3.74 (93) | 3.67 (93) | 3.27 (93) | 2.71 (93) | 3.70 (93) | 3.66 (92) |
| SGM-ForestM | 4.04 (90) | 0.32 (23) | 0.46 (15) | 0.43 (21) | 0.27 (18) | 7.50 (92) | 1.45 (85) | 3.84 (90) | 4.24 (91) | 8.19 (94) | 6.76 (92) | 19.00 (95) | 9.08 (92) | 10.74 (95) | 6.62 (89) | 0.31 (50) | 0.33 (68) | 0.32 (55) | 0.32 (58) | 0.29 (36) | 0.29 (42) |
| MANE | 4.13 (91) | 0.51 (64) | 0.64 (29) | 0.66 (70) | 0.56 (70) | 6.89 (91) | 0.99 (68) | 7.55 (93) | 12.04 (92) | 6.68 (93) | 8.57 (93) | 10.69 (93) | 9.45 (93) | 7.59 (92) | 6.39 (88) | 0.49 (79) | 0.43 (79) | 0.48 (77) | 0.98 (89) | 0.49 (75) | 0.48 (77) |
| LE_ROB | 4.42 (92) | 0.24 (6) | 2.75 (81) | 0.42 (16) | 0.23 (9) | 1.57 (71) | 0.90 (62) | 1.71 (79) | 17.88 (97) | 16.56 (97) | 4.88 (89) | 5.17 (89) | 18.85 (98) | 1.63 (69) | 14.62 (93) | 0.17 (9) | 0.14 (15) | 0.16 (10) | 0.16 (12) | 0.19 (13) | 0.17 (11) |
| LSM | 4.74 (93) | 0.77 (80) | 10.16 (90) | 0.70 (76) | 56.56 (99) | 1.02 (43) | 1.00 (70) | 1.24 (51) | 1.62 (74) | 2.04 (74) | 2.34 (71) | 1.33 (52) | 1.17 (60) | 1.38 (59) | 1.09 (55) | 0.33 (59) | 0.49 (83) | 0.39 (68) | 0.46 (77) | 0.51 (77) | 10.09 (93) |
| DGTPSM_ROB | 12.07 (94) | 8.00 (94) | 20.99 (94) | 8.93 (94) | 16.78 (93) | 10.67 (94) | 30.87 (96) | 9.01 (94) | 15.16 (93) | 6.50 (91) | 15.41 (96) | 9.04 (91) | 14.85 (94) | 9.25 (93) | 21.67 (94) | 4.57 (94) | 8.57 (94) | 5.08 (94) | 9.28 (94) | 5.90 (94) | 10.95 (94) |
| DPSMNet_ROB | 12.08 (95) | 8.00 (94) | 21.00 (95) | 8.97 (95) | 16.79 (94) | 10.67 (94) | 30.89 (97) | 9.02 (95) | 15.17 (94) | 6.51 (92) | 15.41 (96) | 9.04 (91) | 14.85 (94) | 9.27 (94) | 21.67 (94) | 4.58 (95) | 8.57 (94) | 5.09 (95) | 9.29 (95) | 5.90 (94) | 10.95 (94) |
| DPSM_ROB | 17.78 (96) | 18.63 (96) | 24.31 (96) | 20.90 (96) | 19.47 (95) | 25.97 (96) | 36.21 (98) | 21.23 (96) | 16.24 (95) | 13.38 (95) | 14.06 (94) | 19.18 (96) | 16.30 (96) | 20.78 (96) | 25.67 (96) | 9.38 (96) | 9.23 (96) | 9.49 (96) | 9.82 (96) | 13.51 (96) | 11.91 (96) |
| DPSM | 17.78 (96) | 18.63 (96) | 24.31 (96) | 20.90 (96) | 19.47 (95) | 25.97 (96) | 36.21 (98) | 21.23 (96) | 16.24 (95) | 13.38 (95) | 14.06 (94) | 19.18 (96) | 16.30 (96) | 20.78 (96) | 25.67 (96) | 9.38 (96) | 9.23 (96) | 9.49 (96) | 9.82 (96) | 13.51 (96) | 11.91 (96) |
| MEDIAN_ROB | 37.38 (98) | 40.85 (100) | 40.60 (99) | 32.31 (98) | 31.85 (97) | 27.20 (98) | 24.55 (94) | 29.10 (98) | 33.19 (99) | 42.09 (100) | 41.76 (100) | 33.35 (98) | 34.39 (100) | 39.78 (99) | 37.07 (98) | 43.64 (100) | 44.10 (100) | 44.12 (100) | 43.40 (100) | 41.51 (100) | 42.76 (100) |
| AVERAGE_ROB | 37.58 (99) | 40.13 (99) | 40.15 (98) | 34.81 (99) | 33.82 (98) | 28.55 (99) | 25.10 (95) | 32.02 (99) | 34.06 (100) | 41.28 (99) | 40.66 (99) | 34.46 (99) | 34.81 (101) | 39.18 (98) | 37.57 (99) | 42.78 (99) | 43.44 (99) | 43.82 (99) | 43.03 (99) | 40.47 (99) | 41.51 (99) |
| LSM0 | 38.95 (100) | 37.60 (98) | 48.58 (100) | 43.57 (100) | 91.06 (100) | 52.57 (100) | 72.57 (100) | 43.17 (100) | 32.63 (98) | 26.97 (98) | 27.97 (98) | 38.65 (100) | 32.94 (99) | 41.86 (100) | 51.78 (100) | 18.89 (98) | 18.41 (98) | 19.13 (98) | 19.63 (98) | 27.29 (98) | 33.77 (98) |
| MSMDNet | 0.43 (3) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |

The listing is truncated after the overall value of the MSMDNet row; its per-dataset results are not available in this dump.

Method references from the listing's info column:
- AdaStereo — Xiao Song, Guorun Yang, Xinge Zhu, Hui Zhou, Zhe Wang, Jianping Shi: AdaStereo: A Simple and Efficient Approach for Adaptive Stereo Matching. arXiv.
- NLCA_NET_v2_RVC — Zhibo Rao, Mingyi He, Yuchao Dai, Zhidong Zhu, Bo Li, Renjie He: NLCA-Net: A non-local context attention network for stereo matching.
- HSM-Net_RVC — Gengshan Yang, Joshua Manela, Michael Happold, Deva Ramanan: Hierarchical Deep Stereo Matching on High-resolution Images. CVPR 2019.
- StereoDRNet-Refined — Rohan Chabra, Julian Straub, Chris Sweeney, Richard Newcombe, Henry Fuchs: StereoDRNet. CVPR.
- SGM-Forest — Johannes L. Schönberger, Sudipta Sinha, Marc Pollefeys: Learning to Fuse Proposals from Multiple Scanline Optimizations in Semi-Global Matching. ECCV 2018.
- CBMV — Konstantinos Batsos, Changjiang Cai, Philippos Mordohai: CBMV: A Coalesced Bidirectional Matching Volume for Disparity Estimation. CVPR 2018.
- PA-Net — Zhibo Rao, Mingyi He, Yuchao Dai, Zhelun Shen: Patch Attention Network with Generative Adversarial Model for Semi-Supervised Binocular Disparity Prediction.
- GANetREF_RVC — Feihu Zhang, Victor Prisacariu, Ruigang Yang, Philip H. S. Torr: GA-Net: Guided Aggregation Net for End-to-end Stereo Matching. CVPR 2019.
- SGM_RVC — Heiko Hirschmueller: Stereo processing by semiglobal matching and mutual information. TPAMI 2008, 30(2), pp. 328-341.
- NVStereoNet_ROB — Nikolai Smolyanskiy, Alexey Kamenev, Stan Birchfield: On the Importance of Stereo for Accurate Depth Estimation: An Efficient Semi-Supervised Deep Neural Network Approach. arXiv.
- SPS-STEREO — K. Yamaguchi, D. McAllester, R. Urtasun: Efficient Joint Segmentation, Occlusion Labeling, Stereo and Flow Estimation. ECCV 2014.
- MeshStereo — C. Zhang, Z. Li, Y. Cheng, R. Cai, H. Chao, Y. Rui: MeshStereo: A Global Stereo Model with Mesh Alignment Regularization for View Interpolation. ICCV 2015.
- ELAS_RVC, ELAS — A. Geiger, M. Roser, R. Urtasun: Efficient large-scale stereo matching. ACCV 2010.