This table lists the benchmark results for the low-res two-view scenario. The benchmark evaluates the Middlebury stereo metrics; for all metrics, smaller is better.

The mask selector determines whether the metric is evaluated for all pixels with ground truth, or only for pixels that are visible in both images (non-occluded).
The coverage selector limits the table to dense results (covering all pixels) or to results covering at least a given minimum fraction of pixels.
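
As a rough illustration of how such a masked, coverage-aware error metric can be computed, the sketch below shows a Middlebury-style bad-pixel rate under the mask and coverage options described above. This is only a sketch, not the benchmark's evaluation code; the function and parameter names (bad_pixel_rate, nocc_mask, min_coverage), the NaN convention for missing values, and the 2-pixel error threshold are assumptions chosen for illustration.

```python
import numpy as np

def bad_pixel_rate(disp_est, disp_gt, nocc_mask=None, threshold=2.0, min_coverage=0.0):
    """Illustrative sketch of a Middlebury-style bad-pixel metric (smaller is better).

    disp_est, disp_gt : float arrays of predicted / ground-truth disparities,
                        with NaN where a value is missing (hypothetical convention)
    nocc_mask         : optional boolean array, True for pixels visible in both
                        images (the "non-occluded" mask option)
    threshold         : disparity error above which a pixel counts as bad
    min_coverage      : minimum fraction of evaluated pixels that must carry a
                        prediction (the "coverage" option)
    """
    valid = ~np.isnan(disp_gt)            # pixels with ground truth
    if nocc_mask is not None:             # optionally restrict to non-occluded pixels
        valid &= nocc_mask

    predicted = valid & ~np.isnan(disp_est)          # pixels the method actually predicted
    coverage = predicted.sum() / max(valid.sum(), 1)
    if coverage < min_coverage:
        return None                                  # result excluded at this coverage level

    err = np.abs(disp_est[predicted] - disp_gt[predicted])
    return float(np.mean(err > threshold))           # fraction of bad pixels
```

Under this convention, a fully dense method reports a coverage of 1.0 and therefore appears under every coverage setting of the table.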

Methods with the suffix _ROB may participate in the Robust Vision Challenge.

Click one or more dataset result cells or column headers to show visualizations. Most visualizations are only available for training datasets. The visualizations may not work with mobile browsers.




Table columns: Method, Info, and one result column per dataset (all, delivery area 1l, delivery area 1s, delivery area 2l, delivery area 2s, delivery area 3l, delivery area 3s, electro 1l, electro 1s, electro 2l, electro 2s, electro 3l, electro 3s, facade 1s, forest 1s, forest 2s, playground 1l, playground 1s, playground 2l, playground 2s, playground 3l, playground 3s, terrace 1s, terrace 2s, terrains 1l, terrains 1s, terrains 2l, terrains 2s). Each result cell lists the metric value followed by the method's rank on that dataset.
TDLMtwo views0.09
1
0.08
4
0.12
6
0.02
1
0.02
1
0.00
1
0.01
4
0.19
9
0.04
2
0.00
1
0.00
1
0.48
15
0.01
1
0.01
2
0.26
1
0.22
1
0.08
1
0.06
5
0.01
1
0.00
1
0.10
1
0.10
4
0.01
7
0.01
1
0.00
1
0.01
1
0.30
1
0.39
11
delettwo views0.19
2
0.07
3
0.25
9
0.06
4
0.04
3
0.02
20
0.02
11
0.01
1
0.04
2
0.34
10
0.00
1
0.10
3
0.03
3
0.02
5
1.36
5
0.46
4
0.16
6
0.05
3
0.13
3
0.05
6
0.13
3
0.18
6
0.00
1
0.04
6
0.00
1
0.01
1
1.41
10
0.02
1
CVANet_RVCtwo views0.21
3
0.15
5
0.61
28
0.42
29
0.06
5
0.06
28
0.01
4
1.53
35
0.05
6
0.08
4
0.04
12
0.62
22
0.04
5
0.03
7
0.46
2
0.32
2
0.14
4
0.09
8
0.05
2
0.01
2
0.19
4
0.07
3
0.00
1
0.03
4
0.00
1
0.01
1
0.55
3
0.08
4
AANet_RVCtwo views0.25
4
0.05
2
0.29
11
0.06
4
0.19
10
0.00
1
0.23
34
0.03
5
0.17
14
0.35
13
0.01
11
0.07
2
0.04
5
0.01
2
1.37
6
0.48
5
0.10
2
0.06
5
0.43
8
0.17
13
0.10
1
0.25
8
0.03
14
0.05
7
0.02
4
0.06
7
1.15
5
0.89
23
DMCAtwo views0.27
5
0.37
15
0.20
8
0.17
9
0.09
7
0.02
20
0.01
4
0.02
4
0.04
2
0.59
19
0.00
1
0.34
10
0.05
8
0.01
2
1.81
7
0.88
9
0.31
12
0.12
9
0.22
4
0.10
8
0.38
9
0.23
7
0.00
1
0.03
4
0.03
7
0.06
7
1.19
6
0.15
6
GANettwo views0.29
6
0.56
28
0.39
19
0.27
16
0.11
8
0.88
47
0.14
29
0.27
13
0.06
7
0.02
2
0.28
18
1.85
39
0.04
5
0.03
7
0.62
3
0.39
3
0.37
18
0.14
14
0.24
5
0.01
2
0.20
5
0.05
2
0.06
21
0.02
2
0.06
10
0.01
1
0.48
2
0.16
7
MMNettwo views0.30
7
0.21
8
0.07
3
0.05
3
0.05
4
0.00
1
0.06
22
0.01
1
0.03
1
0.22
8
0.00
1
0.19
5
0.01
1
0.05
13
2.24
10
0.62
7
0.24
9
0.12
9
0.35
6
0.07
7
0.30
7
0.11
5
0.04
15
0.08
10
0.03
7
0.04
6
2.84
20
0.02
1
UPFNettwo views0.44
8
0.29
9
0.59
26
0.21
11
0.26
15
0.01
9
0.02
11
0.08
7
0.08
8
0.39
14
0.00
1
0.27
8
0.03
3
0.24
41
3.08
24
0.92
11
0.14
4
0.12
9
0.57
9
0.31
20
0.36
8
0.39
9
0.01
7
0.12
15
0.02
4
0.01
1
3.44
32
0.05
3
RYNettwo views0.45
9
0.04
1
0.00
1
0.02
1
0.03
2
0.00
1
0.00
1
0.01
1
0.04
2
0.07
3
0.00
1
0.05
1
0.12
14
0.00
1
4.77
51
0.60
6
0.10
2
1.00
53
0.35
6
0.01
2
0.21
6
0.04
1
0.00
1
0.02
2
0.08
12
3.06
40
1.39
9
0.13
5
UNettwo views0.56
10
0.30
10
0.39
19
0.26
15
0.19
10
0.03
23
0.08
24
0.52
18
0.13
12
1.15
35
0.00
1
0.44
14
0.16
19
0.16
34
2.13
8
1.28
24
0.34
13
0.20
16
1.42
22
0.19
14
0.62
12
0.50
10
0.13
31
0.24
21
0.03
7
0.26
14
3.32
28
0.62
17
CFNettwo views0.57
11
0.46
21
0.64
29
0.43
30
0.34
25
0.04
26
0.00
1
0.58
20
0.09
9
0.66
21
0.05
13
0.64
23
0.07
9
0.05
13
2.25
11
0.65
8
0.35
14
0.06
5
0.73
12
0.34
22
0.85
17
0.73
11
0.00
1
0.08
10
0.45
26
0.16
11
3.22
27
1.51
31
HGLStereotwo views0.58
12
0.32
14
0.38
18
0.23
12
0.26
15
0.01
9
0.06
22
0.06
6
0.12
10
1.22
38
0.70
34
0.14
4
0.15
18
0.09
25
2.33
12
1.33
26
0.37
18
0.24
20
1.13
17
0.15
10
0.97
18
1.51
21
0.06
21
0.07
8
0.08
12
0.08
10
3.08
22
0.47
13
HITNettwo views0.59
13
0.16
6
0.10
5
0.06
4
0.07
6
0.01
9
0.02
11
1.76
39
0.82
31
0.11
5
0.00
1
0.68
26
0.28
26
0.11
30
2.51
16
1.02
14
0.25
10
0.13
12
0.63
10
0.24
18
1.43
28
1.56
24
0.01
7
0.12
15
0.02
4
0.16
11
3.32
28
0.42
12
Vladimir Tankovich, Christian Häne, Yinda Zhang, Adarsh Kowdle, Sean Fanello, Sofien Bouaziz: HITNet: Hierarchical Iterative Tile Refinement Network for Real-time Stereo Matching. CVPR 2021
RASNettwo views0.63
14
0.18
7
0.57
24
0.41
28
0.28
18
0.10
31
0.05
21
0.94
26
0.14
13
1.07
34
0.09
15
0.48
15
0.34
30
0.02
5
1.28
4
1.47
33
0.35
14
0.21
17
0.99
14
0.02
5
1.15
23
1.53
22
0.10
27
0.34
35
0.22
20
0.30
16
3.55
36
0.73
19
DMCA-RVCcopylefttwo views0.65
15
0.42
17
0.43
22
0.31
20
0.58
31
0.15
35
0.44
37
0.29
14
1.57
50
1.31
40
0.35
22
0.55
19
0.13
15
0.11
30
2.54
17
1.10
18
0.51
22
0.46
33
0.84
13
0.36
25
0.76
15
0.83
12
0.19
38
0.14
19
0.11
15
0.06
7
2.59
17
0.52
15
DSFCAtwo views0.70
16
0.43
18
0.51
23
0.29
19
0.58
31
0.04
26
0.59
40
0.22
11
0.54
25
0.49
18
0.56
27
0.33
9
0.08
12
0.04
10
2.86
18
1.28
24
0.67
33
0.73
39
1.00
15
0.44
31
1.01
21
1.14
18
0.21
41
0.24
21
0.54
27
0.82
26
2.81
19
0.47
13
DISCOtwo views0.79
17
0.95
44
0.31
13
0.06
4
0.33
23
0.51
41
0.44
37
0.23
12
0.32
20
1.75
50
0.59
30
0.38
13
0.29
27
0.04
10
2.98
22
1.36
27
0.96
42
0.29
24
3.24
39
0.53
35
0.79
16
1.45
19
0.10
27
0.07
8
0.12
17
0.29
15
2.76
18
0.31
10
NVstereo2Dtwo views0.83
18
0.43
18
0.07
3
0.23
12
0.36
26
0.41
40
0.08
24
1.61
37
0.59
28
0.84
26
1.11
39
0.36
11
0.21
24
0.07
20
5.88
62
2.16
50
0.62
27
0.45
32
1.50
23
0.19
14
1.20
24
0.94
17
0.02
10
0.32
31
1.30
38
0.24
13
0.95
4
0.24
9
CFNet_RVCtwo views0.86
19
0.59
30
0.74
34
0.33
21
0.32
22
0.11
32
0.02
11
0.11
8
0.17
14
0.87
29
0.74
35
0.57
21
0.16
19
0.08
24
3.11
25
1.19
23
0.50
21
0.34
26
1.28
20
0.15
10
1.68
33
1.48
20
0.02
10
0.25
23
0.86
34
1.89
35
3.64
38
2.13
35
DLCB_ROBtwo views0.92
20
0.72
36
0.64
29
0.43
30
0.42
27
0.14
34
0.18
32
0.43
15
0.28
18
0.85
27
0.53
25
1.12
34
0.72
51
0.07
20
2.16
9
1.12
19
0.88
40
0.26
21
0.69
11
0.20
16
1.35
27
1.53
22
0.49
58
0.28
26
4.11
66
2.41
37
2.04
13
0.78
20
HSMtwo views0.97
21
0.30
10
0.87
38
0.54
36
0.46
29
0.00
1
0.01
4
2.93
61
0.30
19
0.13
6
0.56
27
0.65
24
0.13
15
0.05
13
2.97
20
1.08
17
0.66
32
0.17
15
3.26
41
0.35
23
1.78
34
2.03
30
0.04
15
0.33
34
0.38
23
0.81
25
4.16
51
1.10
25
NLCA_NET_v2_RVCtwo views0.99
22
0.82
39
0.34
16
0.18
10
0.43
28
0.03
23
1.10
52
1.20
30
2.50
63
2.06
55
0.11
16
0.69
27
0.07
9
0.54
53
3.65
36
0.91
10
0.19
8
0.22
18
1.65
27
0.15
10
0.73
14
0.92
16
0.08
24
0.18
20
0.11
15
3.29
45
2.58
16
2.09
34
Zhibo Rao, Mingyi He, Yuchao Dai, Zhidong Zhu, Bo Li, and Renjie He: NLCA-Net: A non-local context attention network for stereo matching.
psm_uptwo views1.01
23
0.60
31
0.96
43
0.38
26
0.75
35
0.01
9
0.03
18
2.33
49
1.18
40
0.78
24
0.00
1
0.94
31
0.09
13
0.45
51
2.41
13
1.18
22
0.62
27
0.44
31
3.67
44
0.32
21
1.14
22
0.85
14
0.00
1
0.58
46
1.06
36
1.42
32
3.75
44
1.44
29
PDISCO_ROBtwo views1.03
24
0.55
27
0.06
2
0.67
38
0.28
18
0.00
1
0.21
33
0.71
22
0.49
24
1.28
39
0.99
38
0.25
7
0.32
29
0.19
36
3.02
23
2.03
47
0.69
34
1.42
65
2.26
29
0.58
38
2.32
39
3.22
50
0.14
32
0.27
25
1.36
39
0.37
17
3.84
45
0.23
8
CREStereotwo views1.03
24
0.31
12
0.74
34
0.24
14
0.16
9
0.01
9
0.04
20
2.66
58
0.55
26
0.63
20
0.56
27
0.71
28
0.59
45
0.05
13
2.97
20
3.31
67
0.60
24
0.43
30
2.12
28
0.57
37
3.00
51
2.30
36
0.06
21
0.12
15
1.18
37
0.53
23
2.42
15
0.88
21
STTStereotwo views1.06
26
0.85
42
0.57
24
0.36
25
0.29
20
0.08
30
0.64
42
0.43
15
1.56
48
1.40
46
0.26
17
1.95
40
0.07
9
0.29
44
4.26
39
1.06
15
0.25
10
0.23
19
1.16
18
0.14
9
1.46
30
0.84
13
0.12
30
0.10
14
0.07
11
1.81
34
3.51
35
4.84
50
iResNet_ROBtwo views1.07
27
0.65
34
0.30
12
0.45
32
0.30
21
0.01
9
0.02
11
3.10
62
0.12
10
1.34
41
0.00
1
0.66
25
0.35
31
0.04
10
4.37
43
1.65
38
0.54
23
0.49
35
5.83
54
0.29
19
1.30
26
0.87
15
0.20
40
0.26
24
0.16
18
0.84
27
3.74
42
1.12
26
AF-Nettwo views1.11
28
0.51
23
0.87
38
0.34
23
0.78
37
0.89
48
1.87
57
2.40
50
1.15
38
0.48
16
0.68
32
3.66
49
0.16
19
0.27
42
2.44
14
0.98
12
0.35
14
0.01
1
1.60
24
0.40
26
0.50
10
2.58
40
0.05
19
0.08
10
0.66
31
0.45
19
3.47
33
2.34
36
Nwc_Nettwo views1.11
28
0.51
23
0.87
38
0.34
23
0.78
37
0.89
48
1.87
57
2.40
50
1.15
38
0.48
16
0.68
32
3.66
49
0.16
19
0.27
42
2.44
14
0.98
12
0.35
14
0.01
1
1.60
24
0.40
26
0.50
10
2.58
40
0.05
19
0.08
10
0.66
31
0.45
19
3.47
33
2.34
36
iResNetv2_ROBtwo views1.26
30
0.75
37
0.32
14
1.29
55
0.33
23
0.02
20
0.02
11
3.19
63
0.20
16
2.00
54
0.29
19
2.24
43
0.41
35
0.06
18
3.64
35
1.45
32
0.75
37
0.13
12
5.77
52
0.43
30
2.86
47
1.71
26
0.16
35
0.37
38
0.38
23
0.47
21
4.08
49
0.72
18
PMTNettwo views1.28
31
0.58
29
0.96
43
0.28
18
0.22
14
0.01
9
0.16
30
0.50
17
0.57
27
0.72
23
0.35
22
0.54
18
0.48
40
0.09
25
3.27
29
3.32
68
0.72
36
0.39
28
6.17
55
0.35
23
3.67
56
2.41
38
0.09
25
0.12
15
3.12
62
1.00
30
1.23
7
3.32
42
RPtwo views1.29
32
1.06
46
2.32
74
0.84
42
1.16
42
0.39
39
1.53
54
1.89
43
1.68
52
0.29
9
2.21
48
2.25
44
0.13
15
0.40
49
3.31
30
1.07
16
0.61
25
0.05
3
1.61
26
0.40
26
0.65
13
3.05
47
0.11
29
0.29
27
0.29
21
0.92
28
4.24
52
2.04
33
BEATNet_4xtwo views1.40
33
0.54
26
0.27
10
0.13
8
0.27
17
0.01
9
0.10
26
1.80
40
1.74
54
0.41
15
0.53
25
0.82
30
0.59
45
0.54
53
4.30
40
2.92
58
1.87
58
1.84
73
1.39
21
0.71
41
4.68
63
6.21
79
0.02
10
0.81
50
0.21
19
0.40
18
3.69
39
0.96
24
DN-CSS_ROBtwo views1.45
34
0.52
25
1.11
49
0.81
41
0.69
34
0.00
1
0.03
18
2.86
60
0.35
21
1.50
47
3.76
63
0.73
29
0.36
33
0.07
20
2.89
19
1.44
30
1.08
46
0.79
42
8.61
66
0.23
17
3.25
53
1.97
28
0.02
10
0.66
47
0.10
14
0.52
22
3.12
26
1.56
32
DeepPruner_ROBtwo views1.57
35
0.37
15
0.71
32
0.50
34
0.55
30
0.37
38
1.60
55
1.74
38
0.84
33
1.37
43
2.40
50
0.49
17
0.51
43
0.10
29
5.38
57
2.16
50
1.10
47
1.08
56
2.28
30
0.40
26
4.83
67
3.42
51
0.66
67
0.34
35
0.57
30
5.04
60
3.07
21
0.53
16
FCDSN-DCtwo views1.58
36
0.84
40
2.05
69
1.66
59
1.43
51
1.08
54
0.99
47
0.72
23
0.44
23
1.18
37
0.31
20
1.45
36
0.49
42
0.07
20
6.05
68
1.96
45
2.13
61
1.97
76
1.16
18
0.44
31
2.39
43
2.62
42
0.49
58
3.18
74
1.72
43
3.24
44
1.34
8
1.27
27
Dominik Hirner, Friedrich Fraundorfer: FCDSN-DC: An accurate but lightweight end-to-end trainable neural network for stereo estimation with depth completion.
FADNet-RVC-Resampletwo views1.60
37
0.67
35
0.97
45
0.50
34
0.64
33
0.18
36
1.05
49
1.27
32
0.83
32
0.96
31
0.97
37
5.23
57
0.42
36
1.02
77
4.81
54
1.68
39
1.05
44
1.96
75
3.78
45
1.06
51
1.52
31
2.28
35
0.21
41
1.35
59
0.55
28
3.16
43
2.13
14
2.92
39
PA-Nettwo views1.61
38
0.81
38
1.43
59
0.27
16
1.99
63
0.12
33
0.34
35
0.19
9
1.90
56
0.85
27
0.41
24
0.24
6
0.20
23
0.78
69
4.61
49
1.44
30
0.17
7
0.26
21
1.08
16
0.53
35
1.44
29
3.18
49
0.65
65
0.44
40
0.99
35
7.71
75
1.78
12
9.78
64
Zhibo Rao, Mingyi He, Yuchao Dai, Zhelun Shen: Patch Attention Network with Generative Adversarial Model for Semi-Supervised Binocular Disparity Prediction.
FADNet-RVCtwo views1.64
39
0.31
12
0.64
29
0.33
21
1.19
45
0.01
9
2.53
68
0.57
19
0.22
17
1.03
33
0.32
21
6.04
61
0.29
27
0.32
47
4.47
46
1.90
44
0.62
27
0.78
41
6.45
58
1.80
64
1.85
35
2.02
29
0.19
38
0.52
44
1.78
44
1.74
33
3.10
23
3.15
40
NCC-stereotwo views1.86
40
1.63
53
1.33
55
0.95
49
1.22
47
0.80
44
2.29
64
2.62
56
2.93
66
0.34
10
2.87
58
2.16
41
1.17
55
0.21
37
3.63
32
1.13
20
0.63
30
0.96
51
2.79
35
0.75
42
0.98
19
2.96
45
0.36
51
0.32
31
2.31
50
4.50
55
3.69
39
4.78
47
Abc-Nettwo views1.86
40
1.63
53
1.33
55
0.95
49
1.22
47
0.80
44
2.29
64
2.62
56
2.93
66
0.34
10
2.87
58
2.16
41
1.17
55
0.21
37
3.63
32
1.13
20
0.63
30
0.96
51
2.79
35
0.75
42
0.98
19
2.96
45
0.36
51
0.32
31
2.31
50
4.50
55
3.69
39
4.78
47
Xing Li, Yangyu Fan, Guoyun Lv, and Haoyue Ma: Area-based Correlation and Non-local Attention Network for Stereo Matching. The Visual Computer
stereogantwo views1.89
42
2.00
62
1.17
51
1.47
57
2.05
66
1.72
58
0.87
46
1.84
42
1.11
36
1.36
42
1.51
42
5.81
59
0.48
40
0.32
47
3.87
37
1.69
40
0.80
38
0.39
28
3.96
46
0.78
44
1.23
25
4.02
58
0.26
44
0.51
43
0.30
22
1.30
31
6.30
65
3.84
45
FADNettwo views1.92
43
0.44
20
0.60
27
0.54
36
1.05
41
0.00
1
2.95
72
0.78
24
0.38
22
0.81
25
0.61
31
2.40
45
0.47
39
0.65
63
5.57
59
1.73
41
0.69
34
1.08
56
8.93
68
1.61
62
2.36
41
2.52
39
0.09
25
0.36
37
2.54
55
3.77
50
3.43
31
5.46
51
NCCL2two views2.12
44
2.27
67
1.29
52
1.28
54
1.85
60
0.18
36
0.77
45
1.24
31
1.25
42
1.62
48
2.71
55
1.06
32
0.51
43
0.31
45
4.40
44
2.33
54
1.57
52
0.62
37
2.35
31
0.68
40
2.68
46
4.29
62
1.52
81
1.06
52
4.65
70
4.17
52
4.04
48
6.58
57
R-Stereo Traintwo views2.16
45
1.08
47
1.57
63
0.84
42
0.19
10
0.01
9
0.01
4
2.05
46
1.01
34
1.37
43
2.45
51
1.50
37
0.46
37
0.21
37
3.23
26
4.53
77
1.62
53
1.05
54
7.07
59
1.09
53
4.86
68
5.93
76
0.04
15
0.31
28
2.48
52
4.63
57
3.10
23
5.74
53
RAFT-Stereopermissivetwo views2.16
45
1.08
47
1.57
63
0.84
42
0.19
10
0.01
9
0.01
4
2.05
46
1.01
34
1.37
43
2.45
51
1.50
37
0.46
37
0.21
37
3.23
26
4.53
77
1.62
53
1.05
54
7.07
59
1.09
53
4.86
68
5.93
76
0.04
15
0.31
28
2.48
52
4.63
57
3.10
23
5.74
53
Lahav Lipson, Zachary Teed, and Jia Deng: RAFT-Stereo: Multilevel Recurrent Field Transforms for Stereo Matching. 3DV 2021
GANetREF_RVCpermissivetwo views2.43
47
1.69
56
0.81
36
2.04
64
1.16
42
1.02
52
0.10
26
4.72
77
1.44
45
3.62
74
0.06
14
10.03
71
0.69
49
0.09
25
3.63
32
1.36
27
1.06
45
0.94
49
6.44
57
0.46
33
2.32
39
1.74
27
0.17
36
1.35
59
3.41
64
3.75
49
6.55
67
4.82
49
Feihu Zhang, Victor Prisacariu, Ruigang Yang, Philip H.S. Torr: GA-Net: Guided Aggregation Net for End-to-end Stereo Matching. CVPR 2019
RGCtwo views2.44
48
1.98
61
4.87
88
1.54
58
1.65
54
1.74
59
1.05
49
2.02
45
1.43
44
0.88
30
4.42
67
2.52
47
2.42
62
0.58
58
4.33
41
1.63
37
0.91
41
0.48
34
4.28
47
1.08
52
1.94
36
4.01
57
0.15
33
0.72
49
2.19
48
5.22
62
4.90
59
6.96
58
MLCVtwo views2.45
49
0.61
32
0.81
36
2.48
67
1.21
46
0.95
50
0.50
39
3.57
66
0.61
29
3.75
77
4.26
65
5.13
55
3.34
65
0.06
18
4.63
50
1.78
43
1.32
50
0.86
46
10.05
69
1.09
53
2.30
38
3.92
55
0.29
45
2.16
68
1.87
45
3.06
40
4.01
47
1.40
28
RTSCtwo views2.56
50
0.89
43
1.50
60
0.47
33
0.76
36
0.87
46
2.29
64
1.56
36
1.91
57
2.12
56
2.86
57
1.23
35
0.97
54
1.03
79
7.27
76
3.36
70
4.68
85
1.74
70
7.73
62
1.11
56
3.95
58
5.39
70
0.21
41
1.11
55
0.40
25
0.95
29
4.56
56
8.26
63
DeepPrunerFtwo views2.66
51
1.67
55
0.33
15
3.58
75
9.30
97
1.76
60
3.48
76
0.68
21
3.29
69
0.71
22
1.59
44
0.37
12
0.27
25
0.13
32
5.94
63
3.42
73
1.03
43
0.54
36
4.47
48
0.65
39
2.02
37
2.06
31
0.56
62
0.31
28
15.21
96
3.30
46
3.74
42
1.47
30
NOSS_ROBtwo views2.78
52
3.78
81
0.39
19
2.02
63
1.58
53
2.36
65
0.00
1
2.66
58
1.54
47
3.63
75
1.79
47
2.55
48
3.51
66
0.83
72
5.82
61
5.57
86
2.69
65
1.83
72
2.63
34
0.91
48
4.62
61
4.13
59
0.42
54
2.58
72
2.29
49
4.46
54
7.67
76
2.88
38
SGM-Foresttwo views2.81
53
2.03
63
0.36
17
6.32
86
1.79
59
2.66
66
0.02
11
1.42
33
0.77
30
2.52
57
0.96
36
6.57
62
5.74
69
0.31
45
4.36
42
2.94
60
0.61
25
0.33
25
3.08
37
0.47
34
2.46
44
2.73
43
0.84
73
0.45
41
3.66
65
3.74
48
11.30
84
7.44
60
Johannes L. Schönberger, Sudipta Sinha, Marc Pollefeys: Learning to Fuse Proposals from Multiple Scanline Optimizations in Semi-Global Matching. ECCV 2018
PWCDC_ROBbinarytwo views2.82
54
0.49
22
0.17
7
0.38
26
0.79
39
0.03
23
0.10
26
1.16
29
8.93
88
0.14
7
2.49
53
0.55
19
31.02
102
0.57
55
5.58
60
3.32
68
1.73
55
0.91
48
2.53
33
0.88
46
2.92
48
7.18
84
0.15
33
0.53
45
0.55
28
0.69
24
1.43
11
0.88
21
HSM-Net_RVCpermissivetwo views2.86
55
1.24
50
1.50
60
2.55
69
1.72
55
0.57
42
0.17
31
3.55
65
1.14
37
3.50
73
6.32
80
7.94
63
2.93
63
0.05
13
3.38
31
2.12
48
1.28
49
0.26
21
3.24
39
1.24
58
3.66
55
3.76
54
0.61
63
2.64
73
4.43
68
6.27
68
5.76
62
5.47
52
Gengshan Yang, Joshua Manela, Michael Happold, and Deva Ramanan: Hierarchical Deep Stereo Matching on High-resolution Images. CVPR 2019
ADCReftwo views2.96
56
2.78
74
1.02
46
0.70
39
2.21
67
1.96
62
2.42
67
0.96
27
6.10
78
1.94
53
1.15
40
1.08
33
0.74
52
1.32
84
6.01
64
3.38
71
1.34
51
1.15
60
8.41
65
2.09
69
2.59
45
3.60
53
1.88
83
0.71
48
2.81
58
8.00
77
3.42
30
10.24
65
NaN_ROBtwo views3.01
57
1.16
49
1.13
50
0.86
46
3.03
72
0.59
43
0.75
43
3.25
64
2.58
64
0.98
32
1.32
41
21.10
85
0.59
45
1.15
82
4.15
38
1.58
36
0.80
38
0.70
38
2.38
32
1.30
59
1.52
31
2.15
32
0.46
57
1.07
54
2.50
54
6.25
67
4.54
55
13.27
70
XPNet_ROBtwo views3.02
58
2.54
70
1.33
55
0.73
40
1.45
52
1.04
53
0.61
41
0.83
25
1.45
46
1.16
36
3.44
60
26.85
91
0.35
31
0.40
49
3.24
28
2.22
53
3.28
72
0.79
42
3.20
38
1.19
57
2.94
50
5.51
71
0.76
70
0.46
42
0.72
33
2.37
36
9.22
78
3.37
44
ccstwo views3.29
59
0.63
33
1.94
68
0.99
51
2.02
64
0.06
28
0.01
4
3.87
69
1.79
55
4.48
89
9.13
91
4.94
53
4.74
68
0.87
73
6.01
64
5.69
87
1.83
57
1.39
63
5.37
50
1.03
50
5.86
77
7.35
86
0.32
48
1.56
62
3.32
63
6.28
69
4.08
49
3.36
43
ADCP+two views3.47
60
2.05
64
2.64
77
1.31
56
1.93
61
1.19
55
3.99
77
1.07
28
9.83
90
1.78
51
5.75
76
8.79
67
0.37
34
0.90
74
5.04
56
2.20
52
0.44
20
0.90
47
5.82
53
0.88
46
2.38
42
3.43
52
0.33
49
0.41
39
3.03
61
12.72
89
3.59
37
10.90
66
FC-DCNNcopylefttwo views3.82
61
4.38
85
5.67
95
9.61
94
4.14
78
8.01
90
3.26
74
4.21
70
1.69
53
2.95
65
1.54
43
5.87
60
2.09
61
0.13
32
4.40
44
2.15
49
2.54
64
1.11
58
3.38
42
0.82
45
3.11
52
2.20
33
0.65
65
3.29
76
6.33
80
4.91
59
6.98
71
7.80
62
ADCLtwo views3.91
62
3.61
79
2.28
72
1.76
61
1.76
58
3.09
69
4.16
78
1.98
44
13.71
97
2.81
63
5.14
72
2.47
46
0.65
48
1.10
81
5.38
57
2.55
56
2.02
59
2.20
78
6.41
56
2.08
68
4.87
70
4.20
60
1.40
79
1.06
52
6.69
81
6.60
70
4.33
54
11.16
68
FADNet_RVCtwo views3.95
63
2.06
65
1.30
53
3.33
73
3.36
74
0.99
51
2.06
62
2.30
48
1.39
43
1.72
49
4.72
69
9.29
68
5.82
71
1.76
90
8.23
90
3.12
63
2.98
68
4.27
91
16.83
78
3.30
82
4.99
73
5.51
71
0.63
64
1.39
61
1.45
40
3.14
42
4.86
57
5.92
55
LALA_ROBtwo views4.04
64
4.73
87
1.42
58
1.16
53
1.93
61
2.17
64
1.04
48
1.80
40
2.48
62
2.74
60
5.87
77
33.34
94
0.69
49
0.78
69
6.02
66
2.33
54
3.21
71
1.82
71
3.66
43
1.88
65
3.40
54
4.99
65
0.34
50
1.03
51
1.91
46
3.61
47
8.50
77
6.20
56
iResNettwo views4.14
65
0.84
40
0.71
32
2.34
66
2.59
68
4.58
76
1.07
51
4.67
75
1.21
41
3.72
76
3.61
61
12.28
73
16.06
86
0.03
7
6.64
73
3.14
64
2.35
63
1.89
74
16.43
77
2.53
74
3.76
57
2.90
44
0.17
36
3.88
81
1.66
42
4.23
53
5.19
60
3.23
41
SHDtwo views4.34
66
1.02
45
2.50
75
1.71
60
1.33
49
1.46
56
2.01
59
4.90
78
4.30
76
2.63
58
4.99
71
4.39
52
9.41
78
0.58
58
8.20
88
4.81
80
4.43
84
2.44
82
13.01
74
2.95
79
5.61
76
6.88
82
0.30
46
3.26
75
4.11
66
5.16
61
3.88
46
10.98
67
CSANtwo views4.46
67
2.36
69
1.57
63
3.12
71
3.32
73
1.64
57
0.76
44
4.59
74
4.37
77
3.80
79
2.84
56
17.58
81
10.47
80
0.96
76
4.80
53
1.75
42
2.06
60
1.16
61
5.69
51
2.34
72
2.93
49
2.40
37
0.43
55
2.56
71
4.64
69
11.45
87
7.53
75
13.44
71
ETE_ROBtwo views4.62
68
4.04
83
1.52
62
1.14
52
2.02
64
2.70
67
1.28
53
1.50
34
4.25
75
1.89
52
5.88
78
37.95
99
0.84
53
1.42
88
4.94
55
2.66
57
3.81
79
0.95
50
4.84
49
1.63
63
4.01
59
5.32
69
0.67
68
1.14
56
2.03
47
3.84
51
15.25
91
7.23
59
XQCtwo views4.76
69
3.14
76
1.63
66
3.31
72
3.69
76
3.46
72
2.94
71
6.02
85
3.05
68
2.64
59
5.51
74
5.67
58
7.46
73
1.41
86
8.02
81
4.65
79
3.59
74
1.49
67
10.75
70
2.91
75
4.96
72
6.67
81
0.41
53
1.91
64
7.13
85
10.47
86
11.17
83
4.51
46
MSMDNettwo views4.98
70
2.59
72
3.57
81
2.54
68
0.90
40
3.99
73
1.63
56
4.52
73
1.56
48
4.54
90
18.92
110
20.82
84
12.09
81
0.60
62
6.57
72
5.84
88
2.28
62
1.52
68
7.08
61
1.94
67
6.09
78
6.12
78
0.31
47
2.29
69
1.47
41
2.83
39
4.25
53
7.49
61
CBMVpermissivetwo views5.52
71
3.65
80
1.32
54
3.46
74
2.86
71
2.04
63
0.41
36
3.64
67
2.63
65
4.09
83
2.36
49
14.13
76
3.70
67
0.57
55
7.16
75
6.19
92
4.26
82
2.36
81
8.76
67
1.44
60
6.43
80
4.53
64
0.54
61
5.62
91
4.85
71
6.08
65
19.41
95
26.62
97
Konstantinos Batsos, Changjiang Cai, Philippos Mordohai: CBMV: A Coalesced Bidirectional Matching Volume for Disparity Estimation. CVPR 2018
RTStwo views5.53
72
1.46
51
0.94
41
0.90
47
1.73
56
4.08
74
4.63
82
2.53
53
3.41
70
2.79
61
1.70
45
8.18
64
1.27
57
0.72
65
7.83
78
4.87
81
7.37
97
2.34
79
18.33
80
2.92
76
4.78
64
5.88
74
1.04
74
1.31
57
6.74
82
22.88
108
6.66
68
22.00
87
RTSAtwo views5.53
72
1.46
51
0.94
41
0.90
47
1.73
56
4.08
74
4.63
82
2.53
53
3.41
70
2.79
61
1.70
45
8.18
64
1.27
57
0.72
65
7.83
78
4.87
81
7.37
97
2.34
79
18.33
80
2.92
76
4.78
64
5.88
74
1.04
74
1.31
57
6.74
82
22.88
108
6.66
68
22.00
87
AnyNet_C32two views5.68
74
1.96
59
2.91
78
1.78
62
2.71
70
3.17
70
5.76
87
4.31
71
12.56
92
3.07
67
7.17
82
4.22
51
1.70
59
2.19
91
8.22
89
5.98
90
5.69
90
4.43
95
12.45
73
2.93
78
9.72
96
8.57
96
1.84
82
3.46
78
6.24
79
9.82
84
6.26
64
14.21
72
CBMV_ROBtwo views6.09
75
2.65
73
2.29
73
8.35
91
5.74
85
5.08
77
2.76
70
5.03
80
2.35
60
3.78
78
4.70
68
21.22
86
13.56
84
0.09
25
6.04
67
3.70
75
2.84
66
1.39
63
11.35
71
8.80
101
4.92
71
3.11
48
0.43
55
2.10
67
7.06
84
10.12
85
12.96
86
11.93
69
ADCPNettwo views6.67
76
1.97
60
3.86
84
2.75
70
6.19
89
3.07
68
7.60
94
5.21
82
8.03
85
3.82
80
7.92
85
5.17
56
13.78
85
2.98
95
8.10
84
6.12
91
4.35
83
2.04
77
18.50
82
4.74
87
7.04
85
7.43
87
2.98
90
1.91
64
8.96
88
14.04
94
7.02
73
14.47
74
AMNettwo views6.77
77
3.87
82
2.62
76
4.00
79
5.86
86
6.75
84
3.31
75
2.51
52
2.47
61
3.44
71
6.68
81
9.34
69
2.05
60
7.44
107
4.79
52
9.81
105
13.48
106
7.65
106
7.95
64
7.83
96
11.47
100
7.62
91
2.74
88
4.18
82
5.52
74
8.25
79
13.24
87
17.97
81
ADCMidtwo views6.83
78
4.46
86
4.95
89
2.16
65
4.31
81
5.71
81
7.29
91
8.83
87
12.78
94
3.11
68
8.10
87
9.83
70
9.65
79
2.84
92
8.51
92
5.42
85
4.68
85
2.83
84
18.87
85
3.22
80
7.09
86
7.70
92
5.37
98
2.33
70
5.12
73
8.71
81
5.21
61
15.24
75
PWC_ROBbinarytwo views7.15
79
1.85
57
5.57
93
0.85
45
1.17
44
1.95
61
4.77
84
2.56
55
17.27
107
3.83
81
7.66
84
4.97
54
39.20
110
0.81
71
8.35
91
4.96
84
6.51
93
4.27
91
15.99
76
4.39
84
7.48
88
7.59
90
0.53
60
4.79
87
2.74
56
2.63
38
7.01
72
23.26
90
FAT-Stereotwo views7.20
80
2.13
66
8.68
100
7.00
87
11.87
102
5.95
82
3.19
73
7.26
86
3.52
72
5.84
95
13.06
98
18.26
82
8.49
74
1.41
86
8.81
94
6.34
93
3.65
76
4.07
90
15.49
75
6.21
91
6.75
83
6.94
83
0.74
69
4.80
88
2.92
60
7.26
72
7.50
74
16.30
77
MFN_U_SF_DS_RVCtwo views7.30
81
3.59
78
1.09
48
9.20
92
3.71
77
7.69
88
7.51
93
10.81
92
8.52
86
4.93
92
4.20
64
16.73
80
9.30
77
5.45
105
8.09
83
3.29
66
1.80
56
1.13
59
18.77
83
2.10
70
6.48
81
7.49
88
3.17
94
1.74
63
12.06
94
14.29
95
9.71
80
14.36
73
FBW_ROBtwo views7.98
82
9.24
98
5.57
93
20.21
102
8.25
95
12.08
95
8.09
95
4.92
79
1.58
51
3.30
70
5.15
73
13.88
75
3.03
64
7.45
108
6.23
69
1.49
34
2.89
67
0.79
42
33.18
102
1.93
66
5.24
74
4.40
63
9.43
105
8.29
102
4.88
72
7.69
74
4.88
58
21.40
85
pmcnntwo views8.05
83
2.96
75
2.25
71
6.23
84
5.03
82
6.98
85
4.33
80
4.70
76
2.02
59
4.19
84
9.20
93
19.11
83
20.39
92
0.73
67
7.05
74
6.73
94
6.29
92
4.51
96
22.54
92
4.66
86
9.62
95
5.08
67
4.46
97
4.23
83
5.99
75
5.22
62
17.26
93
25.55
96
S-Stereotwo views9.11
84
1.88
58
3.41
80
7.93
90
7.92
93
12.47
96
19.24
107
5.20
81
6.26
80
6.18
100
20.72
113
29.27
92
7.13
72
0.65
63
9.55
98
7.50
98
3.80
78
5.87
101
17.77
79
6.30
92
6.21
79
7.52
89
12.23
108
6.88
97
2.77
57
6.68
71
6.74
70
18.03
82
ADCStwo views9.93
85
5.64
90
4.82
87
3.95
78
6.05
88
6.17
83
5.70
86
9.47
89
15.68
98
4.19
84
7.98
86
12.91
74
17.41
87
3.94
102
10.52
103
11.68
109
10.14
99
7.14
105
24.46
96
8.67
100
13.95
106
13.00
109
2.94
89
6.15
94
9.75
91
17.65
100
10.70
82
17.41
78
SGM_RVCbinarytwo views10.08
86
10.50
100
3.60
82
16.68
97
8.80
96
14.87
98
7.11
90
9.19
88
6.25
79
4.06
82
7.33
83
24.89
88
18.29
89
0.57
55
4.57
48
1.98
46
3.30
73
1.17
62
20.75
88
2.20
71
4.80
66
2.23
34
2.29
85
3.65
79
17.68
101
13.31
91
33.30
105
28.67
99
Heiko Hirschmueller: Stereo processing by semiglobal matching and mutual information. TPAMI 2008, Volume 30(2), pp. 328-341
MDST_ROBtwo views10.13
87
16.70
107
3.04
79
29.06
106
9.34
98
20.66
104
5.02
85
5.90
84
1.98
58
3.16
69
2.70
54
11.22
72
28.15
98
0.18
35
6.30
70
1.37
29
1.21
48
0.34
26
7.74
63
1.01
49
4.67
62
3.95
56
1.92
84
1.94
66
26.55
105
19.78
102
37.02
108
22.50
89
SANettwo views10.32
88
4.15
84
6.27
96
5.51
82
5.49
84
8.58
92
2.27
63
15.41
102
13.03
96
2.83
64
9.09
90
60.35
113
25.50
95
2.93
94
8.06
82
3.02
62
7.05
94
3.86
89
11.43
72
3.25
81
5.27
75
5.02
66
2.60
87
7.27
100
11.08
92
12.57
88
13.36
88
23.31
91
MFN_U_SF_RVCtwo views10.34
89
6.17
95
3.69
83
9.55
93
5.13
83
12.52
97
10.34
98
12.25
95
7.05
81
12.23
108
14.31
101
22.82
87
13.03
82
8.82
109
11.37
107
4.87
81
3.83
81
3.31
86
39.34
106
5.97
90
8.56
93
8.81
97
1.45
80
6.40
96
9.49
90
8.57
80
9.56
79
19.71
83
ccs_robtwo views10.35
90
5.69
91
2.14
70
4.64
81
2.67
69
5.53
80
6.85
89
4.40
72
4.18
74
5.98
97
10.34
95
26.00
89
37.16
108
3.25
98
19.47
114
10.00
106
11.64
104
3.40
87
48.57
109
13.46
105
7.76
89
10.45
102
0.80
71
4.63
85
2.91
59
6.10
66
6.08
63
15.43
76
G-Nettwo views10.87
91
4.96
88
9.71
101
3.77
76
4.24
79
7.37
86
2.01
59
10.08
90
15.80
100
4.95
93
15.59
103
48.52
108
9.05
75
0.58
58
8.11
86
8.67
100
5.49
87
4.42
93
23.80
93
15.66
108
8.01
91
8.23
94
3.06
92
5.14
89
6.15
77
13.92
92
22.56
97
23.75
92
STTStereo_v2two views10.87
91
4.96
88
9.71
101
3.77
76
4.24
79
7.37
86
2.01
59
10.08
90
15.80
100
4.95
93
15.59
103
48.52
108
9.05
75
0.58
58
8.11
86
8.67
100
5.49
87
4.42
93
23.80
93
15.66
108
8.01
91
8.23
94
3.06
92
5.14
89
6.15
77
13.92
92
22.56
97
23.75
92
MSC_U_SF_DS_RVCtwo views10.98
93
10.59
101
6.33
97
17.49
98
6.29
90
7.85
89
10.69
99
18.03
106
23.43
114
11.13
107
4.86
70
14.66
77
5.81
70
9.65
111
9.78
99
5.91
89
3.18
70
5.17
97
39.06
105
7.03
95
7.92
90
9.81
101
7.04
99
7.22
99
14.77
95
8.85
82
6.40
66
17.43
79
edge stereotwo views11.35
94
6.83
96
10.52
103
14.06
95
12.00
103
16.55
100
12.81
101
12.42
97
7.44
83
6.06
98
15.12
102
38.81
101
17.75
88
1.25
83
7.59
77
7.29
96
5.73
91
3.11
85
25.18
97
4.53
85
6.71
82
8.01
93
4.16
95
3.71
80
8.94
87
7.65
73
13.41
89
28.72
100
SGM-ForestMtwo views11.60
95
15.35
105
4.54
85
19.02
100
8.08
94
18.74
103
10.83
100
12.03
93
7.24
82
3.01
66
6.09
79
36.06
98
23.95
93
0.75
68
4.47
46
1.51
35
3.60
75
0.75
40
20.06
87
1.60
61
4.23
60
1.65
25
1.36
78
3.34
77
25.09
104
20.48
104
34.27
107
25.03
95
AnyNet_C01two views12.12
96
6.13
94
5.39
91
6.24
85
6.03
87
5.20
78
9.20
96
12.30
96
16.87
104
5.85
96
9.03
89
26.16
90
18.91
90
4.53
104
11.14
106
12.98
110
16.27
107
15.82
112
34.24
103
10.36
103
16.44
107
13.34
110
7.06
100
6.93
98
11.61
93
9.46
83
9.77
81
19.91
84
PVDtwo views12.14
97
5.88
92
8.51
99
7.70
89
6.65
91
10.05
93
6.16
88
21.11
111
19.74
110
3.44
71
8.69
88
16.07
78
27.50
97
2.86
93
8.93
95
7.52
99
11.20
102
6.11
103
24.24
95
6.83
94
11.08
97
11.21
104
1.11
76
5.71
92
15.30
97
21.90
106
17.71
94
34.49
105
MeshStereopermissivetwo views13.21
98
10.28
99
5.53
92
24.29
105
14.71
104
17.26
101
7.30
92
12.03
93
7.47
84
4.66
91
9.13
91
38.56
100
29.80
99
0.45
51
9.43
97
7.37
97
3.76
77
2.44
82
18.85
84
2.41
73
12.93
104
7.18
84
1.21
77
5.79
93
22.04
103
16.38
98
33.15
104
32.34
103
C. Zhang, Z. Li, Y. Cheng, R. Cai, H. Chao, Y. Rui: MeshStereo: A Global Stereo Model with Mesh Alignment Regularization for View Interpolation. ICCV 2015
DPSNettwo views14.25
99
3.23
77
1.83
67
7.63
88
7.22
92
10.44
94
4.19
79
5.68
83
8.64
87
4.20
86
3.75
62
72.37
115
26.68
96
4.43
103
10.42
101
10.25
108
10.56
100
8.02
107
58.50
113
8.44
99
11.36
99
8.93
98
16.95
111
10.34
107
18.62
102
15.80
97
14.33
90
21.91
86
MFMNet_retwo views16.26
100
13.95
103
7.70
98
24.27
104
9.34
98
33.99
111
17.43
103
12.42
97
19.62
109
10.23
106
13.58
100
39.02
102
13.47
83
1.40
85
10.73
104
10.12
107
29.43
113
12.22
109
31.31
99
10.07
102
17.19
108
9.15
99
8.44
103
9.02
104
8.97
89
5.68
64
24.08
99
36.04
107
STStereotwo views19.12
101
25.20
111
16.96
113
40.13
111
26.93
111
32.21
109
21.77
110
19.35
109
16.49
103
7.75
103
17.60
107
46.83
106
36.96
107
1.06
80
8.10
84
2.99
61
3.81
79
1.57
69
31.85
100
5.18
89
6.77
84
5.31
68
3.04
91
9.71
106
29.57
106
23.01
110
40.22
109
35.81
106
ELAS_RVCcopylefttwo views19.26
102
19.85
109
14.31
108
33.92
108
22.40
107
33.39
110
19.24
107
15.07
100
10.89
91
8.20
104
17.30
106
54.84
112
32.79
104
3.23
97
8.72
93
3.46
74
11.25
103
6.19
104
19.97
86
7.86
97
12.63
103
12.02
107
9.45
106
10.70
108
30.22
108
22.17
107
41.23
111
38.75
110
A. Geiger, M. Roser, R. Urtasun: Efficient large-scale stereo matching. ACCV 2010
SAMSARAtwo views19.29
103
2.55
71
5.07
90
4.09
80
3.44
75
3.26
71
4.38
81
15.32
101
9.58
89
4.44
87
5.63
75
8.73
66
24.82
94
3.30
100
10.45
102
6.81
95
5.49
87
3.46
88
21.26
89
4.77
88
17.96
110
24.74
115
0.82
72
4.77
86
80.12
118
91.25
118
77.91
117
76.45
117
MADNet+two views19.30
104
7.92
97
15.32
110
17.51
99
11.42
101
8.09
91
18.41
105
28.03
114
17.21
106
12.46
109
12.08
96
33.63
95
30.37
100
9.62
110
18.64
113
14.89
112
25.78
111
10.10
108
40.05
107
16.50
110
17.34
109
12.76
108
9.48
107
18.60
113
35.02
114
20.22
103
21.19
96
38.41
108
ELAScopylefttwo views19.48
105
22.90
110
14.50
109
35.26
110
24.25
110
31.48
108
19.80
109
16.72
104
15.91
102
7.71
102
15.62
105
48.88
110
30.47
101
3.20
96
9.19
96
3.39
72
10.97
101
5.79
100
21.49
90
6.53
93
11.69
101
11.90
106
9.30
104
9.63
105
32.86
109
24.17
111
41.19
110
41.31
111
A. Geiger, M. Roser, R. Urtasun: Efficient large-scale stereo matching. ACCV 2010
MANEtwo views19.83
106
17.14
108
15.43
111
34.46
109
23.74
109
25.27
106
17.85
104
19.50
110
17.04
105
6.13
99
12.40
97
41.03
103
35.23
106
3.36
101
7.87
80
8.98
102
7.09
95
29.77
115
21.81
91
3.71
83
9.51
94
5.54
73
2.50
86
8.83
103
34.12
111
27.95
113
51.25
112
47.85
113
SPS-STEREOcopylefttwo views20.13
107
14.92
104
11.88
104
19.84
101
20.39
105
18.24
102
9.27
97
16.06
103
19.74
110
19.91
113
20.06
112
33.66
96
37.68
109
7.23
106
13.17
110
16.66
114
26.75
112
16.01
113
32.30
101
23.65
114
30.25
115
22.84
114
13.67
109
13.47
111
16.56
99
15.26
96
29.55
102
24.45
94
K. Yamaguchi, D. McAllester, R. Urtasun: Efficient Joint Segmentation, Occlusion Labeling, Stereo and Flow Estimation. ECCV 2014
SGM+DAISYtwo views20.91
108
16.10
106
12.43
105
22.81
103
21.80
106
24.45
105
18.63
106
17.89
105
12.77
93
18.42
112
19.55
111
30.39
93
40.48
111
11.67
112
12.56
109
15.10
113
25.73
110
12.69
110
34.93
104
21.56
113
26.12
114
22.15
113
21.16
112
12.73
110
17.44
100
13.30
90
33.35
106
28.26
98
LE_ROBtwo views22.46
109
10.87
102
12.67
106
32.44
107
37.93
112
31.36
107
33.43
112
23.22
112
21.14
113
6.73
101
17.81
108
52.15
111
50.94
116
1.56
89
6.43
71
2.92
58
3.14
69
1.46
66
30.14
98
8.43
98
7.23
87
4.28
61
7.08
101
25.69
114
34.50
113
36.15
114
53.30
113
53.30
114
LSM0two views22.48
110
6.09
93
4.65
86
14.89
96
10.39
100
16.43
99
13.29
102
23.38
113
15.69
99
14.75
111
18.69
109
35.79
97
33.70
105
0.93
75
73.52
117
85.94
117
13.36
105
5.62
99
55.67
112
16.83
111
21.95
112
18.82
112
4.28
96
8.01
101
16.44
98
16.43
99
28.65
100
32.66
104
BEATNet-Init1two views23.62
111
27.86
112
13.22
107
44.78
112
23.54
108
38.58
112
23.96
111
18.57
107
18.15
108
9.17
105
13.26
99
47.63
107
48.00
115
3.29
99
12.31
108
9.22
103
7.35
96
5.48
98
48.70
110
19.34
112
12.33
102
11.24
105
7.11
102
11.28
109
33.76
110
21.63
105
60.52
115
47.50
112
DispFullNettwo views26.27
112
39.81
113
17.89
114
52.75
113
38.74
113
45.03
113
55.21
114
13.74
99
12.89
95
13.75
110
9.88
94
43.75
105
32.46
103
86.51
117
11.01
105
13.68
111
29.48
114
16.23
114
51.78
111
14.64
106
13.87
105
10.46
103
14.59
110
6.22
95
8.62
86
8.03
78
16.08
92
32.10
102
PSMNet_ROBtwo views31.36
113
48.75
114
16.32
112
64.48
117
46.97
115
60.77
116
61.30
115
19.02
108
20.70
112
27.57
115
24.90
114
41.43
104
42.18
112
83.15
116
13.95
111
4.06
76
17.90
108
5.94
102
42.51
108
15.37
107
11.21
98
9.54
100
57.95
116
14.27
112
29.64
107
19.74
101
29.38
101
17.81
80
PWCKtwo views38.25
114
63.47
117
23.98
115
73.60
118
41.23
114
60.98
117
51.31
113
54.00
116
35.86
115
26.51
114
39.23
115
62.00
114
46.09
113
26.43
113
14.46
112
9.53
104
30.80
115
12.71
111
83.96
116
28.46
115
23.42
113
13.60
111
47.98
114
30.51
115
34.48
112
27.91
112
31.53
103
38.61
109
DPSimNet_ROBtwo views58.77
115
55.98
115
59.21
117
62.88
116
73.92
116
60.20
115
64.84
117
28.34
115
91.63
118
80.59
117
52.65
116
74.49
116
47.24
114
90.64
118
55.91
116
45.99
116
54.62
116
42.30
116
68.03
114
59.25
116
51.18
116
59.54
117
44.54
113
45.19
116
44.85
115
61.56
115
56.15
114
55.05
115
CC-Net-ROBtwo views70.08
116
57.59
116
33.20
116
55.61
114
76.68
117
84.28
118
62.79
116
83.71
118
83.30
117
76.21
116
74.67
117
74.97
117
61.84
117
62.54
115
41.03
115
41.52
115
86.04
117
88.34
118
89.47
117
90.97
117
75.66
117
54.62
116
53.92
115
50.83
117
74.81
117
79.04
117
83.52
118
94.87
119
MADNet++two views76.54
117
72.15
118
83.11
118
61.53
115
81.21
118
58.42
114
69.47
118
68.24
117
78.58
116
85.63
118
92.37
118
80.60
118
85.04
118
36.71
114
81.19
118
89.82
118
88.15
118
69.64
117
82.54
115
91.39
118
92.32
118
93.48
118
65.75
117
90.26
119
59.09
116
74.29
116
74.25
116
61.32
116
MEDIAN_ROBtwo views97.56
118
99.08
119
98.39
119
100.00
119
99.99
119
100.00
121
100.00
121
99.49
121
98.31
119
100.00
121
99.95
121
97.97
121
96.05
119
99.97
119
99.41
119
98.43
119
98.53
119
98.66
119
99.88
120
97.95
119
93.38
119
96.45
119
100.00
118
97.23
120
86.96
119
92.33
119
89.80
119
95.90
120
AVERAGE_ROBtwo views98.95
119
100.00
122
99.79
120
100.00
119
100.00
120
100.00
121
100.00
121
100.00
122
100.00
120
100.00
121
100.00
122
100.00
124
100.00
122
100.00
120
99.77
120
99.85
120
100.00
122
100.00
122
100.00
121
100.00
122
98.12
120
99.51
120
100.00
118
100.00
121
93.43
120
93.75
120
94.35
120
93.16
118
DGTPSM_ROBtwo views99.52
120
99.77
120
100.00
121
100.00
119
100.00
120
100.00
121
100.00
121
99.17
119
100.00
120
99.31
119
99.64
120
91.36
119
99.13
120
100.00
120
99.94
121
100.00
121
99.82
120
99.99
120
99.53
118
99.99
120
99.80
122
99.66
121
100.00
118
100.00
121
100.00
121
100.00
121
100.00
121
100.00
121
DPSMNet_ROBtwo views99.53
121
99.78
121
100.00
121