This table lists the benchmark results for the low-res many-view scenario. Each result cell reports the evaluated metric (*) together with the method's rank for that column.

(*) For exact definitions, detailing how potentially incomplete ground truth is taken into account, see our paper.

The datasets are grouped into different categories, and result averages are computed for a category and method if results of the method are available for all datasets within the category. Note that the category "all" includes both the high-res multi-view and the low-res many-view scenarios.
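The averaging rule above can be sketched as follows. This is not the official evaluation code, and the category groupings (indoor = the two storage room scenes, outdoor = lakeside, sand box, and tunnel) are inferred from the numbers in the table rather than stated explicitly:

```python
# Sketch of how the category averages in this table appear to be formed
# (not the official evaluation code). A category average is only defined
# when a method has results for every dataset in the category.

INDOOR = ("storage room", "storage room 2")          # inferred grouping
OUTDOOR = ("lakeside", "sand box", "tunnel")         # inferred grouping

def category_average(results, datasets):
    """Mean over a category, or None if any dataset result is missing."""
    if any(d not in results for d in datasets):
        return None
    return sum(results[d] for d in datasets) / len(datasets)

# Per-dataset values for DeepPCF-MVS, taken from the table below.
deep_pcf_mvs = {
    "lakeside": 98.30, "sand box": 97.78,
    "storage room": 92.41, "storage room 2": 97.79, "tunnel": 97.12,
}

print(round(category_average(deep_pcf_mvs, INDOOR), 2))            # 95.1
print(round(category_average(deep_pcf_mvs, INDOOR + OUTDOOR), 2))  # 96.68
```

Note that the published category values appear to be averaged before rounding, so recomputing them from the rounded table cells can differ in the last digit (e.g. the outdoor average of DeepPCF-MVS recomputes to 97.73 against a listed 97.74).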

Methods with the suffix _ROB participate in the Robust Vision Challenge.





Method | all | low-res many-view | indoor | outdoor | lakeside | sand box | storage room | storage room 2 | tunnel
(each cell: value (rank); "n/a" = no result submitted for that column)
DeepPCF-MVS | 98.18 (3) | 96.68 (2) | 95.10 (3) | 97.74 (1) | 98.30 (1) | 97.78 (5) | 92.41 (4) | 97.79 (3) | 97.12 (3)
DeepC-MVS_fast | 98.69 (1) | 96.77 (1) | 95.30 (2) | 97.74 (1) | 98.24 (2) | 97.85 (3) | 92.74 (3) | 97.86 (2) | 97.13 (2)
Andreas Kuhn, Christian Sormann, Mattia Rossi, Oliver Erdler, Friedrich Fraundorfer: DeepC-MVS: Deep Confidence Prediction for Multi-View Stereo Reconstruction. 3DV 2020
DeepC-MVS | 98.35 (2) | 96.09 (4) | 94.20 (12) | 97.34 (3) | 97.94 (4) | 97.39 (11) | 90.85 (15) | 97.56 (4) | 96.71 (5)
Andreas Kuhn, Christian Sormann, Mattia Rossi, Oliver Erdler, Friedrich Fraundorfer: DeepC-MVS: Deep Confidence Prediction for Multi-View Stereo Reconstruction. 3DV 2020
PCF-MVS | 97.08 (14) | 95.68 (7) | 93.36 (16) | 97.23 (4) | 98.16 (3) | 96.93 (15) | 91.43 (12) | 95.29 (22) | 96.61 (6)
Andreas Kuhn, Shan Lin, Oliver Erdler: Plane Completion and Filtering for Multi-View Stereo Reconstruction. GCPR 2019
COLMAP(SR) | n/a | 96.07 (5) | 94.39 (9) | 97.19 (5) | 96.86 (8) | 97.92 (2) | 91.54 (11) | 97.24 (9) | 96.79 (4)
tm-dncc | n/a | 96.59 (3) | 95.98 (1) | 97.00 (6) | 96.82 (9) | 96.51 (21) | 94.07 (1) | 97.88 (1) | 97.68 (1)
ACMH+ | 97.24 (10) | 95.61 (9) | 93.77 (15) | 96.83 (7) | 96.69 (12) | 97.71 (7) | 90.70 (16) | 96.85 (13) | 96.09 (11)
ACMH | 97.28 (8) | 95.29 (13) | 93.25 (17) | 96.65 (8) | 96.32 (13) | 97.93 (1) | 89.82 (19) | 96.68 (16) | 95.69 (14)
Qingshan Xu and Wenbing Tao: Multi-Scale Geometric Consistency Guided Multi-View Stereo. CVPR 2019
ACMP | 97.20 (11) | 95.11 (14) | 93.01 (21) | 96.52 (9) | 95.68 (18) | 97.26 (13) | 89.74 (20) | 96.27 (18) | 96.61 (6)
Qingshan Xu and Wenbing Tao: Planar Prior Assisted PatchMatch Multi-View Stereo. AAAI 2020
TAPA-MVS(SR) | n/a | 95.82 (6) | 94.80 (4) | 96.49 (10) | 96.19 (14) | 97.58 (10) | 92.04 (5) | 97.56 (4) | 95.71 (13)
ACMM | 97.58 (5) | 95.68 (7) | 94.49 (7) | 96.48 (11) | 96.12 (15) | 97.20 (14) | 92.04 (5) | 96.94 (12) | 96.10 (10)
Qingshan Xu and Wenbing Tao: Multi-Scale Geometric Consistency Guided Multi-View Stereo. CVPR 2019
COLMAP(base) | n/a | 95.42 (11) | 94.03 (14) | 96.35 (12) | 95.96 (16) | 96.92 (16) | 91.10 (14) | 96.97 (10) | 96.16 (9)
TAPA-MVS | 97.07 (15) | 95.48 (10) | 94.55 (5) | 96.10 (13) | 95.68 (18) | 97.30 (12) | 91.73 (9) | 97.37 (7) | 95.32 (16)
Andrea Romanoni, Matteo Matteucci: TAPA-MVS: Textureless-Aware PAtchMatch Multi-View Stereo. ICCV 2019
PLC (copyleft) | 97.94 (4) | 95.35 (12) | 94.28 (11) | 96.06 (14) | 95.26 (22) | 96.59 (18) | 91.61 (10) | 96.96 (11) | 96.34 (8)
Jie Liao, Yanping Fu, Qingan Yan, Chunxia Xiao: Pyramid Multi-View Stereo with Local Consistency. Pacific Graphics 2019
LTVRE_ROB | 97.16 (12) | 95.10 (15) | 94.13 (13) | 95.75 (15) | 95.69 (17) | 95.73 (24) | 92.87 (2) | 95.39 (21) | 95.83 (12)
Andreas Kuhn, Heiko Hirschmüller, Daniel Scharstein, Helmut Mayer: A TV Prior for High-Quality Scalable Multi-View Stereo Reconstruction. International Journal of Computer Vision 2016
BP-MVSNet | n/a | 94.57 (18) | 93.20 (18) | 95.49 (16) | 96.77 (11) | 94.34 (33) | 89.67 (21) | 96.74 (14) | 95.35 (15)
Christian Sormann, Patrick Knöbelreiter, Andreas Kuhn, Mattia Rossi, Thomas Pock, Friedrich Fraundorfer: BP-MVSNet: Belief-Propagation-Layers for Multi-View-Stereo. 3DV 2020
3Dnovator | 97.25 (9) | 92.47 (22) | 87.97 (30) | 95.47 (17) | 97.93 (5) | 97.85 (3) | 86.02 (26) | 89.92 (32) | 90.64 (24)
GSE | n/a | 95.00 (16) | 94.33 (10) | 95.45 (18) | 94.81 (23) | 96.53 (20) | 91.98 (7) | 96.68 (16) | 95.02 (18)
COLMAP_ROB (copyleft) | 97.56 (6) | 94.43 (19) | 93.09 (19) | 95.33 (19) | 94.55 (24) | 96.37 (23) | 89.45 (22) | 96.72 (15) | 95.07 (17)
Johannes L. Schönberger, Enliang Zheng, Marc Pollefeys, Jan-Michael Frahm: Pixelwise View Selection for Unstructured Multi-View Stereo. ECCV 2016
A-TVSNet + Gipuma (copyleft) | n/a | 94.96 (17) | 94.54 (6) | 95.24 (20) | 95.31 (21) | 96.58 (19) | 91.82 (8) | 97.27 (8) | 93.83 (19)
3Dnovator+ | 97.12 (13) | 92.17 (23) | 87.57 (31) | 95.24 (20) | 97.49 (6) | 97.68 (8) | 85.24 (29) | 89.91 (33) | 90.54 (25)
OpenMVS (copyleft) | 96.50 (16) | 91.77 (25) | 86.65 (34) | 95.19 (22) | 97.38 (7) | 97.66 (9) | 84.60 (32) | 88.69 (38) | 90.53 (26)
IB-MVS | 95.67 (18) | 94.34 (20) | 94.41 (8) | 94.29 (23) | 95.45 (20) | 97.77 (6) | 91.37 (13) | 97.44 (6) | 89.67 (28)
Christian Sormann, Mattia Rossi, Andreas Kuhn and Friedrich Fraundorfer: IB-MVS: An Iterative Algorithm for Deep Multi-View Stereo based on Binary Decisions. BMVC 2021
HY-MVS | 97.30 (7) | 93.59 (21) | 93.02 (20) | 93.97 (24) | 92.85 (29) | 96.87 (17) | 90.03 (18) | 96.00 (19) | 92.19 (21)
tmmvs | n/a | 90.82 (27) | 86.27 (35) | 93.86 (25) | 96.82 (9) | 96.51 (21) | 83.91 (35) | 88.63 (39) | 88.24 (31)
PVSNet_0 | 94.43 (19) | 90.46 (28) | 85.77 (36) | 93.59 (26) | 93.82 (26) | 95.33 (27) | 84.62 (31) | 86.92 (41) | 91.61 (22)
PVSNet | 96.02 (17) | 89.22 (33) | 84.53 (40) | 92.35 (27) | 93.29 (28) | 94.60 (31) | 85.85 (27) | 83.20 (45) | 89.17 (29)
CIDER | n/a | 88.55 (34) | 83.13 (42) | 92.16 (28) | 93.95 (25) | 95.61 (26) | 83.79 (36) | 82.47 (48) | 86.92 (33)
Qingshan Xu and Wenbing Tao: Learning Inverse Depth Regression for Multi-View Stereo with Correlation Cost Volume. AAAI 2020
R-MVSNet | n/a | 90.93 (26) | 89.80 (24) | 91.68 (29) | 90.81 (34) | 94.15 (34) | 87.67 (23) | 91.93 (28) | 90.07 (27)
LPCS | n/a | 92.07 (24) | 92.93 (22) | 91.49 (30) | 90.32 (35) | 93.29 (35) | 90.66 (17) | 95.20 (23) | 90.87 (23)
OpenMVS_ROB (copyleft) | 92.34 (20) | 87.46 (36) | 81.75 (43) | 91.27 (31) | 92.55 (31) | 95.21 (29) | 81.21 (41) | 82.29 (50) | 86.05 (35)
PVSNet_LR | n/a | 89.56 (30) | 87.11 (32) | 91.19 (32) | 90.87 (33) | 94.45 (32) | 83.35 (37) | 90.86 (31) | 88.25 (30)
test_1126 | n/a | 89.66 (29) | 88.06 (29) | 90.73 (33) | 93.40 (27) | 95.29 (28) | 84.99 (30) | 91.13 (30) | 83.50 (39)
Pnet_fast | n/a | 80.38 (47) | 65.52 (72) | 90.29 (34) | 90.03 (37) | 94.91 (30) | 60.54 (72) | 70.51 (75) | 85.93 (36)
CPR_FA | n/a | 89.35 (32) | 88.37 (28) | 90.00 (35) | 89.38 (38) | 87.69 (46) | 87.29 (25) | 89.46 (36) | 92.93 (20)
test_1205 | n/a | 89.53 (31) | 89.08 (27) | 89.83 (36) | 92.63 (30) | 95.71 (25) | 85.34 (28) | 92.82 (27) | 81.15 (47)
test_mvsss | n/a | 85.35 (37) | 79.90 (44) | 88.98 (37) | 90.30 (36) | 89.87 (42) | 68.09 (58) | 91.71 (29) | 86.77 (34)
ANet-0.75 | n/a | 87.89 (35) | 89.58 (25) | 86.76 (38) | 84.37 (52) | 90.29 (39) | 84.51 (33) | 94.65 (24) | 85.61 (37)
Pnet-new- | n/a | 79.46 (49) | 69.54 (65) | 86.08 (39) | 84.76 (46) | 90.31 (38) | 65.13 (62) | 73.95 (70) | 83.16 (41)
unsupervisedMVS_cas | n/a | 83.12 (42) | 78.75 (45) | 86.04 (40) | 85.85 (42) | 89.90 (41) | 75.64 (46) | 81.86 (52) | 82.36 (42)
CasMVSNet(SR_A) | n/a | 79.00 (50) | 69.56 (64) | 85.29 (41) | 85.95 (41) | 92.82 (36) | 62.08 (69) | 77.05 (61) | 77.11 (52)
ANet | n/a | 85.16 (39) | 85.20 (38) | 85.14 (42) | 84.37 (52) | 90.29 (39) | 82.00 (39) | 88.39 (40) | 80.78 (49)
mvs_zhu_1030 | n/a | 82.18 (44) | 78.09 (50) | 84.91 (43) | 82.35 (57) | 88.99 (43) | 71.71 (54) | 84.46 (44) | 83.39 (40)
CasMVSNet(base) | n/a | 78.10 (55) | 68.51 (68) | 84.50 (44) | 85.47 (43) | 91.56 (37) | 59.79 (75) | 77.22 (60) | 76.47 (53)
unMVSv1 | n/a | 83.89 (40) | 84.37 (41) | 83.58 (45) | 84.00 (56) | 85.63 (49) | 83.17 (38) | 85.56 (42) | 81.11 (48)
MVSNet_plusplus | n/a | 76.50 (59) | 66.38 (70) | 83.25 (46) | 91.25 (32) | 73.79 (66) | 53.62 (81) | 79.14 (57) | 84.70 (38)
MVS_test_1 | n/a | 80.50 (46) | 77.21 (52) | 82.70 (47) | 86.63 (40) | 82.92 (55) | 64.61 (63) | 89.81 (34) | 78.53 (51)
metmvs_fine | n/a | 83.50 (41) | 84.90 (39) | 82.56 (48) | 84.72 (47) | 81.80 (59) | 81.09 (42) | 88.72 (37) | 81.17 (46)
AttMVS | n/a | 80.32 (48) | 78.27 (48) | 81.68 (49) | 78.98 (59) | 86.16 (48) | 79.76 (44) | 76.79 (65) | 79.89 (50)
P-MVSNet | n/a | 85.35 (37) | 91.57 (23) | 81.21 (50) | 78.76 (60) | 82.87 (56) | 87.56 (24) | 95.58 (20) | 82.00 (44)
Pnet-blend++ | n/a | 78.13 (53) | 73.62 (59) | 81.13 (51) | 84.46 (50) | 88.20 (44) | 66.50 (59) | 80.74 (54) | 70.74 (69)
Snet | n/a | 74.39 (61) | 64.28 (74) | 81.13 (51) | 88.04 (39) | 73.18 (69) | 60.15 (73) | 68.41 (76) | 82.18 (43)
Pnet-blend | n/a | 78.13 (53) | 73.62 (59) | 81.13 (51) | 84.46 (50) | 88.20 (44) | 66.50 (59) | 80.74 (54) | 70.74 (69)
hgnet | n/a | 78.82 (51) | 76.20 (54) | 80.57 (54) | 84.50 (48) | 82.32 (57) | 75.56 (47) | 76.84 (62) | 74.88 (59)
DPSNet | n/a | 78.82 (51) | 76.20 (54) | 80.57 (54) | 84.50 (48) | 82.32 (57) | 75.56 (47) | 76.84 (62) | 74.88 (59)
example | n/a | 74.48 (60) | 65.59 (71) | 80.41 (56) | 84.10 (55) | 83.16 (54) | 62.82 (68) | 68.36 (77) | 73.98 (63)
MVSNet_++ | n/a | 71.81 (69) | 59.04 (80) | 80.33 (57) | 84.86 (45) | 74.23 (65) | 33.06 (84) | 85.02 (43) | 81.88 (45)
MVSCRF | n/a | 77.21 (57) | 74.57 (57) | 78.98 (58) | 79.29 (58) | 83.87 (53) | 75.01 (49) | 74.12 (69) | 73.78 (64)
A1Net | n/a | 82.90 (43) | 89.39 (26) | 78.57 (59) | 85.19 (44) | 63.18 (74) | 84.36 (34) | 94.42 (25) | 87.34 (32)
test_1124 | n/a | 71.63 (70) | 63.97 (75) | 76.73 (60) | 69.39 (76) | 85.55 (50) | 56.73 (76) | 71.21 (72) | 75.26 (57)
RMVSNet | n/a | 80.62 (45) | 86.71 (33) | 76.57 (61) | 84.14 (54) | 78.09 (61) | 80.22 (43) | 93.20 (26) | 67.48 (71)
vp_mvsnet | n/a | 69.76 (73) | 60.05 (79) | 76.24 (62) | 76.16 (62) | 87.57 (47) | 48.96 (83) | 71.14 (73) | 64.99 (73)
MVSNet | n/a | 73.65 (62) | 70.01 (63) | 76.07 (63) | 76.39 (61) | 77.59 (62) | 61.93 (70) | 78.09 (59) | 74.23 (62)
MVE (permissive) | 76.82 (21) | 76.63 (58) | 77.52 (51) | 76.03 (64) | 70.40 (74) | 85.49 (51) | 75.68 (45) | 79.36 (56) | 72.20 (66)
Simon Fuhrmann, Fabian Langguth, Michael Goesele: MVE - A Multi-View Reconstruction Environment. EUROGRAPHICS Workshops on Graphics and Cultural Heritage (2014)
MVSNet + Gipuma | n/a | 72.61 (66) | 71.20 (61) | 73.54 (65) | 75.39 (64) | 73.46 (68) | 65.60 (61) | 76.81 (64) | 71.78 (67)
F/T MVSNet+Gipuma | n/a | 69.80 (72) | 64.46 (73) | 73.36 (66) | 75.05 (65) | 73.57 (67) | 60.74 (71) | 68.18 (78) | 71.46 (68)
Pnet-eth | n/a | 77.75 (56) | 85.52 (37) | 72.57 (67) | 75.57 (63) | 65.89 (72) | 81.44 (40) | 89.61 (35) | 76.26 (54)
unMVSmet | n/a | 66.94 (75) | 60.67 (78) | 71.11 (68) | 71.91 (70) | 79.70 (60) | 54.62 (78) | 66.73 (79) | 61.74 (74)
firsttry | n/a | 72.49 (67) | 75.56 (56) | 70.44 (69) | 68.80 (77) | 70.25 (70) | 69.31 (56) | 81.82 (53) | 72.28 (65)
PSD-MVSNet | n/a | 73.38 (63) | 78.51 (46) | 69.96 (70) | 72.98 (68) | 62.49 (76) | 74.33 (50) | 82.69 (47) | 74.41 (61)
SGNet | n/a | 73.32 (64) | 78.38 (47) | 69.94 (71) | 72.09 (69) | 62.62 (75) | 73.95 (51) | 82.81 (46) | 75.11 (58)
test3 | n/a | 73.10 (65) | 78.15 (49) | 69.73 (72) | 71.16 (71) | 62.14 (78) | 73.89 (52) | 82.42 (49) | 75.89 (56)
TVSNet | n/a | 72.34 (68) | 76.78 (53) | 69.38 (73) | 70.64 (73) | 61.58 (80) | 71.28 (55) | 82.27 (51) | 75.92 (55)
CasMVSNet(SR_B) | n/a | 70.07 (71) | 71.18 (62) | 69.33 (74) | 69.73 (75) | 77.06 (63) | 64.06 (64) | 78.31 (58) | 61.20 (75)
PMVS (copyleft) | 70.75 (22) | 58.19 (81) | 43.22 (82) | 68.18 (75) | 73.06 (67) | 84.09 (52) | 54.23 (79) | 32.21 (82) | 47.37 (81)
Y. Furukawa, J. Ponce: Accurate, dense, and robust multiview stereopsis. PAMI (2010)
confMetMVS | n/a | 63.51 (77) | 60.95 (77) | 65.21 (76) | 60.95 (82) | 76.48 (64) | 55.53 (77) | 66.38 (80) | 58.21 (76)
QQQNet | n/a | 67.18 (74) | 74.04 (58) | 62.60 (77) | 71.02 (72) | 62.30 (77) | 73.86 (53) | 74.22 (66) | 54.48 (79)
CCVNet | n/a | 64.24 (76) | 67.70 (69) | 61.94 (78) | 65.09 (78) | 66.24 (71) | 63.67 (65) | 71.73 (71) | 54.48 (79)
test_1120 (copyleft) | n/a | 58.74 (80) | 57.03 (81) | 59.88 (79) | 49.42 (84) | 64.79 (73) | 49.65 (82) | 64.41 (81) | 65.45 (72)
SVVNet | n/a | 62.40 (78) | 68.72 (66) | 58.19 (80) | 63.36 (79) | 54.86 (81) | 63.22 (66) | 74.22 (66) | 56.34 (77)
ternet | n/a | 62.40 (78) | 68.72 (66) | 58.19 (80) | 63.36 (79) | 54.86 (81) | 63.22 (66) | 74.22 (66) | 56.34 (77)
Cas-MVS_preliminary | n/a | 56.71 (82) | 62.57 (76) | 52.80 (82) | 50.07 (83) | 61.96 (79) | 54.15 (80) | 71.00 (74) | 46.38 (82)
test_robustmvs | n/a | n/a | n/a | 44.81 (83) | 63.28 (81) | 45.49 (83) | 59.95 (74) | n/a | 25.66 (83)
FADENet | n/a | 18.58 (83) | 15.51 (83) | 20.63 (84) | 21.45 (85) | 32.74 (85) | 15.09 (85) | 15.93 (83) | 7.68 (84)
CMPMVS (binary) | 69.68 (23) | 11.01 (84) | 0.97 (84) | 17.70 (85) | 10.09 (86) | 43.01 (84) | 1.93 (86) | 0.00 (84) | 0.00 (85)
M. Jancosek, T. Pajdla: Multi-View Reconstruction Preserving Weakly-Supported Surfaces. CVPR 2011
dnet | n/a | 0.00 (85) | 0.00 (85) | 0.00 (86) | 0.00 (87) | 0.00 (86) | 0.00 (87) | 0.00 (84) | 0.00 (85)
test_MVS | single result: 69.14 (57)
UnsupFinetunedMVSNet | single result: 75.05 (65)