This table lists the benchmark results for the low-res many-view scenario.

(*) For exact definitions of the evaluated metrics, detailing how potentially incomplete ground truth is taken into account, see our paper.

The datasets are grouped into different categories, and result averages are computed for a category and method only if results of the method are available for all datasets within the category. Note that the category "all" includes both the high-res multi-view and the low-res many-view scenarios.
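The averaging rule above can be sketched in a few lines of Python. This is an illustrative sketch, not the benchmark's actual evaluation code; the grouping of the indoor category as the two storage-room datasets and the outdoor category as lakeside, sand box, and tunnel is an assumption that matches the averages shown in the table.

```python
# Illustrative sketch of the category-averaging rule (not the benchmark's
# actual evaluation code). Assumed dataset grouping for this scenario:
CATEGORIES = {
    "indoor": ["storage room", "storage room 2"],
    "outdoor": ["lakeside", "sand box", "tunnel"],
}

def category_average(results, category):
    """results: dict mapping dataset name -> score for one method.

    Returns the mean over the category's datasets, or None when any
    dataset is missing -- in that case no average is reported.
    """
    datasets = CATEGORIES[category]
    if not all(d in results for d in datasets):
        return None
    return sum(results[d] for d in datasets) / len(datasets)

# Example with the MVSNet row from the table below:
mvsnet = {"lakeside": 98.67, "sand box": 99.19, "storage room": 73.29,
          "storage room 2": 94.95, "tunnel": 90.53}
print(round(category_average(mvsnet, "indoor"), 2))   # 84.12
print(round(category_average(mvsnet, "outdoor"), 2))  # 96.13
```

Note that a method missing even one dataset of a category gets no category average at all, which is why some entries in the table have empty average cells.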

Methods with suffix _ROB may participate in the Robust Vision Challenge.

Click a dataset result cell to show a visualization of the reconstruction. For training datasets, ground truth and accuracy / completeness visualizations are also available. The visualizations may not work with mobile browsers.




Each cell shows the metric value followed by the method's rank for that column in parentheses. License markers (copyleft, permissive, binary) from the original listing are kept next to the method name.

| Method | all | low-res many-view | indoor | outdoor | lakeside | sand box | storage room | storage room 2 | tunnel |
|---|---|---|---|---|---|---|---|---|---|
| MVSNet | | 91.32 (8) | 84.12 (17) | 96.13 (4) | 98.67 (3) | 99.19 (1) | 73.29 (21) | 94.95 (3) | 90.53 (35) |
| CasMVSNet(SR_B) | | 89.61 (15) | 77.50 (31) | 97.68 (1) | 98.62 (4) | 99.07 (2) | 66.98 (34) | 88.02 (27) | 95.36 (10) |
| mvs_zhu_1030 | | 90.16 (12) | 81.94 (21) | 95.63 (5) | 94.87 (13) | 97.32 (3) | 69.88 (29) | 94.00 (8) | 94.70 (16) |
| 3Dnovator | 96.53 (2) | 93.94 (1) | 90.03 (3) | 96.55 (2) | 96.08 (8) | 96.95 (4) | 85.43 (3) | 94.62 (6) | 96.62 (3) |
| CasMVSNet(base) | | 85.15 (28) | 74.67 (37) | 92.14 (23) | 88.81 (31) | 96.80 (5) | 62.49 (43) | 86.85 (31) | 90.81 (34) |
| CasMVSNet(SR_A) | | 84.65 (31) | 76.33 (36) | 90.20 (29) | 84.56 (37) | 96.27 (6) | 65.84 (37) | 86.81 (32) | 89.77 (38) |
| OpenMVS (copyleft) | 94.22 (8) | 91.21 (10) | 86.10 (9) | 94.61 (8) | 93.08 (20) | 95.94 (7) | 79.58 (10) | 92.61 (11) | 94.82 (15) |
| LPCS | | 88.71 (22) | 80.50 (24) | 94.18 (13) | 91.38 (24) | 95.63 (8) | 75.15 (20) | 85.84 (36) | 95.53 (9) |
| Pnet-blend++ | | 89.08 (17) | 85.48 (11) | 91.49 (25) | 94.54 (15) | 95.61 (9) | 78.52 (11) | 92.44 (12) | 84.31 (42) |
| Pnet-blend | | 89.08 (17) | 85.48 (11) | 91.49 (25) | 94.54 (15) | 95.61 (9) | 78.52 (11) | 92.44 (12) | 84.31 (42) |
| 3Dnovator+ | 96.13 (3) | 92.29 (5) | 87.63 (5) | 95.40 (6) | 94.86 (14) | 95.31 (11) | 82.29 (6) | 92.97 (9) | 96.03 (5) |
| LTVRE_ROB | 96.88 (1) | 91.35 (7) | 86.15 (8) | 94.81 (7) | 95.87 (9) | 94.76 (12) | 85.06 (4) | 87.25 (30) | 93.80 (21) |
| GSE | | 86.24 (26) | 77.00 (32) | 92.39 (22) | 89.33 (30) | 93.63 (13) | 71.06 (27) | 82.94 (42) | 94.22 (19) |
| AttMVS | | 91.69 (6) | 84.94 (15) | 96.20 (3) | 99.02 (1) | 93.24 (14) | 77.86 (14) | 92.02 (16) | 96.33 (4) |
| PMVS (copyleft) | 89.60 (17) | 72.53 (51) | 40.67 (72) | 93.77 (16) | 92.28 (22) | 93.18 (15) | 48.78 (53) | 32.55 (80) | 95.84 (6) |
| test_1126 | | 91.28 (9) | 86.68 (7) | 94.36 (12) | 96.53 (7) | 93.15 (16) | 81.50 (7) | 91.85 (17) | 93.39 (23) |
| TAPA-MVS | 93.32 (12) | 89.43 (16) | 84.97 (14) | 92.40 (21) | 92.59 (21) | 92.76 (17) | 77.17 (16) | 92.77 (10) | 91.86 (28) |
| DeepC-MVS_fast | 94.34 (7) | 88.57 (24) | 82.00 (20) | 92.95 (19) | 91.15 (27) | 92.65 (18) | 75.30 (19) | 88.70 (24) | 95.05 (13) |
| tm-dncc | | 88.78 (20) | 80.08 (25) | 94.58 (9) | 95.72 (10) | 92.39 (19) | 72.62 (24) | 87.54 (28) | 95.64 (8) |
| tmmvs | | 91.01 (11) | 85.87 (10) | 94.43 (10) | 95.72 (10) | 92.39 (19) | 80.42 (8) | 91.32 (18) | 95.17 (12) |
| DeepC-MVS | 95.41 (4) | 89.62 (14) | 85.22 (13) | 92.56 (20) | 91.17 (26) | 92.38 (21) | 78.02 (13) | 92.43 (14) | 94.13 (20) |
| DeepPCF-MVS | 94.58 (5) | 89.02 (19) | 82.96 (19) | 93.05 (17) | 91.93 (23) | 92.37 (22) | 76.15 (18) | 89.78 (22) | 94.87 (14) |
| ANet-0.75 | | 70.67 (52) | 47.80 (65) | 85.91 (39) | 81.25 (47) | 92.34 (23) | 38.12 (63) | 57.47 (63) | 84.15 (44) |
| ANet | | 78.35 (44) | 67.13 (47) | 85.83 (40) | 81.24 (48) | 92.34 (23) | 57.90 (47) | 76.37 (48) | 83.92 (45) |
| P-MVSNet | | 88.77 (21) | 81.05 (22) | 93.92 (14) | 95.26 (12) | 91.17 (25) | 72.77 (23) | 89.32 (23) | 95.32 (11) |
| COLMAP_ROB (copyleft) | 94.48 (6) | 85.43 (27) | 78.72 (27) | 89.90 (30) | 84.17 (39) | 91.10 (26) | 68.97 (32) | 88.47 (26) | 94.42 (18) |
| vp_mvsnet | | 79.40 (43) | 61.80 (49) | 91.14 (28) | 93.13 (19) | 89.83 (27) | 28.94 (76) | 94.66 (4) | 90.45 (37) |
| HY-MVS | 91.43 (15) | 88.66 (23) | 84.44 (16) | 91.47 (27) | 91.38 (24) | 89.67 (28) | 76.53 (17) | 92.35 (15) | 93.36 (24) |
| Pnet-new- | | 92.40 (4) | 89.38 (4) | 94.42 (11) | 97.52 (5) | 88.91 (29) | 83.32 (5) | 95.44 (1) | 96.83 (2) |
| COLMAP(base) | | 85.07 (29) | 78.37 (28) | 89.55 (32) | 86.00 (33) | 88.85 (30) | 71.31 (26) | 85.43 (38) | 93.78 (22) |
| TAPA-MVS(SR) | | 89.94 (13) | 87.22 (6) | 91.74 (24) | 93.91 (18) | 88.71 (31) | 80.31 (9) | 94.13 (7) | 92.61 (25) |
| test_1205 | | 76.84 (45) | 69.16 (46) | 81.96 (45) | 82.96 (43) | 88.41 (32) | 62.91 (42) | 75.41 (50) | 74.53 (55) |
| Pnet_fast | | 79.81 (41) | 72.73 (42) | 84.52 (43) | 82.40 (44) | 88.35 (33) | 58.67 (45) | 86.79 (33) | 82.82 (49) |
| ACMP | 92.54 (13) | 80.70 (39) | 71.88 (43) | 86.59 (37) | 80.49 (49) | 88.25 (34) | 63.03 (41) | 80.72 (44) | 91.01 (32) |
| ACMM | 93.33 (11) | 81.98 (36) | 74.23 (38) | 87.15 (36) | 82.31 (45) | 88.08 (35) | 65.33 (38) | 83.12 (41) | 91.07 (31) |
| COLMAP(SR) | | 84.90 (30) | 79.01 (26) | 88.83 (34) | 86.24 (32) | 87.81 (36) | 72.86 (22) | 85.16 (39) | 92.45 (26) |
| PCF-MVS | 89.43 (18) | 82.97 (33) | 73.73 (39) | 89.13 (33) | 90.40 (28) | 87.68 (37) | 68.35 (33) | 79.10 (45) | 89.30 (39) |
| PVSNet_LR | | 81.05 (38) | 72.92 (41) | 86.47 (38) | 84.32 (38) | 87.64 (38) | 58.56 (46) | 87.27 (29) | 87.46 (40) |
| PLC (copyleft) | 91.02 (16) | 83.72 (32) | 77.86 (29) | 87.63 (35) | 83.31 (41) | 87.52 (39) | 70.85 (28) | 84.87 (40) | 92.08 (27) |
| test_1120 (copyleft) | | 93.24 (3) | 93.60 (2) | 93.00 (18) | 97.25 (6) | 85.92 (40) | 91.86 (2) | 95.34 (2) | 95.83 (7) |
| test_1124 | | 93.84 (2) | 93.94 (1) | 93.78 (15) | 98.74 (2) | 85.65 (41) | 93.23 (1) | 94.65 (5) | 96.94 (1) |
| A-TVSNet + Gipuma (copyleft) | | 72.66 (50) | 60.91 (51) | 80.50 (48) | 75.15 (55) | 84.76 (42) | 59.05 (44) | 62.76 (59) | 81.58 (51) |
| OpenMVS_ROB (copyleft) | 91.80 (14) | 87.38 (25) | 83.78 (18) | 89.79 (31) | 94.43 (17) | 83.34 (43) | 77.69 (15) | 89.88 (21) | 91.59 (30) |
| MVSCRF | | 75.50 (48) | 69.78 (45) | 79.31 (51) | 77.67 (54) | 82.99 (44) | 49.66 (51) | 89.90 (20) | 77.28 (53) |
| R-MVSNet | | 80.37 (40) | 76.91 (33) | 82.68 (44) | 83.10 (42) | 81.84 (45) | 65.20 (39) | 88.62 (25) | 83.11 (47) |
| CIDER | | 81.10 (37) | 81.01 (23) | 81.16 (47) | 78.95 (51) | 81.68 (46) | 71.35 (25) | 90.68 (19) | 82.84 (48) |
| ACMH+ | 93.58 (10) | 82.41 (34) | 77.68 (30) | 85.56 (42) | 85.31 (35) | 80.89 (47) | 69.59 (30) | 85.78 (37) | 90.47 (36) |
| IB-MVS | 85.98 (20) | 79.62 (42) | 76.62 (34) | 81.62 (46) | 85.10 (36) | 80.82 (48) | 66.55 (35) | 86.69 (35) | 78.96 (52) |
| ACMH | 93.61 (9) | 82.01 (35) | 76.51 (35) | 85.67 (41) | 85.78 (34) | 80.27 (49) | 66.29 (36) | 86.73 (34) | 90.97 (33) |
| CMPMVS (binary) | 73.10 (23) | 25.96 (83) | 1.21 (84) | 42.46 (68) | 48.37 (68) | 79.01 (50) | 2.43 (85) | 0.00 (84) | 0.00 (85) |
| PVSNet_0 | 81.89 (21) | 75.64 (47) | 69.86 (44) | 79.48 (50) | 78.69 (52) | 74.99 (51) | 63.13 (40) | 76.60 (47) | 84.77 (41) |
| BP-MVSNet | | 73.12 (49) | 63.32 (48) | 79.66 (49) | 81.44 (46) | 74.39 (52) | 52.82 (48) | 73.82 (51) | 83.14 (46) |
| unsupervisedMVS_cas | | 61.20 (54) | 49.84 (63) | 68.78 (54) | 63.77 (59) | 73.15 (53) | 38.43 (62) | 61.26 (60) | 69.42 (56) |
| PVSNet | 86.72 (19) | 76.20 (46) | 73.67 (40) | 77.88 (52) | 79.20 (50) | 72.57 (54) | 69.58 (31) | 77.77 (46) | 81.87 (50) |
| test_mvsss | | 46.43 (65) | 32.75 (80) | 55.54 (61) | 53.54 (65) | 61.33 (55) | 19.75 (81) | 45.75 (75) | 51.76 (68) |
| MVS_test_1 | | 60.08 (56) | 50.72 (59) | 66.32 (55) | 78.10 (53) | 52.08 (56) | 25.70 (79) | 75.74 (49) | 68.79 (57) |
| hgnet | | 44.78 (66) | 46.10 (66) | 43.90 (65) | 55.34 (63) | 51.87 (57) | 37.81 (64) | 54.38 (65) | 24.49 (80) |
| DPSNet | | 44.78 (66) | 46.10 (66) | 43.90 (65) | 55.34 (63) | 51.87 (57) | 37.81 (64) | 54.38 (65) | 24.49 (80) |
| example | | 36.07 (80) | 28.13 (82) | 41.37 (70) | 57.61 (62) | 48.26 (59) | 28.65 (78) | 27.61 (82) | 18.25 (82) |
| Pnet-eth | | 61.19 (55) | 57.15 (55) | 63.88 (56) | 83.71 (40) | 47.10 (60) | 41.45 (59) | 72.85 (53) | 60.82 (60) |
| CCVNet | | 46.86 (64) | 39.85 (76) | 51.54 (62) | 50.35 (67) | 46.44 (61) | 36.28 (68) | 43.42 (77) | 57.82 (62) |
| MVSNet_plusplus | | 65.61 (53) | 51.21 (58) | 75.21 (53) | 89.61 (29) | 44.24 (62) | 19.91 (80) | 82.51 (43) | 91.77 (29) |
| CPR_FA | | 50.17 (61) | 52.38 (56) | 48.69 (64) | 44.83 (70) | 42.31 (63) | 43.51 (58) | 61.26 (60) | 58.94 (61) |
| MVE (permissive) | 73.61 (22) | 42.73 (68) | 50.51 (60) | 37.54 (77) | 28.32 (83) | 41.81 (64) | 47.01 (55) | 54.00 (67) | 42.50 (73) |
| F/T MVSNet+Gipuma | | 59.72 (57) | 61.47 (50) | 58.55 (57) | 70.70 (56) | 40.01 (65) | 52.07 (49) | 70.87 (54) | 64.95 (58) |
| unMVSmet | | 49.47 (63) | 58.19 (53) | 43.66 (67) | 50.58 (66) | 39.49 (66) | 47.78 (54) | 68.60 (55) | 40.90 (74) |
| MVSNet + Gipuma | | 58.38 (58) | 60.00 (52) | 57.30 (58) | 68.90 (58) | 39.34 (67) | 51.70 (50) | 68.30 (56) | 63.65 (59) |
| RMVSNet | | 42.28 (70) | 49.56 (64) | 37.43 (78) | 46.49 (69) | 36.28 (68) | 46.14 (56) | 52.98 (68) | 29.51 (77) |
| Snet | | 53.20 (59) | 57.22 (54) | 50.52 (63) | 62.60 (60) | 36.23 (69) | 49.14 (52) | 65.30 (58) | 52.73 (66) |
| TVSNet | | 41.69 (71) | 43.53 (68) | 40.47 (73) | 34.72 (74) | 35.30 (70) | 36.98 (66) | 50.08 (69) | 51.38 (69) |
| unMVSv1 | | 30.79 (82) | 29.07 (81) | 31.95 (82) | 33.31 (75) | 35.18 (71) | 28.91 (77) | 29.23 (81) | 27.34 (79) |
| MVSNet_++ | | 50.06 (62) | 40.45 (73) | 56.46 (59) | 57.82 (61) | 34.95 (72) | 7.47 (84) | 73.43 (52) | 76.61 (54) |
| test3 | | 40.01 (76) | 42.34 (69) | 38.45 (75) | 33.05 (76) | 34.68 (73) | 36.66 (67) | 48.03 (70) | 47.61 (70) |
| SVVNet | | 40.48 (74) | 40.38 (74) | 40.54 (71) | 30.67 (81) | 34.64 (74) | 33.96 (71) | 46.80 (72) | 56.31 (64) |
| ternet | | 40.48 (74) | 40.38 (74) | 40.54 (71) | 30.67 (81) | 34.64 (74) | 33.96 (71) | 46.80 (72) | 56.31 (64) |
| QQQNet | | 41.37 (72) | 40.86 (71) | 41.70 (69) | 32.74 (78) | 34.55 (76) | 34.93 (70) | 46.80 (72) | 57.82 (62) |
| SGNet | | 36.94 (77) | 37.96 (77) | 36.27 (79) | 31.95 (79) | 33.84 (77) | 32.37 (73) | 43.54 (76) | 43.01 (72) |
| firsttry | | 36.63 (79) | 33.24 (79) | 38.88 (74) | 37.30 (73) | 33.37 (78) | 30.72 (74) | 35.75 (79) | 45.97 (71) |
| PSD-MVSNet | | 35.22 (81) | 35.50 (78) | 35.04 (80) | 31.65 (80) | 32.64 (79) | 29.99 (75) | 41.01 (78) | 40.82 (75) |
| Cas-MVS_preliminary | | 50.29 (60) | 41.63 (70) | 56.06 (60) | 43.18 (71) | 30.41 (80) | 35.30 (69) | 47.96 (71) | 94.60 (17) |
| confMetMVS | | 40.92 (73) | 49.99 (62) | 34.88 (81) | 42.05 (72) | 28.82 (81) | 45.31 (57) | 54.67 (64) | 33.76 (76) |
| A1Net | | 42.71 (69) | 50.00 (61) | 37.85 (76) | 32.83 (77) | 28.57 (82) | 41.36 (60) | 58.63 (62) | 52.14 (67) |
| metmvs_fine | | 36.70 (78) | 52.31 (57) | 26.30 (83) | 26.70 (84) | 24.23 (83) | 38.64 (61) | 65.98 (57) | 27.95 (78) |
| FADENet | | 1.00 (84) | 1.26 (83) | 0.82 (85) | 1.49 (86) | 0.55 (85) | 1.56 (86) | 0.96 (83) | 0.41 (84) |
| dnet | | 0.00 (85) | 0.00 (85) | 0.00 (86) | 0.00 (87) | 0.00 (86) | 0.00 (87) | 0.00 (84) | 0.00 (85) |

The following entries have results for only a subset of cells, and the source does not make clear which columns the values belong to; they are listed here as value (rank) pairs: test_robustmvs — 7.77 (84), 12.93 (85), 6.88 (84), 12.31 (83), 3.50 (83); test_MVS — 18.95 (82); UnsupFinetunedMVSNet — 70.70 (56).

Method publications:

- LTVRE_ROB: Andreas Kuhn, Heiko Hirschmüller, Daniel Scharstein, Helmut Mayer: A TV Prior for High-Quality Scalable Multi-View Stereo Reconstruction. International Journal of Computer Vision 2016
- PMVS: Y. Furukawa, J. Ponce: Accurate, Dense, and Robust Multiview Stereopsis. PAMI 2010
- TAPA-MVS: Andrea Romanoni, Matteo Matteucci: TAPA-MVS: Textureless-Aware PAtchMatch Multi-View Stereo. ICCV 2019
- DeepC-MVS, DeepC-MVS_fast: Andreas Kuhn, Christian Sormann, Mattia Rossi, Oliver Erdler, Friedrich Fraundorfer: DeepC-MVS: Deep Confidence Prediction for Multi-View Stereo Reconstruction. 3DV 2020
- COLMAP_ROB: Johannes L. Schönberger, Enliang Zheng, Marc Pollefeys, Jan-Michael Frahm: Pixelwise View Selection for Unstructured Multi-View Stereo. ECCV 2016
- ACMP: Qingshan Xu, Wenbing Tao: Planar Prior Assisted PatchMatch Multi-View Stereo. AAAI 2020
- ACMM, ACMH: Qingshan Xu, Wenbing Tao: Multi-Scale Geometric Consistency Guided Multi-View Stereo. CVPR 2019
- PCF-MVS: Andreas Kuhn, Shan Lin, Oliver Erdler: Plane Completion and Filtering for Multi-View Stereo Reconstruction. GCPR 2019
- PLC: Jie Liao, Yanping Fu, Qingan Yan, Chunxia Xiao: Pyramid Multi-View Stereo with Local Consistency. Pacific Graphics 2019
- CIDER: Qingshan Xu, Wenbing Tao: Learning Inverse Depth Regression for Multi-View Stereo with Correlation Cost Volume. AAAI 2020
- IB-MVS: Christian Sormann, Mattia Rossi, Andreas Kuhn, Friedrich Fraundorfer: IB-MVS: An Iterative Algorithm for Deep Multi-View Stereo based on Binary Decisions. BMVC 2021
- CMPMVS: M. Jancosek, T. Pajdla: Multi-View Reconstruction Preserving Weakly-Supported Surfaces. CVPR 2011
- BP-MVSNet: Christian Sormann, Patrick Knöbelreiter, Andreas Kuhn, Mattia Rossi, Thomas Pock, Friedrich Fraundorfer: BP-MVSNet: Belief-Propagation-Layers for Multi-View-Stereo. 3DV 2020
- MVE: Simon Fuhrmann, Fabian Langguth, Michael Goesele: MVE - A Multi-View Reconstruction Environment. EUROGRAPHICS Workshops on Graphics and Cultural Heritage 2014