This table lists the benchmark results for the low-res many-view scenario. The following metrics are evaluated:

(*) For exact definitions, detailing how potentially incomplete ground truth is taken into account, see our paper.

The datasets are grouped into categories, and an average is computed for a given category and method only if the method has results for every dataset in that category. Note that the category "all" includes both the high-res multi-view and the low-res many-view scenarios.
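This averaging rule can be sketched as follows. The dataset groupings (indoor = the two storage room scenes; outdoor = lakeside, sand box, and tunnel) and the use of a plain arithmetic mean are inferred from the listed numbers rather than stated explicitly on this page, so treat them as an assumption; last digits can differ by 0.01 because the displayed averages appear to be computed from unrounded per-dataset scores.

```python
# Sketch of the category-averaging rule described above. The groupings and the
# plain arithmetic mean are inferred from the listed numbers (e.g. P-MVSNet's
# indoor average, shown as 94.13, matches the mean of its two storage room
# scores), not stated on the page itself.
CATEGORIES = {
    "indoor": ["storage room", "storage room 2"],
    "outdoor": ["lakeside", "sand box", "tunnel"],
    "low-res many-view": ["lakeside", "sand box", "storage room",
                          "storage room 2", "tunnel"],
}

def category_average(results, category):
    """Return the mean score over a category's datasets, or None when the
    method lacks a result for any dataset in the category (in that case
    the benchmark reports no average)."""
    datasets = CATEGORIES[category]
    if any(d not in results for d in datasets):
        return None
    return sum(results[d] for d in datasets) / len(datasets)

# P-MVSNet's per-dataset scores from the table:
p_mvsnet = {"lakeside": 98.93, "sand box": 99.94, "storage room": 90.70,
            "storage room 2": 97.55, "tunnel": 99.94}
print(category_average(p_mvsnet, "indoor"))           # ~94.125, shown as 94.13
print(category_average({"tunnel": 99.94}, "indoor"))  # None: incomplete coverage
```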

Methods with the suffix _ROB may participate in the Robust Vision Challenge.

Click a dataset result cell to show a visualization of the reconstruction. For training datasets, ground truth and accuracy / completeness visualizations are also available. The visualizations may not work with mobile browsers.




The table is sorted by the tunnel column. Each cell shows a method's score together with its rank in that column in parentheses. A cell is left empty where a method has no result for that dataset, or where a category average is unavailable because the method lacks results on some dataset in the category.

| Method | all | low-res many-view | indoor | outdoor | lakeside | sand box | storage room | storage room 2 | tunnel |
|---|---|---|---|---|---|---|---|---|---|
| P-MVSNet | | 97.42 (17) | 94.13 (29) | 99.61 (4) | 98.93 (16) | 99.94 (1) | 90.70 (29) | 97.55 (29) | 99.94 (1) |
| AttMVS | | 98.10 (5) | 95.45 (16) | 99.86 (1) | 99.84 (1) | 99.87 (3) | 93.29 (15) | 97.60 (27) | 99.87 (2) |
| 3Dnovator | 99.15 (2) | 98.46 (3) | 96.92 (5) | 99.48 (7) | 99.05 (14) | 99.63 (14) | 95.44 (6) | 98.41 (16) | 99.78 (3) |
| HY-MVS | 98.23 (9) | 98.26 (4) | 96.37 (6) | 99.52 (6) | 99.21 (11) | 99.65 (11) | 93.56 (13) | 99.18 (2) | 99.69 (4) |
| 3Dnovator+ | 98.92 (3) | 97.89 (7) | 95.99 (9) | 99.16 (12) | 98.56 (27) | 99.31 (21) | 94.11 (9) | 97.86 (22) | 99.63 (5) |
| OpenMVS (copyleft) | 98.12 (10) | 97.86 (8) | 95.83 (11) | 99.21 (11) | 98.55 (28) | 99.47 (18) | 93.67 (11) | 97.98 (20) | 99.61 (6) |
| MVSNet | | 97.77 (11) | 94.82 (23) | 99.73 (2) | 99.71 (4) | 99.87 (3) | 90.82 (28) | 98.81 (9) | 99.61 (6) |
| mvs_zhu_1030 | | 97.36 (18) | 94.00 (30) | 99.61 (4) | 99.39 (9) | 99.82 (6) | 88.90 (37) | 99.10 (3) | 99.60 (8) |
| test_1124 | | 98.99 (1) | 97.89 (4) | 99.72 (3) | 99.70 (5) | 99.84 (5) | 98.10 (1) | 97.68 (24) | 99.60 (8) |
| vp_mvsnet | | 92.70 (53) | 82.77 (71) | 99.32 (9) | 99.36 (10) | 99.00 (24) | 66.62 (78) | 98.92 (7) | 99.58 (10) |
| LPCS | | 97.58 (13) | 95.27 (19) | 99.12 (13) | 98.44 (30) | 99.37 (19) | 93.66 (12) | 96.88 (36) | 99.56 (11) |
| TAPA-MVS(SR) | | 97.85 (9) | 96.03 (8) | 99.07 (15) | 98.53 (29) | 99.13 (23) | 93.55 (14) | 98.51 (14) | 99.55 (12) |
| tmmvs | | 97.58 (13) | 95.49 (14) | 98.98 (18) | 98.85 (18) | 98.54 (28) | 93.76 (10) | 97.22 (32) | 99.53 (13) |
| MVSNet_plusplus | | 85.44 (63) | 75.33 (77) | 92.19 (57) | 99.09 (13) | 77.97 (65) | 54.19 (82) | 96.48 (43) | 99.49 (14) |
| DeepC-MVS_fast | 98.47 (5) | 96.63 (25) | 94.29 (28) | 98.19 (28) | 97.66 (38) | 97.50 (38) | 91.63 (23) | 96.95 (34) | 99.41 (15) |
| PMVS (copyleft) | 92.94 (21) | 78.84 (69) | 50.69 (82) | 97.61 (34) | 95.96 (50) | 97.48 (39) | 65.74 (79) | 35.64 (82) | 99.38 (16) |
| tm-dncc | | 96.71 (22) | 93.39 (36) | 98.92 (21) | 98.85 (18) | 98.54 (28) | 89.98 (33) | 96.81 (37) | 99.36 (17) |
| ACMH | 98.42 (6) | 95.63 (33) | 92.35 (43) | 97.81 (31) | 95.84 (51) | 98.24 (31) | 88.19 (42) | 96.51 (41) | 99.36 (17) |
| DeepC-MVS | 98.90 (4) | 97.16 (20) | 95.29 (18) | 98.41 (26) | 98.16 (34) | 97.76 (34) | 92.20 (21) | 98.38 (17) | 99.32 (19) |
| PVSNet_LR | | 97.57 (15) | 95.45 (16) | 98.98 (18) | 99.00 (15) | 98.67 (26) | 92.23 (20) | 98.67 (11) | 99.27 (20) |
| GSE | | 96.89 (21) | 94.34 (27) | 98.60 (24) | 97.60 (40) | 98.97 (25) | 92.57 (19) | 96.11 (47) | 99.23 (21) |
| ACMH+ | 98.40 (8) | 95.77 (31) | 92.95 (37) | 97.64 (33) | 96.31 (49) | 97.47 (40) | 89.65 (35) | 96.26 (44) | 99.15 (22) |
| TAPA-MVS | 97.92 (13) | 97.67 (12) | 95.54 (13) | 99.09 (14) | 98.77 (20) | 99.34 (20) | 93.04 (16) | 98.04 (19) | 99.15 (22) |
| DeepPCF-MVS | 98.42 (6) | 96.59 (26) | 94.42 (25) | 98.03 (29) | 97.86 (36) | 97.10 (41) | 91.48 (24) | 97.36 (30) | 99.14 (24) |
| Pnet-new- | | 98.02 (6) | 95.95 (10) | 99.40 (8) | 99.49 (8) | 99.61 (15) | 94.31 (7) | 97.59 (28) | 99.11 (25) |
| OpenMVS_ROB (copyleft) | 97.31 (17) | 97.22 (19) | 95.18 (20) | 98.58 (25) | 98.69 (24) | 97.95 (33) | 92.61 (18) | 97.75 (23) | 99.10 (26) |
| COLMAP(SR) | | 96.17 (29) | 93.90 (32) | 97.69 (32) | 96.39 (48) | 97.64 (36) | 91.16 (26) | 96.64 (39) | 99.04 (27) |
| COLMAP_ROB (copyleft) | 98.06 (12) | 94.72 (45) | 92.05 (44) | 96.50 (42) | 93.64 (61) | 96.86 (42) | 87.59 (45) | 96.51 (41) | 99.02 (28) |
| CasMVSNet(SR_A) | | 95.29 (36) | 89.69 (55) | 99.02 (16) | 98.59 (25) | 99.48 (17) | 83.78 (59) | 95.60 (50) | 98.99 (29) |
| ACMP | 97.51 (14) | 93.81 (49) | 89.89 (53) | 96.42 (43) | 93.98 (59) | 96.31 (47) | 85.39 (54) | 94.39 (58) | 98.97 (30) |
| COLMAP(base) | | 95.23 (37) | 92.81 (40) | 96.85 (39) | 95.06 (55) | 96.52 (43) | 89.63 (36) | 95.99 (48) | 98.96 (31) |
| test_1120 (copyleft) | | 98.89 (2) | 98.31 (1) | 99.28 (10) | 99.17 (12) | 99.79 (7) | 97.81 (2) | 98.81 (9) | 98.87 (32) |
| LTVRE_ROB | 99.19 (1) | 97.54 (16) | 95.77 (12) | 98.71 (23) | 98.01 (35) | 99.31 (21) | 96.08 (5) | 95.46 (53) | 98.82 (33) |
| R-MVSNet | | 96.56 (27) | 94.43 (24) | 97.98 (30) | 97.61 (39) | 97.54 (37) | 90.39 (31) | 98.47 (15) | 98.78 (34) |
| PLC (copyleft) | 97.35 (16) | 94.93 (40) | 92.81 (40) | 96.35 (44) | 94.63 (56) | 95.74 (49) | 90.06 (32) | 95.56 (52) | 98.69 (35) |
| ACMM | 98.09 (11) | 94.80 (42) | 91.85 (45) | 96.76 (40) | 95.11 (54) | 96.48 (44) | 87.94 (44) | 95.76 (49) | 98.69 (35) |
| CasMVSNet(base) | | 94.85 (41) | 88.68 (57) | 98.96 (20) | 98.70 (23) | 99.67 (8) | 82.37 (61) | 94.99 (55) | 98.50 (37) |
| BP-MVSNet | | 96.44 (28) | 93.68 (33) | 98.28 (27) | 98.32 (31) | 98.10 (32) | 89.68 (34) | 97.67 (25) | 98.43 (38) |
| MVSCRF | | 94.80 (42) | 92.89 (38) | 96.08 (46) | 93.77 (60) | 96.33 (46) | 86.18 (50) | 99.60 (1) | 98.14 (39) |
| PCF-MVS | 96.03 (18) | 94.34 (47) | 90.51 (51) | 96.89 (37) | 96.93 (47) | 95.66 (50) | 87.59 (45) | 93.42 (61) | 98.08 (40) |
| F/T MVSNet+Gipuma | | 91.90 (55) | 93.60 (34) | 90.76 (58) | 97.37 (43) | 77.09 (67) | 88.37 (38) | 98.83 (8) | 97.83 (41) |
| MVS_test_1 | | 87.21 (60) | 82.26 (72) | 90.51 (59) | 97.39 (42) | 76.51 (69) | 67.94 (77) | 96.58 (40) | 97.62 (42) |
| unMVSmet | | 94.27 (48) | 90.80 (49) | 96.58 (41) | 98.30 (32) | 93.88 (54) | 86.93 (49) | 94.67 (56) | 97.57 (43) |
| CasMVSNet(SR_B) | | 95.57 (34) | 90.42 (52) | 99.00 (17) | 99.52 (7) | 99.94 (1) | 84.68 (58) | 96.16 (45) | 97.54 (44) |
| test_1126 | | 97.78 (10) | 96.13 (7) | 98.89 (22) | 99.62 (6) | 99.52 (16) | 94.30 (8) | 97.95 (21) | 97.53 (45) |
| MVSNet + Gipuma | | 91.67 (56) | 93.45 (35) | 90.48 (60) | 97.40 (41) | 76.54 (68) | 88.37 (38) | 98.52 (13) | 97.50 (46) |
| ANet-0.75 | | 92.64 (54) | 85.79 (63) | 97.20 (35) | 94.49 (57) | 99.66 (9) | 79.52 (63) | 92.06 (64) | 97.46 (47) |
| Cas-MVS_preliminary | | 86.14 (61) | 81.69 (73) | 89.11 (61) | 92.91 (62) | 77.14 (66) | 71.77 (74) | 91.61 (65) | 97.27 (48) |
| PVSNet_0 | 95.53 (19) | 94.76 (44) | 92.88 (39) | 96.02 (47) | 97.06 (45) | 94.33 (51) | 88.13 (43) | 97.63 (26) | 96.66 (49) |
| ANet | | 94.58 (46) | 91.17 (47) | 96.86 (38) | 94.49 (57) | 99.66 (9) | 85.16 (55) | 97.18 (33) | 96.43 (50) |
| A-TVSNet + Gipuma (copyleft) | | 95.08 (39) | 93.98 (31) | 95.81 (49) | 95.21 (53) | 96.00 (48) | 91.24 (25) | 96.73 (38) | 96.22 (51) |
| confMetMVS | | 93.00 (52) | 90.74 (50) | 94.52 (54) | 98.59 (25) | 89.04 (55) | 85.90 (52) | 95.57 (51) | 95.92 (52) |
| CPR_FA | | 88.38 (57) | 88.27 (58) | 88.46 (62) | 88.24 (67) | 81.63 (59) | 87.03 (48) | 89.51 (73) | 95.52 (53) |
| MVSNet_++ | | 76.28 (73) | 63.91 (81) | 84.54 (65) | 87.95 (68) | 70.45 (72) | 33.49 (84) | 94.32 (59) | 95.21 (54) |
| PVSNet | 97.47 (15) | 95.18 (38) | 94.37 (26) | 95.72 (52) | 97.83 (37) | 94.18 (52) | 90.68 (30) | 98.06 (18) | 95.13 (55) |
| CIDER | | 95.78 (30) | 95.47 (15) | 95.98 (48) | 97.01 (46) | 96.43 (45) | 91.90 (22) | 99.04 (4) | 94.51 (56) |
| Pnet_fast | | 95.36 (35) | 92.80 (42) | 97.07 (36) | 98.75 (21) | 98.31 (30) | 88.27 (41) | 97.33 (31) | 94.14 (57) |
| unsupervisedMVS_cas | | 87.73 (58) | 80.56 (75) | 92.51 (56) | 90.02 (66) | 94.04 (53) | 72.99 (72) | 88.12 (75) | 93.47 (58) |
| IB-MVS | 95.41 (20) | 95.66 (32) | 94.83 (22) | 96.21 (45) | 98.30 (32) | 97.74 (35) | 91.10 (27) | 98.57 (12) | 92.58 (59) |
| Pnet-eth | | 93.70 (50) | 94.85 (21) | 92.94 (55) | 98.90 (17) | 88.78 (56) | 92.79 (17) | 96.91 (35) | 91.13 (60) |
| CCVNet | | 78.62 (70) | 84.56 (65) | 74.66 (72) | 70.01 (80) | 63.28 (73) | 78.75 (64) | 90.36 (71) | 90.68 (61) |
| QQQNet | | 76.16 (74) | 83.14 (69) | 71.50 (74) | 70.80 (76) | 53.03 (79) | 75.46 (69) | 90.83 (67) | 90.68 (61) |
| SVVNet | | 75.58 (76) | 83.42 (67) | 70.36 (76) | 66.75 (82) | 54.11 (76) | 76.01 (67) | 90.83 (67) | 90.21 (63) |
| ternet | | 75.58 (76) | 83.42 (67) | 70.36 (76) | 66.75 (82) | 54.11 (76) | 76.01 (67) | 90.83 (67) | 90.21 (63) |
| test_mvsss | | 82.49 (68) | 73.62 (79) | 88.40 (63) | 90.79 (65) | 84.94 (57) | 59.37 (80) | 87.86 (76) | 89.47 (65) |
| metmvs_fine | | 87.49 (59) | 89.83 (54) | 85.92 (64) | 91.28 (64) | 78.30 (64) | 83.54 (60) | 96.12 (46) | 88.18 (66) |
| Pnet-blend++ | | 96.65 (23) | 98.04 (2) | 95.73 (50) | 99.77 (2) | 99.65 (11) | 97.16 (3) | 98.93 (5) | 87.77 (67) |
| Pnet-blend | | 96.65 (23) | 98.04 (2) | 95.73 (50) | 99.77 (2) | 99.65 (11) | 97.16 (3) | 98.93 (5) | 87.77 (67) |
| test_1205 | | 93.55 (51) | 91.37 (46) | 95.00 (53) | 98.73 (22) | 98.60 (27) | 88.30 (40) | 94.43 (57) | 87.67 (69) |
| hgnet | | 84.50 (64) | 86.06 (61) | 83.46 (67) | 85.83 (70) | 78.98 (62) | 84.69 (56) | 87.44 (78) | 85.56 (70) |
| DPSNet | | 84.50 (64) | 86.06 (61) | 83.46 (67) | 85.83 (70) | 78.98 (62) | 84.69 (56) | 87.44 (78) | 85.56 (70) |
| example | | 78.46 (71) | 69.47 (80) | 84.45 (66) | 87.90 (69) | 79.95 (61) | 69.21 (76) | 69.72 (81) | 85.51 (72) |
| Snet | | 83.91 (66) | 89.25 (56) | 80.34 (70) | 95.66 (52) | 59.91 (75) | 85.81 (53) | 92.69 (62) | 85.46 (73) |
| TVSNet | | 76.50 (72) | 85.63 (64) | 70.41 (75) | 72.28 (75) | 53.63 (78) | 77.80 (65) | 93.46 (60) | 85.31 (74) |
| RMVSNet | | 85.99 (62) | 91.12 (48) | 82.56 (69) | 92.56 (63) | 71.89 (71) | 87.23 (47) | 95.02 (54) | 83.24 (75) |
| A1Net | | 75.91 (75) | 86.79 (60) | 68.65 (79) | 76.15 (72) | 46.80 (83) | 81.23 (62) | 92.34 (63) | 83.01 (76) |
| firsttry | | 75.37 (78) | 82.99 (70) | 70.28 (78) | 67.26 (81) | 62.42 (74) | 75.38 (70) | 90.60 (70) | 81.16 (77) |
| test3 | | 74.14 (79) | 83.69 (66) | 67.77 (80) | 70.24 (79) | 52.13 (81) | 76.30 (66) | 91.09 (66) | 80.94 (78) |
| MVE (permissive) | 92.54 (22) | 82.65 (67) | 87.52 (59) | 79.40 (71) | 75.10 (73) | 82.70 (58) | 85.94 (51) | 89.11 (74) | 80.40 (79) |
| SGNet | | 72.64 (81) | 81.38 (74) | 66.81 (81) | 70.44 (78) | 52.48 (80) | 72.96 (73) | 89.80 (72) | 77.50 (80) |
| PSD-MVSNet | | 71.12 (82) | 79.18 (76) | 65.75 (82) | 70.74 (77) | 51.62 (82) | 70.68 (75) | 87.67 (77) | 74.90 (81) |
| unMVSv1 | | 73.78 (80) | 74.27 (78) | 73.45 (73) | 74.10 (74) | 75.59 (70) | 73.49 (71) | 75.05 (80) | 70.66 (82) |
| test_robustmvs | | | | 32.21 (84) | 49.22 (85) | 30.09 (84) | 47.07 (83) | | 17.32 (83) |
| FADENet | | 15.30 (84) | 10.26 (83) | 18.67 (85) | 28.34 (86) | 22.58 (85) | 10.01 (85) | 10.51 (83) | 5.07 (84) |
| dnet | | 0.00 (85) | 0.00 (85) | 0.00 (86) | 0.00 (87) | 0.00 (86) | 0.00 (87) | 0.00 (84) | 0.00 (85) |
| CMPMVS (binary) | 77.52 (23) | 27.18 (83) | 1.74 (84) | 44.14 (83) | 51.76 (84) | 80.67 (60) | 3.48 (86) | 0.00 (84) | 0.00 (85) |
| test_MVS | | | | | | | 58.19 (81) | | |
| UnsupFinetunedMVSNet | | | | | 97.37 (43) | | | | |

Publications for the methods above:

- DeepC-MVS_fast, DeepC-MVS: Andreas Kuhn, Christian Sormann, Mattia Rossi, Oliver Erdler, Friedrich Fraundorfer: DeepC-MVS: Deep Confidence Prediction for Multi-View Stereo Reconstruction. 3DV 2020
- PMVS: Y. Furukawa, J. Ponce: Accurate, Dense, and Robust Multiview Stereopsis. PAMI 2010
- ACMH, ACMM: Qingshan Xu, Wenbing Tao: Multi-Scale Geometric Consistency Guided Multi-View Stereo. CVPR 2019
- TAPA-MVS: Andrea Romanoni, Matteo Matteucci: TAPA-MVS: Textureless-Aware PAtchMatch Multi-View Stereo. ICCV 2019
- COLMAP_ROB: Johannes L. Schönberger, Enliang Zheng, Marc Pollefeys, Jan-Michael Frahm: Pixelwise View Selection for Unstructured Multi-View Stereo. ECCV 2016
- ACMP: Qingshan Xu, Wenbing Tao: Planar Prior Assisted PatchMatch Multi-View Stereo. AAAI 2020
- LTVRE_ROB: Andreas Kuhn, Heiko Hirschmüller, Daniel Scharstein, Helmut Mayer: A TV Prior for High-Quality Scalable Multi-View Stereo Reconstruction. International Journal of Computer Vision 2016
- PLC: Jie Liao, Yanping Fu, Qingan Yan, Chunxia Xiao: Pyramid Multi-View Stereo with Local Consistency. Pacific Graphics 2019
- BP-MVSNet: Christian Sormann, Patrick Knöbelreiter, Andreas Kuhn, Mattia Rossi, Thomas Pock, Friedrich Fraundorfer: BP-MVSNet: Belief-Propagation-Layers for Multi-View-Stereo. 3DV 2020
- PCF-MVS: Andreas Kuhn, Shan Lin, Oliver Erdler: Plane Completion and Filtering for Multi-View Stereo Reconstruction. GCPR 2019
- CIDER: Qingshan Xu, Wenbing Tao: Learning Inverse Depth Regression for Multi-View Stereo with Correlation Cost Volume. AAAI 2020
- IB-MVS: Christian Sormann, Mattia Rossi, Andreas Kuhn, Friedrich Fraundorfer: IB-MVS: An Iterative Algorithm for Deep Multi-View Stereo based on Binary Decisions. BMVC 2021
- MVE: Simon Fuhrmann, Fabian Langguth, Michael Goesele: MVE - A Multi-View Reconstruction Environment. EUROGRAPHICS Workshops on Graphics and Cultural Heritage 2014
- CMPMVS: M. Jancosek, T. Pajdla: Multi-View Reconstruction Preserving Weakly-Supported Surfaces. CVPR 2011