This table lists the benchmark results for the low-res many-view scenario. Each cell reports a method's score (*) on a dataset or dataset category, followed in parentheses by the method's rank within that column.

(*) For exact definitions, detailing how potentially incomplete ground truth is taken into account, see our paper.
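For orientation, the accuracy and completeness measures referenced by the footnote (and combined into an F1 score on many point-cloud benchmarks) can be sketched as follows. This is a simplified brute-force illustration under hypothetical names, not the benchmark's evaluation code; in particular it ignores the handling of incomplete ground truth described in the paper:

```python
import numpy as np

def accuracy_completeness_f1(reconstruction, ground_truth, tolerance):
    """Simplified sketch: accuracy is the fraction of reconstructed points
    within `tolerance` of some ground-truth point, completeness the fraction
    of ground-truth points within `tolerance` of some reconstructed point,
    and F1 their harmonic mean. Inputs are (N, 3) arrays of 3D points."""
    # Pairwise distances between reconstructed and ground-truth points
    # (brute force; real evaluations use spatial search structures).
    d = np.linalg.norm(reconstruction[:, None, :] - ground_truth[None, :, :],
                       axis=2)
    accuracy = float(np.mean(d.min(axis=1) <= tolerance))
    completeness = float(np.mean(d.min(axis=0) <= tolerance))
    if accuracy + completeness == 0.0:
        return accuracy, completeness, 0.0
    f1 = 2.0 * accuracy * completeness / (accuracy + completeness)
    return accuracy, completeness, f1
```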

The datasets are grouped into different categories, and result averages are computed for a category and method if results of the method are available for all datasets within the category. Note that the category "all" includes both the high-res multi-view and the low-res many-view scenarios.
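The averaging rule just described can be sketched in a few lines. The grouping below mirrors the dataset columns of this table; the function name and data layout are our own illustration rather than the benchmark's code:

```python
# Dataset grouping as used in the table below; "low-res many-view"
# averages over all five datasets of the scenario.
CATEGORIES = {
    "indoor": ["storage room", "storage room 2"],
    "outdoor": ["lakeside", "sand box", "tunnel"],
    "low-res many-view": ["lakeside", "sand box", "storage room",
                          "storage room 2", "tunnel"],
}

def category_average(results, category):
    """results maps dataset name -> score. Returns the category mean, or
    None (no average is shown) if any dataset of the category is missing."""
    datasets = CATEGORIES[category]
    if not all(name in results for name in datasets):
        return None
    return sum(results[name] for name in datasets) / len(datasets)
```

A method missing even one dataset of a category (as for some entries near the bottom of the table) therefore shows no average for that category.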

Methods with suffix _ROB may participate in the Robust Vision Challenge.

Click a dataset result cell to show a visualization of the reconstruction. For training datasets, ground truth and accuracy / completeness visualizations are also available. The visualizations may not work with mobile browsers.




Each cell below shows a score followed in parentheses by the method's rank within that column; "-" marks a result that is not available. A license badge ([copyleft], [permissive], [binary]), where present, follows the method name, and references for published methods are given on the line after the method's results. Rows are sorted by the sand box column.

Method | all | low-res many-view | indoor | outdoor | lakeside | sand box | storage room | storage room 2 | tunnel
DeepPCF-MVS | 89.96 (1) | 79.79 (1) | 72.17 (1) | 84.87 (1) | 84.94 (1) | 85.37 (1) | 66.97 (1) | 77.38 (1) | 84.30 (1)
DeepC-MVS_fast | 89.43 (2) | 78.34 (2) | 69.56 (4) | 84.20 (2) | 84.26 (2) | 84.39 (2) | 64.46 (3) | 74.66 (5) | 83.95 (3)
Andreas Kuhn, Christian Sormann, Mattia Rossi, Oliver Erdler, Friedrich Fraundorfer: DeepC-MVS: Deep Confidence Prediction for Multi-View Stereo Reconstruction. 3DV 2020
DeepC-MVS | 88.79 (3) | 77.94 (3) | 69.90 (3) | 83.31 (3) | 82.66 (7) | 83.75 (3) | 64.27 (4) | 75.52 (3) | 83.50 (4)
Andreas Kuhn, Christian Sormann, Mattia Rossi, Oliver Erdler, Friedrich Fraundorfer: DeepC-MVS: Deep Confidence Prediction for Multi-View Stereo Reconstruction. 3DV 2020
3Dnovator+ | 87.14 (4) | 74.49 (7) | 63.75 (15) | 81.65 (4) | 83.81 (3) | 82.92 (4) | 60.57 (11) | 66.93 (19) | 78.22 (16)
3Dnovator | 86.66 (5) | 74.48 (8) | 64.71 (10) | 81.00 (7) | 82.81 (5) | 82.90 (5) | 61.47 (8) | 67.94 (15) | 77.28 (17)
OpenMVS [copyleft] | 83.78 (11) | 71.88 (17) | 61.70 (20) | 78.66 (14) | 78.88 (12) | 81.24 (6) | 56.67 (18) | 66.73 (20) | 75.86 (20)
PCF-MVS | 84.11 (10) | 75.39 (6) | 66.57 (7) | 81.27 (6) | 83.29 (4) | 80.62 (7) | 62.19 (6) | 70.95 (7) | 79.89 (12)
Andreas Kuhn, Shan Lin, Oliver Erdler: Plane Completion and Filtering for Multi-View Stereo Reconstruction. GCPR 2019
ACMP | 84.23 (8) | 72.76 (14) | 62.76 (17) | 79.42 (11) | 76.30 (21) | 80.59 (8) | 54.92 (20) | 70.60 (8) | 81.37 (7)
Qingshan Xu and Wenbing Tao: Planar Prior Assisted PatchMatch Multi-View Stereo. AAAI 2020
TAPA-MVS | 84.62 (6) | 76.12 (4) | 70.08 (2) | 80.15 (8) | 80.96 (8) | 80.06 (9) | 63.78 (5) | 76.37 (2) | 79.45 (15)
Andrea Romanoni, Matteo Matteucci: TAPA-MVS: Textureless-Aware PAtchMatch Multi-View Stereo. ICCV 2019
ACMM | 84.12 (9) | 72.07 (16) | 62.37 (19) | 78.54 (15) | 75.71 (24) | 79.93 (10) | 54.53 (21) | 70.21 (10) | 79.97 (11)
Qingshan Xu and Wenbing Tao: Multi-Scale Geometric Consistency Guided Multi-View Stereo. CVPR 2019
tm-dncc | - | 76.08 (5) | 67.89 (5) | 81.54 (5) | 80.59 (9) | 79.92 (11) | 61.03 (10) | 74.75 (4) | 84.11 (2)
tmmvs | - | 68.93 (21) | 55.16 (25) | 78.11 (17) | 80.59 (9) | 79.92 (11) | 53.69 (23) | 56.64 (27) | 73.84 (23)
COLMAP(base) | - | 74.23 (9) | 65.63 (9) | 79.96 (10) | 78.64 (13) | 79.30 (13) | 61.52 (7) | 69.74 (11) | 81.95 (5)
COLMAP_ROB [copyleft] | 80.39 (16) | 71.80 (18) | 62.64 (18) | 77.90 (18) | 74.64 (25) | 78.97 (14) | 57.39 (17) | 67.89 (16) | 80.10 (10)
Johannes L. Schönberger, Enliang Zheng, Marc Pollefeys, Jan-Michael Frahm: Pixelwise View Selection for Unstructured Multi-View Stereo. ECCV 2016
PLC [copyleft] | 84.53 (7) | 73.65 (10) | 66.64 (6) | 78.33 (16) | 75.81 (23) | 78.67 (15) | 61.39 (9) | 71.89 (6) | 80.49 (9)
Jie Liao, Yanping Fu, Qingan Yan, Chunxia xiao: Pyramid Multi-View Stereo with Local Consistency. Pacific Graphics 2019
COLMAP(SR) | - | 73.53 (11) | 63.77 (14) | 80.04 (9) | 79.79 (11) | 78.44 (16) | 60.49 (12) | 67.05 (18) | 81.87 (6)
GSE | - | 73.02 (13) | 64.14 (12) | 78.95 (13) | 77.59 (16) | 78.43 (17) | 59.39 (14) | 68.88 (13) | 80.83 (8)
CasMVSNet(SR_A) | - | 56.09 (37) | 36.56 (44) | 69.11 (33) | 67.36 (34) | 77.93 (18) | 28.99 (45) | 44.12 (46) | 62.05 (37)
CasMVSNet(base) | - | 55.02 (41) | 35.18 (45) | 68.25 (35) | 67.01 (37) | 76.54 (19) | 27.28 (47) | 43.09 (47) | 61.20 (40)
A-TVSNet + Gipuma [copyleft] | - | 63.67 (26) | 50.37 (32) | 72.54 (26) | 69.76 (31) | 76.21 (20) | 48.72 (30) | 52.03 (36) | 71.64 (26)
TAPA-MVS(SR) | - | 73.30 (12) | 64.27 (11) | 79.32 (12) | 82.74 (6) | 75.33 (21) | 59.44 (13) | 69.10 (12) | 79.88 (13)
test_1126 | - | 61.29 (31) | 44.16 (36) | 72.71 (25) | 77.87 (15) | 75.17 (22) | 40.34 (34) | 47.98 (40) | 65.09 (33)
test_1205 | - | 58.34 (34) | 47.13 (34) | 65.81 (38) | 67.31 (36) | 74.82 (23) | 40.34 (34) | 53.91 (33) | 55.31 (46)
Pnet_fast | - | 50.82 (43) | 26.87 (61) | 66.78 (36) | 64.56 (38) | 74.79 (24) | 17.08 (72) | 36.66 (55) | 61.01 (41)
LTVRE_ROB | 82.13 (13) | 72.22 (15) | 66.24 (8) | 76.21 (20) | 77.23 (17) | 74.60 (25) | 65.16 (2) | 67.32 (17) | 76.79 (19)
Andreas Kuhn, Heiko Hirschmüller, Daniel Scharstein, Helmut Mayer: A TV Prior for High-Quality Scalable Multi-View Stereo Reconstruction. International Journal of Computer Vision 2016
LPCS | - | 68.73 (22) | 60.59 (21) | 74.16 (23) | 72.17 (27) | 74.57 (26) | 57.78 (16) | 63.39 (24) | 75.74 (21)
HY-MVS | 83.01 (12) | 68.57 (23) | 58.63 (23) | 75.20 (21) | 77.06 (18) | 73.92 (27) | 53.36 (24) | 63.89 (23) | 74.62 (22)
PVSNet_LR | - | 59.81 (33) | 45.46 (35) | 69.38 (32) | 67.49 (33) | 72.80 (28) | 35.88 (37) | 55.04 (31) | 67.85 (31)
CIDER | - | 63.41 (28) | 52.95 (28) | 70.39 (31) | 71.10 (30) | 72.79 (29) | 49.19 (28) | 56.71 (26) | 67.28 (32)
Qingshan Xu and Wenbing Tao: Learning Inverse Depth Regression for Multi-View Stereo with Correlation Cost Volume. AAAI 2020
mvs_zhu_1030 | - | 55.64 (38) | 40.20 (40) | 65.93 (37) | 63.79 (39) | 72.44 (30) | 31.72 (42) | 48.68 (38) | 61.57 (39)
IB-MVS | 80.51 (15) | 69.16 (20) | 63.18 (16) | 73.15 (24) | 76.86 (20) | 71.66 (31) | 56.10 (19) | 70.26 (9) | 70.91 (27)
Christian Sormann, Mattia Rossi, Andreas Kuhn and Friedrich Fraundorfer: IB-MVS: An Iterative Algorithm for Deep Multi-View Stereo based on Binary Decisions. BMVC 2021
ACMH+ | 81.04 (14) | 71.42 (19) | 63.93 (13) | 76.41 (19) | 78.38 (14) | 71.06 (32) | 58.99 (15) | 68.87 (14) | 79.80 (14)
ANet-0.75 | - | 55.32 (40) | 41.08 (38) | 64.81 (39) | 61.58 (40) | 70.76 (33) | 34.14 (40) | 48.01 (39) | 62.08 (36)
ANet | - | 53.16 (42) | 38.86 (41) | 62.70 (41) | 61.58 (40) | 70.76 (33) | 34.74 (38) | 42.99 (48) | 55.74 (45)
Pnet-new- | - | 55.36 (39) | 32.32 (50) | 70.71 (29) | 71.82 (29) | 69.62 (35) | 32.54 (41) | 32.10 (63) | 70.69 (28)
PVSNet_0 | 73.20 (20) | 63.55 (27) | 51.85 (30) | 71.35 (27) | 72.10 (28) | 69.61 (36) | 47.57 (31) | 56.13 (28) | 72.33 (25)
ACMH | 80.38 (17) | 68.42 (24) | 59.50 (22) | 74.37 (22) | 77.00 (19) | 69.10 (37) | 54.34 (22) | 64.66 (21) | 77.02 (18)
Qingshan Xu and Wenbing Tao: Multi-Scale Geometric Consistency Guided Multi-View Stereo. CVPR 2019
OpenMVS_ROB [copyleft] | 74.94 (19) | 63.99 (25) | 53.23 (27) | 71.15 (28) | 75.82 (22) | 68.57 (38) | 51.19 (27) | 55.27 (30) | 69.07 (30)
R-MVSNet | - | 56.72 (36) | 48.74 (33) | 62.04 (43) | 61.37 (42) | 66.93 (39) | 43.74 (33) | 53.73 (34) | 57.82 (43)
PVSNet | 78.82 (18) | 62.67 (30) | 53.41 (26) | 68.85 (34) | 69.68 (32) | 66.37 (40) | 51.28 (26) | 55.54 (29) | 70.50 (29)
AttMVS | - | 58.26 (35) | 51.80 (31) | 62.56 (42) | 58.33 (47) | 65.96 (41) | 49.04 (29) | 54.55 (32) | 63.38 (35)
BP-MVSNet | - | 63.26 (29) | 52.48 (29) | 70.45 (30) | 73.14 (26) | 64.36 (42) | 45.07 (32) | 59.89 (25) | 73.84 (23)
Christian Sormann, Patrick Knöbelreiter, Andreas Kuhn, Mattia Rossi, Thomas Pock, Friedrich Fraundorfer: BP-MVSNet: Belief-Propagation-Layers for Multi-View-Stereo. 3DV 2020
MVSNet | - | 49.13 (44) | 37.77 (42) | 56.70 (44) | 50.43 (52) | 63.92 (43) | 30.66 (43) | 44.88 (44) | 55.77 (44)
vp_mvsnet | - | 35.70 (58) | 15.34 (78) | 49.28 (52) | 44.92 (54) | 63.43 (44) | 12.58 (79) | 18.10 (77) | 39.50 (56)
P-MVSNet | - | 61.04 (32) | 58.45 (24) | 62.77 (40) | 60.78 (45) | 63.39 (45) | 52.67 (25) | 64.23 (22) | 64.13 (34)
test_1124 | - | 42.16 (50) | 21.84 (74) | 55.71 (47) | 53.19 (48) | 61.81 (46) | 20.50 (65) | 23.19 (75) | 52.14 (47)
Pnet-blend++ | - | 44.30 (46) | 26.07 (63) | 56.45 (45) | 60.87 (43) | 61.19 (47) | 14.59 (75) | 37.56 (52) | 47.29 (51)
Pnet-blend | - | 44.30 (46) | 26.07 (63) | 56.45 (45) | 60.87 (43) | 61.19 (47) | 14.59 (75) | 37.56 (52) | 47.29 (51)
unsupervisedMVS_cas | - | 45.48 (45) | 32.83 (47) | 53.91 (49) | 51.79 (50) | 59.72 (49) | 24.95 (52) | 40.71 (49) | 50.24 (48)
MVSCRF | - | 41.80 (52) | 27.23 (59) | 51.52 (50) | 48.62 (53) | 58.89 (50) | 23.51 (61) | 30.95 (64) | 47.03 (53)
PMVS [copyleft] | 47.18 (22) | 35.10 (59) | 20.52 (76) | 44.82 (56) | 43.36 (56) | 58.67 (51) | 28.88 (46) | 12.16 (80) | 32.42 (66)
Y. Furukawa, J. Ponce: Accurate, dense, and robust multiview stereopsis. PAMI (2010)
CasMVSNet(SR_B) | - | 42.10 (51) | 37.13 (43) | 45.41 (55) | 43.33 (57) | 57.07 (52) | 29.42 (44) | 44.85 (45) | 35.82 (63)
test_mvsss | - | 37.72 (56) | 25.62 (67) | 45.78 (54) | 44.82 (55) | 54.43 (53) | 16.12 (73) | 35.11 (59) | 38.11 (59)
MVS_test_1 | - | 42.72 (49) | 32.68 (48) | 49.41 (51) | 58.80 (46) | 47.95 (54) | 19.50 (70) | 45.87 (43) | 41.47 (55)
hgnet | - | 30.74 (67) | 25.12 (69) | 34.49 (65) | 40.98 (60) | 47.29 (55) | 24.56 (55) | 25.68 (72) | 15.18 (79)
DPSNet | - | 30.74 (67) | 25.12 (69) | 34.49 (65) | 40.98 (60) | 47.29 (55) | 24.56 (55) | 25.68 (72) | 15.18 (79)
example | - | 27.92 (76) | 19.45 (77) | 33.56 (67) | 42.92 (58) | 46.29 (57) | 19.84 (66) | 19.07 (76) | 11.46 (82)
CCVNet | - | 29.61 (71) | 21.20 (75) | 35.21 (60) | 32.95 (67) | 40.98 (58) | 17.15 (71) | 25.24 (74) | 31.70 (69)
CPR_FA | - | 41.50 (53) | 40.70 (39) | 42.03 (58) | 37.78 (62) | 40.09 (59) | 34.69 (39) | 46.72 (42) | 48.23 (50)
Snet | - | 37.53 (57) | 26.97 (60) | 44.56 (57) | 51.19 (51) | 38.17 (60) | 19.63 (69) | 34.31 (60) | 44.33 (54)
MVE [permissive] | 39.65 (23) | 30.69 (69) | 31.26 (51) | 30.31 (72) | 23.68 (78) | 36.67 (61) | 26.92 (48) | 35.60 (58) | 30.60 (71)
Simon Fuhrmann, Fabian Langguth, Michael Goesele: MVE - A Multi-View Reconstruction Environment. EUROGRAPHICS Workshops on Graphics and Cultural Heritage (2014)
CMPMVS [binary] | 59.16 (21) | 7.91 (83) | 0.09 (84) | 13.12 (83) | 3.14 (85) | 36.21 (62) | 0.19 (86) | 0.00 (84) | 0.00 (85)
M. Jancosek, T. Pajdla: Multi-View Reconstruction Preserving Weakly-Supported Surfaces. CVPR 2011
MVSNet_plusplus | - | 43.14 (48) | 25.17 (68) | 55.12 (48) | 67.36 (34) | 36.20 (63) | 12.78 (78) | 37.55 (54) | 61.80 (38)
test3 | - | 33.53 (60) | 31.08 (52) | 35.17 (61) | 30.55 (70) | 35.73 (64) | 26.13 (51) | 36.03 (56) | 39.22 (57)
TVSNet | - | 32.96 (61) | 29.87 (54) | 35.02 (62) | 31.41 (69) | 35.01 (65) | 23.96 (58) | 35.79 (57) | 38.64 (58)
QQQNet | - | 30.24 (70) | 27.34 (58) | 32.18 (70) | 30.23 (71) | 34.61 (66) | 24.76 (53) | 29.92 (65) | 31.70 (69)
SGNet | - | 31.82 (62) | 29.50 (55) | 33.36 (68) | 29.84 (73) | 34.36 (67) | 24.75 (54) | 34.26 (61) | 35.90 (62)
PSD-MVSNet | - | 31.43 (64) | 29.01 (57) | 33.04 (69) | 29.84 (73) | 34.06 (68) | 24.49 (57) | 33.52 (62) | 35.22 (64)
F/T MVSNet+Gipuma | - | 31.25 (65) | 25.73 (66) | 34.93 (63) | 34.16 (64) | 33.91 (69) | 23.61 (60) | 27.84 (69) | 36.73 (60)
MVSNet + Gipuma | - | 31.15 (66) | 26.01 (65) | 34.58 (64) | 33.87 (66) | 33.39 (70) | 23.75 (59) | 28.26 (68) | 36.47 (61)
A1Net | - | 40.64 (54) | 42.66 (37) | 39.29 (59) | 35.99 (63) | 32.67 (71) | 36.23 (36) | 49.09 (37) | 49.23 (49)
unMVSv1 | - | 28.02 (75) | 26.35 (62) | 29.13 (76) | 30.17 (72) | 32.63 (72) | 26.14 (50) | 26.55 (70) | 24.58 (74)
MVSNet_++ | - | 40.21 (55) | 29.12 (56) | 47.60 (53) | 52.26 (49) | 32.34 (73) | 5.74 (84) | 52.50 (35) | 58.20 (42)
SVVNet | - | 28.03 (73) | 24.84 (71) | 30.16 (73) | 26.30 (76) | 31.88 (74) | 19.75 (67) | 29.92 (65) | 32.29 (67)
ternet | - | 28.03 (73) | 24.84 (71) | 30.16 (73) | 26.30 (76) | 31.88 (74) | 19.75 (67) | 29.92 (65) | 32.29 (67)
firsttry | - | 28.73 (72) | 23.72 (73) | 32.06 (71) | 31.74 (68) | 30.80 (76) | 21.15 (63) | 26.30 (71) | 33.65 (65)
RMVSNet | - | 26.78 (77) | 32.62 (49) | 22.89 (77) | 27.05 (75) | 26.01 (77) | 26.90 (49) | 38.35 (51) | 15.62 (78)
unMVSmet | - | 17.94 (79) | 14.89 (79) | 19.98 (79) | 19.45 (79) | 24.72 (78) | 13.90 (77) | 15.87 (78) | 15.75 (77)
test_1120 [copyleft] | - | 17.74 (80) | 12.97 (80) | 20.92 (78) | 13.88 (82) | 22.05 (79) | 12.33 (80) | 13.61 (79) | 26.82 (73)
Cas-MVS_preliminary | - | 14.58 (81) | 9.47 (82) | 17.99 (80) | 13.62 (83) | 18.53 (80) | 10.67 (82) | 8.27 (82) | 21.82 (75)
Pnet-eth | - | 31.53 (63) | 34.06 (46) | 29.85 (75) | 41.28 (59) | 17.75 (81) | 20.71 (64) | 47.41 (41) | 30.53 (72)
confMetMVS | - | 13.05 (82) | 10.92 (81) | 14.47 (82) | 14.73 (81) | 17.12 (82) | 10.20 (83) | 11.64 (81) | 11.56 (81)
metmvs_fine | - | 22.70 (78) | 30.59 (53) | 17.43 (81) | 17.52 (80) | 16.63 (83) | 22.55 (62) | 38.64 (50) | 18.14 (76)
test_robustmvs | - | - | - | 7.41 (84) | 12.35 (84) | 6.59 (84) | 10.94 (81) | - | 3.30 (83)
FADENet | - | 0.56 (84) | 0.62 (83) | 0.51 (85) | 0.86 (86) | 0.42 (85) | 0.78 (85) | 0.45 (83) | 0.27 (84)
dnet | - | 0.00 (85) | 0.00 (85) | 0.00 (86) | 0.00 (87) | 0.00 (86) | 0.00 (87) | 0.00 (84) | 0.00 (85)
test_MVS | - | - | - | - | - | - | 16.11 (74) | - | -
UnsupFinetunedMVSNet | - | - | - | - | 34.16 (64) | - | - | - | -