Articles | Volume 15, issue 1
https://doi.org/10.5194/essd-15-113-2023
Data description paper | 09 Jan 2023

MDAS: a new multimodal benchmark dataset for remote sensing

Jingliang Hu, Rong Liu, Danfeng Hong, Andrés Camero, Jing Yao, Mathias Schneider, Franz Kurz, Karl Segl, and Xiao Xiang Zhu

Viewed

Total article views: 4,614 (including HTML, PDF, and XML)
HTML   PDF    XML  Total  BibTeX  EndNote
3,173  1,379  62   4,614  53      34
Cumulative views and downloads (calculated since 29 Aug 2022)

Viewed (geographical distribution)

Total article views: 4,614 (including HTML, PDF, and XML), of which 4,507 have a defined geographic origin and 107 an unknown origin.

Cited

Latest update: 19 Apr 2024
Short summary
Multimodal data fusion is an intuitive strategy for overcoming the limitations of individual data sources in Earth observation. Here we present MDAS, a multimodal dataset for the city of Augsburg, Germany, comprising synthetic aperture radar (SAR), multispectral, hyperspectral, digital surface model (DSM), and geographic information system (GIS) data. The dataset is accompanied by baseline models for three typical remote sensing applications: resolution enhancement, spectral unmixing, and land cover classification.