Articles | Volume 15, issue 1
https://doi.org/10.5194/essd-15-113-2023
Data description article | 09 Jan 2023

MDAS: a new multimodal benchmark dataset for remote sensing

Jingliang Hu, Rong Liu, Danfeng Hong, Andrés Camero, Jing Yao, Mathias Schneider, Franz Kurz, Karl Segl, and Xiao Xiang Zhu

Viewed

Total article views: 13,855 (including HTML, PDF, and XML)
  • HTML: 8,666
  • PDF: 5,017
  • XML: 172
  • Total: 13,855
  • BibTeX: 186
  • EndNote: 193
Views and downloads (calculated since 29 Aug 2022)

Viewed (geographical distribution)

Total article views: 13,855 (including HTML, PDF, and XML). Of these, 13,646 views have a defined geographic origin and 209 are of unknown origin.


Latest update: 06 May 2026
Short summary
Multimodal data fusion is an intuitive strategy for overcoming the limitations of individual data sources in Earth observation. Here, we present a multimodal data set, named MDAS, consisting of synthetic aperture radar (SAR), multispectral, hyperspectral, digital surface model (DSM), and geographic information system (GIS) data for the city of Augsburg, Germany, along with baseline models for three typical remote sensing applications: resolution enhancement, spectral unmixing, and land cover classification.