Articles | Volume 15, issue 1
https://doi.org/10.5194/essd-15-113-2023
Data description article | 09 Jan 2023

MDAS: a new multimodal benchmark dataset for remote sensing

Jingliang Hu, Rong Liu, Danfeng Hong, Andrés Camero, Jing Yao, Mathias Schneider, Franz Kurz, Karl Segl, and Xiao Xiang Zhu

Viewed

Total article views: 12,509 (including HTML, PDF, and XML)
  • HTML: 8,257
  • PDF: 4,097
  • XML: 155
  • Total: 12,509
  • BibTeX: 172
  • EndNote: 185
Cumulative views and downloads (calculated since 29 Aug 2022)

Viewed (geographical distribution)

Total article views: 12,509 (including HTML, PDF, and XML). Of these, 12,178 have a defined geographic origin and 331 are of unknown origin.
Latest update: 16 Mar 2026
Short summary
Multimodal data fusion is an intuitive strategy for overcoming the limitations of individual data sources in Earth observation. Here, we present a multimodal data set, named MDAS, consisting of synthetic aperture radar (SAR), multispectral, hyperspectral, digital surface model (DSM), and geographic information system (GIS) data for the city of Augsburg, Germany. It is accompanied by baseline models for three typical remote sensing applications: resolution enhancement, spectral unmixing, and land cover classification.