Articles | Volume 15, issue 1
https://doi.org/10.5194/essd-15-113-2023
Data description paper | 09 Jan 2023

MDAS: a new multimodal benchmark dataset for remote sensing

Jingliang Hu, Rong Liu, Danfeng Hong, Andrés Camero, Jing Yao, Mathias Schneider, Franz Kurz, Karl Segl, and Xiao Xiang Zhu

Viewed

Total article views: 6,381 (including HTML, PDF, and XML)
  • HTML: 4,604
  • PDF: 1,679
  • XML: 98
  • BibTeX: 89
  • EndNote: 65
Cumulative views and downloads, calculated since 29 Aug 2022.

Viewed (geographical distribution)

Total article views: 6,381 (including HTML, PDF, and XML), of which 6,177 are from a defined geographical origin and 204 from an unknown origin.

Cited

Latest update: 13 Dec 2024
Short summary
Multimodal data fusion is an intuitive strategy for overcoming the limitations of any individual data source in Earth observation. Here, we present a multimodal dataset, named MDAS, consisting of synthetic aperture radar (SAR), multispectral, hyperspectral, digital surface model (DSM), and geographic information system (GIS) data for the city of Augsburg, Germany, along with baseline models for three typical remote sensing applications: resolution enhancement, spectral unmixing, and land cover classification.