Preprints
https://doi.org/10.5194/essd-2022-155
 
29 Aug 2022
Status: this preprint is currently under review for the journal ESSD.

MDAS: A New Multimodal Benchmark Dataset for Remote Sensing

Jingliang Hu1, Rong Liu1, Danfeng Hong2, Andrés Camero2, Jing Yao3, Mathias Schneider2, Franz Kurz2, Karl Segl4, and Xiao Xiang Zhu1,2
  • 1Data Science in Earth Observation (SiPEO), Technical University of Munich (TUM), 80333 Munich, Germany
  • 2Remote Sensing Technology Institute (IMF), German Aerospace Center (DLR), 82234 Wessling, Germany
  • 3Key Laboratory of Digital Earth Science, Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100094, China
  • 4German Research Center for Geosciences (GFZ), Helmholtz Center Potsdam, Telegrafenberg A17, 14473 Potsdam, Germany

Abstract. In Earth observation, multimodal data fusion is an intuitive strategy for overcoming the limitations of individual data sources. The complementary physical content of different data sources allows comprehensive and precise information retrieval. With current satellite missions, such as ESA's Copernicus programme, a variety of data is accessible at an affordable cost, so future applications will have many options for data sources. This privilege can be beneficial only if algorithms are ready to work with various data sources. However, current data fusion studies mostly focus on the fusion of two data sources, for two reasons. First, different combinations of data sources face different scientific challenges. For example, the fusion of synthetic aperture radar (SAR) data and optical images needs to handle geometric differences, while the fusion of hyperspectral and multispectral images must deal with differing spatial and spectral resolutions. Second, it is still expensive, both financially and in labour, to acquire multiple data sources for the same region at the same time. In this paper, we provide the community with a multimodal benchmark data set, MDAS, for the city of Augsburg, Germany. MDAS includes SAR data, a multispectral image, a hyperspectral image, a digital surface model (DSM), and geographic information system (GIS) data. All of these data were collected on the same date, 7 May 2018. MDAS is a new benchmark data set that offers researchers rich options for data selection. In this paper, we run experiments on the MDAS data set for three typical remote sensing applications, namely resolution enhancement, spectral unmixing, and land cover classification. Our experiments demonstrate the performance of representative state-of-the-art algorithms, whose outcomes can serve as baselines for further studies.
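One common strategy for the land cover classification application mentioned above is early fusion: once all modalities are resampled to a shared grid, their channels are stacked per pixel into a single feature vector before classification. The sketch below illustrates this idea with synthetic stand-in arrays (not actual MDAS data) and a toy nearest-class-mean classifier; the channel counts, grid size, and classifier are illustrative assumptions, not the baselines used in the paper.

```python
# Minimal early-fusion sketch, assuming all modalities are already
# co-registered on a common grid. Arrays are synthetic stand-ins,
# NOT the MDAS data; channel counts are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "modalities" on a shared 8x8 grid:
# SAR (2 channels), multispectral (4 channels), DSM (1 channel).
sar = rng.normal(size=(8, 8, 2))
msi = rng.normal(size=(8, 8, 4))
dsm = rng.normal(size=(8, 8, 1))

# Early fusion: stack all channels into one feature vector per pixel.
features = np.concatenate([sar, msi, dsm], axis=-1).reshape(-1, 7)

# Toy labels (2 classes) and a nearest-class-mean classifier.
labels = rng.integers(0, 2, size=features.shape[0])
means = np.stack([features[labels == c].mean(axis=0) for c in (0, 1)])
dists = np.linalg.norm(features[:, None, :] - means[None, :, :], axis=-1)
pred = dists.argmin(axis=1)
accuracy = (pred == labels).mean()
```

In practice, the resampling step is the hard part (the modalities in MDAS differ in resolution and geometry), and stronger classifiers would replace the nearest-mean rule; this sketch only shows the data-layout idea of stacking modalities per pixel.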


Status: open (until 27 Oct 2022)



Viewed

Total article views: 406 (including HTML, PDF, and XML)
  • HTML: 164
  • PDF: 238
  • XML: 4
  • Total: 406
  • BibTeX: 2
  • EndNote: 2
Cumulative views and downloads (calculated since 29 Aug 2022)

Viewed (geographical distribution)

Total article views: 395 (including HTML, PDF, and XML). Of these, 395 are from views with geography defined and 0 from unknown origin.
Latest update: 20 Sep 2022
Short summary
Multimodal data fusion is an intuitive strategy for overcoming the limitations of individual data sources in Earth observation. Here, we present a multimodal data set, named MDAS, consisting of synthetic aperture radar (SAR), multispectral, hyperspectral, digital surface model (DSM), and geographic information system (GIS) data for the city of Augsburg, Germany, along with baseline models for resolution enhancement, spectral unmixing, and land cover classification, three typical remote sensing applications.