Articles | Volume 15, issue 1
https://doi.org/10.5194/essd-15-113-2023
Data description paper | 09 Jan 2023

MDAS: a new multimodal benchmark dataset for remote sensing

Jingliang Hu, Rong Liu, Danfeng Hong, Andrés Camero, Jing Yao, Mathias Schneider, Franz Kurz, Karl Segl, and Xiao Xiang Zhu


Interactive discussion

Status: closed

Comment types: AC – author | RC – referee | CC – community | EC – editor | CEC – chief editor
  • RC1: 'Comment on essd-2022-155', Anonymous Referee #1, 29 Sep 2022
    • AC1: 'Reply on RC1', Andrés Camero, 11 Oct 2022
  • RC2: 'Comment on essd-2022-155', Anonymous Referee #2, 24 Oct 2022

Peer review completion

AR: Author's response | RR: Referee report | ED: Editor decision | EF: Editorial file upload
AR by Andrés Camero on behalf of the Authors (06 Dec 2022)
ED: Publish subject to technical corrections (12 Dec 2022) by Martin Schultz
AR by Andrés Camero on behalf of the Authors (13 Dec 2022)
Short summary
Multimodal data fusion is an intuitive strategy for overcoming the limitations of individual data sources in Earth observation. Here, we present a multimodal data set, named MDAS, consisting of synthetic aperture radar (SAR), multispectral, hyperspectral, digital surface model (DSM), and geographic information system (GIS) data for the city of Augsburg, Germany, along with baseline models for three typical remote sensing applications: resolution enhancement, spectral unmixing, and land cover classification.
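The summary above describes feature-level fusion of co-registered modalities for land cover classification. A minimal sketch of that idea follows; the arrays are random stand-ins for the actual MDAS layers (the channel counts and the nearest-centroid baseline are illustrative assumptions, not the paper's method or data):

```python
import numpy as np

# Synthetic stand-ins for co-registered modalities over a small tile.
# The real MDAS data set provides SAR, multispectral, hyperspectral,
# DSM, and GIS layers for Augsburg; shapes here are hypothetical.
rng = np.random.default_rng(0)
h, w = 16, 16
sar = rng.normal(size=(h, w, 2))   # e.g. two SAR backscatter channels
msi = rng.normal(size=(h, w, 4))   # e.g. four multispectral bands
dsm = rng.normal(size=(h, w, 1))   # surface height

# Feature-level fusion: stack modalities along the channel axis,
# then flatten to a (pixels, features) design matrix.
fused = np.concatenate([sar, msi, dsm], axis=-1)
X = fused.reshape(-1, fused.shape[-1])

# Toy labels and a nearest-centroid classifier as a minimal baseline.
y = rng.integers(0, 3, size=X.shape[0])
centroids = np.stack([X[y == c].mean(axis=0) for c in range(3)])
pred = np.argmin(((X[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)

print(X.shape)  # (256, 7): 16*16 pixels, 2+4+1 fused channels
```

Pixel-wise stacking like this assumes all modalities are resampled to a common grid; in practice the differing native resolutions of SAR, hyperspectral, and DSM data are exactly what the data set's resolution-enhancement baselines address.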