Volume 18, issue 5
https://doi.org/10.5194/essd-18-3069-2026
© Author(s) 2026. This work is distributed under the Creative Commons Attribution 4.0 License.
CropSight-US: an object-based crop type ground truth dataset using street view and Sentinel-2 satellite imagery across the contiguous United States, 2013–2023
Download
- Final revised paper (published on 08 May 2026)
- Preprint (discussion started on 03 Nov 2025)
Interactive discussion
Status: closed
Comment types: AC – author | RC – referee | CC – community | EC – editor | CEC – chief editor
- RC1: 'Comment on essd-2025-527', Anonymous Referee #1, 07 Dec 2025
- RC2: 'Comment on essd-2025-527', Anonymous Referee #2, 02 Feb 2026
- AC1: 'Comment on essd-2025-527', Chunyuan Diao, 03 Mar 2026
Peer review completion
AR – Author's response | RR – Referee report | ED – Editor decision | EF – Editorial file upload
AR by Chunyuan Diao on behalf of the Authors (03 Mar 2026)
Author's response
Author's tracked changes
Manuscript
ED: Referee Nomination & Report Request started (11 Mar 2026) by Peng Zhu
RR by Anonymous Referee #3 (20 Mar 2026)
RR by Jiaqi Yang (23 Mar 2026)
ED: Publish subject to minor revisions (review by editor) (09 Apr 2026) by Peng Zhu
AR by Chunyuan Diao on behalf of the Authors (19 Apr 2026)
Author's response
Author's tracked changes
Manuscript
ED: Publish as is (21 Apr 2026) by Peng Zhu
AR by Chunyuan Diao on behalf of the Authors (28 Apr 2026)
This paper proposes an object-based crop type ground truth dataset, called "CropSight-US", spanning 2013 to 2023 and built from street view and Sentinel-2 satellite imagery. The dataset covers 17 major crop types across 294 Agricultural Statistics Districts (ASDs) in CONUS. Specifically, crop type labels are extracted from Google Street View images, and field boundary information is derived from Sentinel-2 imagery. CONUS-UncertainFusionNet was developed to generate this dataset along with uncertainty information. In the experimental results, the performance of several deep-learning-based networks is compared. The paper is well organized, and the reviewer has the following detailed comments: