This work is distributed under the Creative Commons Attribution 4.0 License.
A consistent ocean oxygen profile dataset with new quality control and bias assessment
Abstract. Global ocean oxygen levels have declined over the past decades, posing threats to marine life and human society. High-quality, bias-free observations are crucial for understanding ocean oxygen changes and assessing their impact. Here, we propose a new automated quality control procedure for ocean oxygen profile data. The procedure consists of a suite of nine quality checks, with outlier rejection thresholds defined from the underlying statistics of the data. It is applied to three main instrumentation types: bottle casts, CTD (Conductivity-Temperature-Depth) casts, and Argo profiling floats. Application of the procedure to several manually quality-controlled datasets of good quality demonstrates its ability to successfully identify outliers in the data. Collocated quality-controlled oxygen profiles obtained with the Winkler titration method are used as unbiased references to estimate possible residual biases in the oxygen sensor data. The residual bias is negligible for the electrochemical sensors typically used on CTD casts, which we attribute to their adjustment against concurrent Winkler sample data. However, our analysis finds a prevailing negative residual bias for the delayed-mode quality-controlled adjusted Argo profiles, varying from −4 to −1 µmol kg⁻¹ among the data adjusted by the different Argo data assembly centers (DACs). Corresponding overall DAC-specific corrections are suggested. Applying the new QC procedure and bias adjustment yields a new global ocean oxygen dataset spanning 1920 to 2022 with consistent data quality across bottle samples, CTD casts, and Argo floats. The adjusted Argo profile data are available at the Marine Science Data Center of the Chinese Academy of Sciences (Gouretski et al., 2023, http://dx.doi.org/10.12157/IOCAS.20231208.001).
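The abstract's "outlier rejection thresholds defined from the underlying statistics of the data" can be illustrated with a minimal robust-statistics sketch. This is not the authors' actual procedure: the median/MAD rule and the multiplier `k` are placeholder assumptions chosen only to show the idea of a data-driven threshold.

```python
import numpy as np

def mad_range_flags(values, k=5.0):
    """Flag values outside median ± k * scaled MAD.

    Illustrative sketch of a statistics-based outlier threshold;
    `k` and the 1.4826 MAD scaling (which makes the MAD comparable
    to a standard deviation for Gaussian data) are assumptions,
    not settings from the manuscript.
    """
    values = np.asarray(values, dtype=float)
    med = np.nanmedian(values)
    mad = np.nanmedian(np.abs(values - med))
    sigma = 1.4826 * mad
    if sigma == 0:
        # All values identical: nothing can be distinguished as an outlier.
        return np.zeros(values.shape, dtype=bool)
    return np.abs(values - med) > k * sigma
```

Because the median and MAD are insensitive to the outliers themselves, the threshold adapts to the local spread of the data rather than being a fixed global range.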
Status: closed
- RC1: 'Comment on essd-2023-518', Anonymous Referee #1, 31 Jan 2024
General comments:
The manuscript is a valuable contribution to oceanographic research, especially in the context of understanding and monitoring ocean oxygen levels. It is critical to provide high-quality, bias-free ocean oxygen data, and this paper introduces a novel automated quality control procedure. The new quality control procedure and bias assessment methodology have the potential to significantly enhance the reliability of ocean oxygen datasets. However, to fully realize its potential and solidify its standing as a substantial contribution to the field, the manuscript would benefit from more rigorous validation, a detailed discussion of its broader implications, and a transparent discussion of potential limitations.
It is good to know the data quality of these commonly used published datasets. Given the large volume of oxygen profile data, the authors' effort is substantial and appreciated. However, for a data paper in ESSD, describing the quality control process alone is not enough; the cost of ignoring these data biases also needs to be assessed. A discussion of the implications of this new dataset and quality control procedure for the broader field of oceanography would enrich the manuscript. Moreover, the methodology for handling anomalies in oxygen measurements, or 'spikes', needs clarification. The ocean's dynamic nature and the rapid measurement of oxygen profiles mean that spikes in oxygen levels due to abrupt changes in factors such as nutrients, currents, or water masses are plausible. A detailed explanation of how these anomalies are identified and analyzed would provide valuable context and strengthen trust in the methodology.
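The reviewer's point about distinguishing instrument spikes from real vertical structure can be made concrete with a standard single-point spike test of the QARTOD form: a point is suspect if it departs from the midpoint of its neighbours by more than a tolerance, after discounting half the neighbour-to-neighbour spread. This is a generic sketch, not the manuscript's check; the tolerance value is an assumed placeholder.

```python
import numpy as np

def spike_flags(o2, tol=30.0):
    """Flag single-point spikes in a vertical oxygen profile.

    QARTOD-style test: |V2 - (V1+V3)/2| - |(V1-V3)/2| > tol.
    The spread term lets smooth gradients (real water-mass
    transitions) pass while isolated excursions are flagged.
    `tol` in µmol/kg is an assumption, not a value from the paper.
    """
    o2 = np.asarray(o2, dtype=float)
    flags = np.zeros(o2.shape, dtype=bool)
    for i in range(1, len(o2) - 1):
        midpoint = 0.5 * (o2[i - 1] + o2[i + 1])
        spread = abs(o2[i - 1] - o2[i + 1])
        flags[i] = abs(o2[i] - midpoint) - 0.5 * spread > tol
    return flags
```

A monotonic oxycline (e.g. 250 → 50 µmol/kg over five levels) produces no flags under this rule, whereas a single 100 µmol/kg excursion between near-equal neighbours does, which is exactly the distinction the reviewer asks the authors to document.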
Specific comments:
Line 36-38: This is only true for coastal regions.
Line 153: So you applied the same method to the oceanic oxygen distribution? Make this clear here.
Line 222-223: Why are multiple extrema unrealistic? Is there any mechanism behind this?
Citation: https://doi.org/10.5194/essd-2023-518-RC1 - AC1: 'Reply on RC1', Lijing Cheng, 14 Apr 2024
- RC2: 'Comment on essd-2023-518', Anonymous Referee #2, 01 Mar 2024
The authors present an automated quality control procedure for oxygen profiles. They then compare oxygen profiles from optodes and electrodes to discrete bottle samples to assess any bias. The authors state that they have taken on this important work in order to provide high-quality, bias-free data for the scientific community to examine global deoxygenation over time. However, they do not demonstrate that their data product improves our ability to study global deoxygenation. This could be demonstrated by comparing oxygen data before and after applying their QA procedure and bias adjustment.
It is unclear how the presented QA procedure differs from the QA procedures implemented by each Argo DAC. Also, how do these nine QA checks differ from those outlined in QARTOD (https://repository.library.noaa.gov/view/noaa/18659)? I suspect that the first half of the paper could be significantly cut down by referencing these very similar methods and summarizing the results as in Table 2. From Table 2, I was surprised to find that ~65% of all CTD oxygen observations failed the “stuck value test” and that 60% failed the “local climatological range” check. This requires further discussion.
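For readers unfamiliar with the "stuck value test" the reviewer questions, it is conventionally a run-length check on repeated identical readings, as in QARTOD's flat-line test. A minimal sketch follows; the run-length threshold `n_repeat` is an assumption for illustration, not the value used in the manuscript.

```python
import numpy as np

def stuck_value_flags(o2, n_repeat=5):
    """Flag runs of identical consecutive values (a "stuck" sensor).

    Generic sketch of a QARTOD-style flat-line check: any run of
    `n_repeat` or more equal consecutive readings is flagged.
    `n_repeat` is an assumed placeholder threshold.
    """
    o2 = np.asarray(o2, dtype=float)
    flags = np.zeros(o2.shape, dtype=bool)
    run_start = 0
    for i in range(1, len(o2) + 1):
        # Close the current run at end of array or when the value changes.
        if i == len(o2) or o2[i] != o2[run_start]:
            if i - run_start >= n_repeat:
                flags[run_start:i] = True
            run_start = i
    return flags
```

That a well-resolved CTD cast could fail such a check for ~65% of its observations is surprising under any reasonable threshold, which supports the reviewer's request for further discussion of Table 2.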
This work builds on the authors' previous work creating a temperature climatology from Argo floats. In this oxygen work there is also an emphasis on Argo data, with comparison to CTD and Winkler data. The Winkler data are assumed to be accurate, and it is unclear to me how the CTD (electrode sensor) comparison adds to the assessment of the Argo (optode sensor) oxygen data when both are compared to discrete oxygen samples (Winklers). Ultimately, since the authors are presenting quality-controlled oxygen profiles from Argo assets, I would remove the CTD profile comparison from this paper, since the inclusion and discussion of the CTD oxygen data often seem like an afterthought.
Lines 279-280: What do the authors mean by "not-dummy" oxygen values?
Why would such a high proportion of CTD profiles fail quality checks? This is confusing, especially given the abstract's statement that the residual bias is negligible for CTD oxygen casts.
Lines 341-343: I am not sure how Winkler measurements can be considered bias-free when the authors have just discussed inter-cruise offsets in Section 5.
Comments on Figures
I generally found the figures hard to follow. Panels were often mislettered, and colorbars were inconsistent throughout the manuscript. See Thyng et al. (2016, https://doi.org/10.5670/oceanog.2016.66) on improving color use in the oceanographic sciences.
Figure 4: Not needed.
Figure 6: Mislettered panels.
Figure 12: Were these plots created from OSD data? Why were these depth horizons highlighted? Panels g and k are labeled incorrectly according to the caption.
Figure 20: The panels are not lettered in alphabetical order.
Throughout the manuscript a number of copy edits are needed. The worst case is the caption for Figure 19: "Yearly number of BGC Argo profiles equipped with dfifreent tyees of opotde oxgyen sesnors (colored lines). Lightbulee shading corresponds to the total number of proifles: a) AOLM, b) Coriolis, c) JMA, d) CSIRO".
Lastly, the data product is not available at the linked repository and could not be assessed as required for ESSD review; this results in my recommendation to reject the article.
Citation: https://doi.org/10.5194/essd-2023-518-RC2 - AC2: 'Reply on RC2', Lijing Cheng, 14 Apr 2024
Data sets
A quality-controlled and bias-adjusted global ocean oxygen profile dataset V. Gouretski et al. http://dx.doi.org/10.12157/IOCAS.20231208.001