Enhanced automated meteorological observations at the Canadian Arctic Weather Science (CAWS) supersites
Laura Huang
Robert Crawford
Jean-Pierre Blanchet
Shannon Hicks-Jalali
Eva Mekis
Ludovick Pelletier
Peter Rodriguez
Kevin Strawbridge
- Final revised paper (published on 11 Nov 2022)
- Preprint (discussion started on 10 Jun 2022)
Interactive discussion
Status: closed
RC1: 'Comment on essd-2022-174', Anonymous Referee #1, 23 Jun 2022
General Comments:
This is an Earth System Science Data paper that outlines observations collected at the Canadian Arctic Weather Science (CAWS) supersites. This article is appropriate to support the publication of a data set. The data are significant and very useful to the Arctic scientific community. This was a very nice assessment and overview of the two supersites and the instrumentation. Great explanation and reasoning for why such Arctic measurements are needed in these remote, harsh Arctic environments. Your reasoning is broad and includes model validation and rescue effort impacts – two very important reasons for improving and continuing to collect Arctic measurements. I appreciate your noting these different and important needs for collecting data. This was also nicely highlighted in the special events noted at the end of the manuscript, which tied everything together. Is there any hope of continuing these measurements beyond the identified June 2022 for Whitehorse, and partial continuation at Iqaluit?
Specific Comments:
- Do you have issues with the automation of the sites (such as, instruments freezing or accruing ice that might bias the measurements)? If so, how much delay is there between post processing the data and making it publicly available to forecasters?
- Have you considered the datagram structure for organizing and displaying instrument/facility metadata more broadly? A datagram would nicely format the instrument content of the tables and also the images from Figures 2, 3 and 5. (Morris, S., & Uttal, T. (2022). Datagrams: Diagrammatic Metadata for Humans, Bulletin of the American Meteorological Society, 103(5), E1343-E1350. Retrieved Jun 23, 2022, from https://journals.ametsoc.org/view/journals/bams/103/5/BAMS-D-21-0219.1.xml)
- Have you considered putting the data from these two super-sites into the IASOA data portal? (https://psl.noaa.gov/iasoa/home2)
- I see that you have installed some cameras onsite to determine weather conditions and to visually check the instruments. Does the resolution of the cameras allow for assessment of the radiometers and whether they have snow/ice on the domes? I see that you have installed an icing detector near the radiometer suite, but images might also be helpful. Also, are the images available in a dataset somewhere as well? It might be a nice additional dataset to assess how well instruments are holding up against the Arctic conditions and maintaining clear optical lenses/domes.
- Do site techs ever visit the sites to clean the instruments (radiometers) if needed? Is there a set schedule, or is there an identifier in the data (or cameras) that initiates a site visit?
- Have you looked at the D-ICE experiment publication, with respect to maintaining clear domes from ice/snow/rime/etc. for the radiometers? (Cox, C. J., Morris, S. M., Uttal, T., Burgener, R., Hall, E., Kutchenreiter, M., . . . Wendell, J. (2021). The De-Icing Comparison Experiment (D-ICE): a study of broadband radiometric measurements under icing conditions in the Arctic. Atmos. Meas. Tech., 14(2), 1205-1224. doi:10.5194/amt-14-1205-2021)
- Do you have a rotation schedule for instrument calibrations? I understand that techs are not on-site, but how often would someone go out to verify measurement accuracy (like with cleaning radiometer domes, etc.).
- Have you considered installing a tracker near the radiation suite? They do need much more maintenance and attention, so might be difficult considering these are automated sites.
- Is there any discussion about including ground heat flux plates or thermistor measurements at the sites? This might be a nice addition to round out surface energy budget terms, like conductive heat flux. Similarly, turbulent fluxes would also be a nice addition (though these would require more power and would transfer data at a much faster rate, so infrastructure might not allow for this).
- Have the quality control or post-processing techniques been posted in any reports that you could reference? Are you applying any post-processing or quality control to the data before it gets posted to the portal, or is the data mostly raw?
Technical Corrections:
- In Figure 1 the orange squares are difficult to decipher/locate on the maps. Including overlaid arrows on the maps with short captions/acronyms could help to better determine the locations.
Citation: https://doi.org/10.5194/essd-2022-174-RC1
RC2: 'Comment on essd-2022-174', Anonymous Referee #2, 06 Jul 2022
The manuscript presents monitoring data from two meteorological superstations located in Iqaluit and Whitehorse, Canada, which are Territorial capitals and economic hubs for their regions. Both sites act as transportation gateways to the North and are in the path of several common Arctic storm tracks. More importantly, they are also uniquely situated in close proximity to frequent overpasses by polar-orbiting satellites. The manuscript is generally well organized and clear to me. As a researcher focusing on permafrost dynamics, I appreciate the authors' considerable efforts for meteorological monitoring in the Arctic. I present my major concerns as follows:
- The introduction section should be re-organized. The reasons why the observation data are needed are not presented. Besides, the authors did not summarize any previous work on meteorological observations for Canadian Arctic weather science. The necessity of both sites and their representativeness also needs to be clearly stated.
- Figure 1 should be clearer. The current remote sensing images are too large in scope, resulting in the key information (meteorological station locations) not being highlighted enough. I suggest the authors use red squares instead of orange squares. In addition, it would be better for the study area overview map to include latitude and longitude information.
- In Section 2: The author should describe the underlying surface conditions of the two superstations separately.
- A separate section should be added to the manuscript to describe the rules for data storage, such as the way to note different levels of data quality and missing values in the data storage file (Boiker et al., 2018).
- The text includes too much content on the observation instruments, while the accuracy, quality, and duration of the observations are not clear.
- The authors introduced only short sample periods of meteorological data during high-impact weather events, namely the Iqaluit blizzard of November 23, 2018 and the Whitehorse blizzard of December 16-17, 2019. If the continuous observations for both sites, especially from 2018 to 2021, could be presented, the manuscript would be expected to receive greater attention. It can be seen from Tables 1 and 2 that various monitoring data have time series of more than 3 years, but the Figures in the manuscript select only a certain period when displaying the time series of the various monitoring data. The authors are advised to present the monitoring data in full. In addition, did the authors find some interesting patterns of variation based on the time series of the various monitoring data? If so, please present them appropriately.
- I don’t understand the classification method for light precipitation, moderate precipitation, and heavy precipitation; please add this definition. How is a credible precipitation type (such as rain, snow, or sleet) obtained? In general, different discriminating methods have different credibility in a given region, so the method in this manuscript should be explained clearly. In addition, the precipitation type 'mixed' was used in some figures; what are the differences between sleet and mixed?
- Are the thermistors drifting during the monitoring period? Are they calibrated every year, or at a constant frequency in the lab? Modern sensors and transmitters are electronic devices, and the reference voltage, or signal, may drift over time due to temperature, pressure, or changes in ambient conditions.
- The text in all the diagrams in the manuscript is very small and not conducive to reading.
Citation: https://doi.org/10.5194/essd-2022-174-RC2
AC1: 'Comment on essd-2022-174', Zen Mariani, 24 Aug 2022
Author’s Response
August 24, 2022
Thank you for your work in helping us improve the manuscript. We have responded to all comments below and outlined changes in the manuscript. Changes highlighted in yellow correspond to Reviewer 1; changes highlighted in green correspond to Reviewer 2.
Response to Reviewer 1
- Is there any hope of continuing these measurements beyond the identified June 2022 for Whitehorse, and partial continuation at Iqaluit?
- Yes; continued observations at Iqaluit are planned and instrument repairs are underway. No continuation for Whitehorse is planned as the site has been decommissioned and the airport has started construction of a new building at that location.
- Do you have issues with the automation of the sites (such as, instruments freezing or accruing ice that might bias the measurements)? If so, how much delay is there between post processing the data and making it publicly available to forecasters?
- This is mitigated as much as possible by the instruments’ design. All instruments have technical performance ratings suitable for Arctic conditions; as such, they are equipped with heaters, fans, and wiper blades as necessary. Remote monitoring via 4K cameras enabled visual confirmation of the absence of snow or ice accumulation. These details have been added to the paper in Sect. 3.1. While post-processed products are made available in near-real time via obrs.ca, the actual raw data files experience a significant delay before they are made available on the data portal (at times up to almost a year) due to bandwidth limitations.
- Have you considered the datagram structure for organizing and displaying instrument/facility metadata more broadly? A datagram would nicely format the instrument content of the tables and also the images from Figures 2, 3 and 5. (Morris, S., & Uttal, T. (2022). Datagrams: Diagrammatic Metadata for Humans, Bulletin of the American Meteorological Society, 103(5), E1343-E1350. Retrieved Jun 23, 2022, from https://journals.ametsoc.org/view/journals/bams/103/5/BAMS-D-21-0219.1.xml)
- Thank you for sharing the paper; the authors found it to be an interesting read and a great way to visualize datasets. We will adopt this approach in the future and begin compiling example observations (G) and the required output file details (H) to create datagrams for all the instruments at the sites.
- Have you considered putting the data from these two super-sites into the IASOA data portal? (https://psl.noaa.gov/iasoa/home2)
- This would be possible; I have contacted IASOA to learn more about how to store the data on their portal.
- I see that you have installed some cameras onsite to determine weather conditions and to visually check the instruments. Does the resolution of the cameras allow for assessment of the radiometers and whether they have snow/ice on the domes? I see that you have installed an icing detector near the radiometer suite, but images might also be helpful. Also, are the images available in a dataset somewhere as well? It might be a nice additional dataset to assess how well instruments are holding up against the Arctic conditions and maintaining clear optical lenses/domes.
- Yes, the cameras are 4K (high resolution) and close enough to visually confirm the presence/absence of snow/ice/frost/fog on the domes, as well as several other instruments (e.g., lidar optical windows). These domes/optics are heated, however, so the presence of ice/snow was short-lived. The images were not included in the Open Data Portal archive, but they can be in the future; I will begin the process of submitting the images to the Portal. The text in Sect. 3.1. regarding the cameras has been updated to reflect this.
- Do site techs ever visit the sites to clean the instruments (radiometers) if needed? Is there a set schedule, or is there an identifier in the data (or cameras) that initiates a site visit?
- Technicians visit the site about twice a year to perform maintenance on all the instruments, such as cleaning the radiometers (except during the COVID-19 pandemic). Since the schedule is highly variable, an identifier in the published data will be included to mark these service visits.
- Have you looked at the D-ICE experiment publication, with respect to maintaining clear domes from ice/snow/rime/etc. for the radiometers? (Cox, C. J., Morris, S. M., Uttal, T., Burgener, R., Hall, E., Kutchenreiter, M., . . . Wendell, J. (2021). The De-Icing Comparison Experiment (D-ICE): a study of broadband radiometric measurements under icing conditions in the Arctic. Atmos. Meas. Tech., 14(2), 1205-1224. doi:10.5194/amt-14-1205-2021)
- Where possible, attempts were made to adopt BSRN standards and practices. CVF4 ventilators were used for each dome, operating at all times when icing conditions were present. However, a detailed analysis using the 4K cameras to confirm the absence of ice/snow/frost has not been conducted, as was done in the Cox et al. paper. As such, data warnings are included in the published dataset cautioning the user about the possible presence of ice/snow on the domes; these warnings have been added to the paper in Sect. 3.1.4 to articulate this potential issue, and a reference to the Cox paper has been included.
- Do you have a rotation schedule for instrument calibrations? I understand that techs are not on-site, but how often would someone go out to verify measurement accuracy (like with cleaning radiometer domes, etc.).
- The calibration and maintenance schedule will be included in the published dataset (see earlier comment). Site visits occurred roughly twice per year.
- Have you considered installing a tracker near the radiation suite? They do need much more maintenance and attention, so might be difficult considering these are automated sites.
- Yes; a tracker would have been excellent to include. Unfortunately, it was deemed unfeasible because no operators are present at the site (automation reliability was questionable). It may be possible to deploy one in the future.
- Is there any discussion about including ground heat flux plates or thermistor measurements at the sites? This might be a nice addition to round out surface energy budget terms, like conductive heat flux. Similarly, turbulent fluxes would also be a nice addition (though these would require more power and would transfer data at a much faster rate, so infrastructure might not allow for this).
- Yes, there are ideas to expand the capabilities of the sites’ surface energy budget observations, bringing the site closer in line with BSRN, for instance. Unfortunately, there are no firm plans to initiate this in the near future. If our budget allows for an expansion of the site, these instruments would be a priority to include. The addition of turbulent flux measurements would be possible given the current infrastructure at the Iqaluit site. Data transfer would remain local (except for post-processed products sent to obrs.ca) since bandwidth cannot be increased in a meaningful capacity.
- Have the quality control or post-processing techniques been posted in any reports that you could reference? Are you applying any post-processing or quality control to the data before it gets posted to the portal, or is the data mostly raw?
- Several levels of data processing are made available via the Open Data Portal. There is raw (level 0) data with no quality control imposed for all instruments, enabling the user to impose their own QC algorithms. We have also published the processed data sets (for a limited number of instruments; e.g., lidar VAD wind profiles) as flat text files as well as all of the processed product images (.jpgs). For the processed products, notes in the published readme files point to the type of QC algorithms applied and whom to contact to obtain processing codes, QC algorithms, or more information. Unfortunately, much of this has not been published in external reports that can be referenced, with the exception of a few QC standards already referenced in this paper (e.g., lidar SNR threshold, etc.). A new Section (3.3) has been added to this paper that describes these data storage rules and identifiers.
- In Figure 1 the orange squares are difficult to decipher/locate on the maps. Including overlaid arrows on the maps with short captions/acronyms could help to better determine the locations. Spell out SGP in
- Figure 1 has been updated to more clearly indicate the locations on the map using red squares.
Response to Reviewer 2
- The introduction section should be re-organized. The reasons why the observation data are needed are not presented. Besides, the authors did not summarize any previous work on meteorological observations for Canadian Arctic weather science. The necessity of both sites and their representativeness also needs to be clearly stated.
- The introduction has been re-organized, as suggested. A summary of the previous work on meteorological observations in the Canadian Arctic is provided and referenced (Joe et al.) on line 59 (as well as other references listed in the second paragraph). A new reference to meteorological observations conducted at the Eureka, NU research site (e.g., PEARL climate site) has been included on line 55: “The Canadian Network for the Detection of Climate Change research site at Eureka, NU, (80.05° N, 86.42° W) is equipped with remote sensing meteorological and climate observations (e.g., Lesins et al., 2009).” The need for higher spatial and temporal-resolution measurements is discussed in the third and fourth paragraphs; this discussion has been expanded to explicitly describe the reasons why particular observations were needed: “The new profiling observations of winds and water vapour, for instance, are crucial to determine fluxes of water vapour transport, the presence of atmospheric rivers, and hazardous wind conditions for aviation. Such profiling observations do not currently exist in the Arctic (except for standard radiosondes every 12 hr); as such these profile observations provide novel data useful for satellite calibration/validation, evaluating and improving NWP model performance above the surface layer, HIW classification (e.g., depth and height of blowing snow during a blizzard), and for cloud microphysics studies.” A sentence has been added to describe the necessity of both sites and their representativeness on line 71: “The two sites are representative of their regions and provide contrasting conditions: e.g., Western vs. Eastern Arctic, mountainous vs. tundra, and inland valley vs. marine.”
- Figure 1 should be clearer. The current remote sensing images are too large in scope, resulting in the key information (meteorological station locations) not being highlighted enough. I suggest the authors use red squares instead of orange squares. In addition, it would be better for the study area overview map to include latitude and longitude information.
- Figure 1 has been modified to more clearly indicate the locations on the map using red squares, as suggested. Red circles were also added to highlight the station locations. Lat/lon labels were added, but a lat/lon grid could not be added to the Google map since it is not our image and the 3D to 2D projection would not align well.
- In Section 2: The author should describe the underlying surface conditions of the two superstations separately.
- The terrain and topography for both sites are described in Sect. 2 separately. The typical surface weather conditions and synoptic storms are also described separately on lines 123-129 and lines 140-145. Details on the underlying surface conditions have been added to the text: “permafrost terrain (rock / soil),” and separately for Whitehorse: “wooden platform, all within a few metres of each other above compact gravel.”
- A separate section should be added to the manuscript to describe the rules for data storage, such as the way to note different levels of data quality and missing values in the data storage file (Boiker et al., 2018).
- A new Section was added to the manuscript to address this (Sect. 3.3): “All geophysical variables observed at the Iqaluit and Whitehorse sites were archived as raw data files and processed in the same manner. Several levels of data processing were published; raw (level 0) data with no quality control (QC) imposed was made available for all instruments, enabling the user to impose their own QC algorithms. As such, all raw data files should be treated with caution, particularly for the radiation flux observations which typically require additional QC processing prior to analysis. Processed data sets (for a limited number of instruments; e.g., lidar VAD wind profiles) are also available as flat text files as well as all of the processed product images (.jpgs). For the processed products, notes in the published readme files point to the type of QC algorithms applied and whom to contact to obtain processing codes, QC algorithms, or more information in general. In all cases, time is reported as UTC and heights are a.g.l. When no data was available (due to the instrument being down or loss of power at the site), gaps exist or the value -9999 was used. When instruments were maintained or recalibrated by technicians visiting the site (roughly twice a year), an identifier in the published metadata is included to mark these service visits.”
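For illustration only, here is a minimal sketch of how a user might read one of these flat text files under the conventions described above (UTC times, -9999 for missing data); the file name and column names are hypothetical, and the actual layouts are documented in the published readme files:

```python
import pandas as pd

# Hypothetical file name and column names for illustration only; consult the
# readme files on the Open Data Portal for the actual file layouts.
df = pd.read_csv(
    "iqaluit_lidar_vad_winds_example.txt",  # placeholder file name
    sep=r"\s+",                             # flat, whitespace-delimited text
    na_values=[-9999],                      # -9999 marks missing data
)

# Times are reported in UTC and heights are above ground level (a.g.l.).
df["time_utc"] = pd.to_datetime(df["time_utc"], utc=True)

# Gaps from instrument downtime or site power loss appear as NaN after parsing.
print(df.isna().sum())
```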
- The text includes too much content on the observation instruments, while the accuracy, quality, and duration of the observations are not clear.
- Throughout Sect. 3, wherever possible, additional information regarding each instrument’s accuracy, precision, and duration of observations was provided and made clearer. Specific downtime periods and gaps are now provided in more detail, where applicable. Additional references to Tables 1 and 2 were provided to help point the reader to the final column (accuracy) when applicable. Additions to the ‘accuracy’ column in Table 1 were made, particularly for the information on the radars.
- The authors introduced only short sample periods of meteorological data during high-impact weather events, namely the Iqaluit blizzard of November 23, 2018 and the Whitehorse blizzard of December 16-17, 2019. If the continuous observations for both sites, especially from 2018 to 2021, could be presented, the manuscript would be expected to receive greater attention. It can be seen from Tables 1 and 2 that various monitoring data have time series of more than 3 years, but the Figures in the manuscript select only a certain period when displaying the time series of the various monitoring data. The authors are advised to present the monitoring data in full. In addition, did the authors find some interesting patterns of variation based on the time series of the various monitoring data? If so, please present them appropriately.
- Figure 4 originally provided surface meteorological data from 2016-2019, encompassing three years. We have extended this timeline to include more data, as suggested. The new version of the Figure now includes data from 2015 up to 2021, over five years’ worth of data. We have also added an additional observational dataset to the Figure, showing remote sensing profile observations (water vapour profiles) from the DIAL. The Figure now provides a more complete view of surface and upper-air meteorological conditions and variability at the site throughout the study period, and visually identifying outliers / times of HIW is now easier given the combination of precipitation rate, amount, and water vapour profile concentrations. Similarly, Figure 6 originally provided surface meteorological data in 2018 for Whitehorse; it has now been extended to show data up to July 2022, almost encompassing the entire dataset collected at Whitehorse. Typical patterns at both sites were observed both in the short term (changes due to cloud cover) and the long term (seasonality), as described in the case studies (e.g., change in radiation with changing clouds) and as can be observed in Figures 4 and 6 (seasonal cycles). Investigations into other patterns of variation based on the meteorological data are the focus of future scientific research and will be presented in upcoming science publications.
- I don’t understand the classification method for light precipitation, moderate precipitation, and heavy precipitation; please add this definition. How is a credible precipitation type (such as rain, snow, or sleet) obtained? In general, different discriminating methods have different credibility in a given region, so the method in this manuscript should be explained clearly. In addition, the precipitation type 'mixed' was used in some figures; what are the differences between sleet and mixed?
- A description of the FS11P’s precipitation classification (light/moderate/heavy) and precipitation type classification (including an explanation of the methodology and algorithm) is now included as a new paragraph in Sect. 3.1.2: “The PWD52 (and FS11P used in Whitehorse) meet Federal Aviation Administration and International Civil Aviation Organization specifications. Precipitation type and intensity are estimated on the basis of an optical principle via the attenuation of a laser beam by falling particles. The precipitation type can be estimated by using empirical relationships between the observed diameter and fall speed of the particles (Gunn and Kinzer, 1949). Default settings for the precipitation intensity limits define the light (<2 mm/hr), moderate (2-8 mm/hr), and heavy (> 8 mm/hr) precipitation flags reported in the data (different thresholds are used for snow). The precipitation classification algorithm is proprietary to the manufacturer (Vaisala) and was used without modification.” Precipitation type reporting depends on the weather type reporting format: WMO 4680 (SYNOP), 4678 (METAR), and NWS code tables are all possible and have different precipitation classification schemes. In Figures 4 and 6, the mixed precipitation classification represents some other type of precipitation that isn’t considered rain/snow, such as sleet. This is now clarified in the Figure caption: “Note that mixed precipitation type represents precipitation that is not rain or snow (e.g., freezing rain, sleet, etc.).”
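For illustration only (the manufacturer's classification algorithm is proprietary and is not reproduced here), a minimal sketch of how the stated default liquid-precipitation intensity limits map a rate to the reported flags; note that different thresholds apply to snow:

```python
def rain_intensity_flag(rate_mm_per_hr: float) -> str:
    """Map a liquid precipitation rate (mm/hr) to an intensity flag using the
    default limits quoted above: light (< 2), moderate (2-8), heavy (> 8) mm/hr.
    Snow uses different thresholds, and the Vaisala type/intensity algorithm
    itself is proprietary; this sketch only illustrates the stated limits."""
    if rate_mm_per_hr < 2.0:
        return "light"
    elif rate_mm_per_hr <= 8.0:
        return "moderate"
    return "heavy"


# Example: a 5 mm/hr rain rate would be flagged as "moderate".
print(rain_intensity_flag(5.0))
```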
- Are the thermistors drifting during the monitoring period? Are they calibrated every year, or at a constant frequency in the lab? Modern sensors and transmitters are electronic devices, and the reference voltage, or signal, may drift over time due to temperature, pressure, or changes in ambient conditions.
- Calibration drift is a concern, particularly for long-term climate observations. The thermistors undergo maintenance during every service visit, about twice a year (except during the COVID-19 pandemic). This detail has been added to the paper in lines 437-440: “When instruments were maintained and/or recalibrated by technicians visiting the site (roughly twice a year), …” Certain instruments, such as the radiation flux sensors, ceilometers, lidars, and radars, are recalibrated during service visits as well to mitigate this issue. Fortunately, given the shorter timespan of operation at these sites (compared to decades-long climate sites), measurement drift is likely minimal and further mitigated via maintenance.
- The text in all the diagrams in the manuscript is very small and not conducive to reading.
- The text size in all Figures has been increased, where possible, to improve clarity.
Citation: https://doi.org/10.5194/essd-2022-174-AC1