Improved BEC SMOS Arctic Sea Surface Salinity product v3.1

Measuring salinity from space is challenging because the sensitivity of the brightness temperature (TB) to sea surface salinity (SSS) is low (about 0.5 K/psu), while the SSS range in the open ocean is narrow (about 5 psu, if river discharge areas are not considered). The full TB signal associated with open-ocean SSS variability thus spans only about 2-3 K, which imposes a high accuracy requirement on the radiometer. Moreover, the sensitivity of TB to SSS in cold waters is even lower (about 0.3 K/psu), making the retrieval of SSS in cold waters even more challenging. Because of this limitation, ESA launched a specific initiative in 2019, the Arctic+Salinity project (AO/1-9158/18/I-BG), to produce an enhanced Arctic SSS product with better quality and resolution than the available products. This paper presents the methodologies used to produce the new enhanced Arctic SMOS SSS product (Martínez et al., 2020). The product consists of 9-day averaged maps on an EASE 2.0 grid.


Introduction
Changes in the Arctic Ocean freshwater distribution may be linked to changes in the thermohaline circulation, which in turn may have implications for the global climate (Manabe, 1995). Thus, it is critical to understand the mechanisms of freshwater exchange between the Arctic and the global ocean. However, in situ surface salinity measurements are very scarce, especially in the central Arctic Ocean, since it is a region with extreme weather conditions where sea ice forces are strong enough to destroy in situ measurement infrastructure (such as Argo floats, moorings, or gliders).
The use of L-band radiometry to fill the observational salinity gaps at high latitudes could be very useful for better monitoring the observed changes in the freshwater fluxes (Fournier et al., 2020). In 2009, the ESA SMOS (Soil Moisture and Ocean Salinity) satellite mission was launched. It was the first satellite carrying an L-band radiometer enabling the measurement of ocean sea surface salinity.
The SMOS frequency band (1.43 GHz, L-band) is optimal for measuring salinity, since this part of the electromagnetic spectrum is protected against human electromagnetic emissions, while the sensitivity to salinity is high.
The SMOS standard SSS retrieval algorithm (Mecklenburg et al., 2009; Olmedo et al., 2021), as well as the algorithms used for SSS retrieval from Aquarius and SMAP data (Tang et al., 2017, 2020), provide in general good estimates of SSS in the open ocean and within the tropical and mid-latitudes (Reul et al., 2020).
However, SSS retrievals from the currently operating L-band radiometer satellites present serious problems at high latitudes:
- Low sensitivity of brightness temperature (TB) to salinity in cold waters: While the sensitivity to salinity is high at L-band, it decreases rapidly in cold waters. As shown in Yueh et al. (2001), the sensitivity drops from 0.5 to 0.3 K/psu when the sea surface temperature (SST) decreases from 15 to 5 °C. Therefore, the errors of the SSS in cold waters are larger than in temperate seas.
- Land-sea contamination (LSC) and ice-sea contamination (ISC): Sharp transitions of TB values between land and sea, or ice and sea, induce a contamination of the signal which is especially important (both in amplitude and spatial range) in the case of SMOS, due to its large footprint on the ground. Although this contamination is linked to the characteristics of the instrument, it is also present in SMAP and its predecessor, Aquarius. In the case of SMOS, this type of contamination affects ocean observations very far from the coast and the ice.
- Lack of in situ measurements: The limited number of in situ SSS measurements in the Arctic is a major limitation for the validation, since the measurements are not evenly distributed, so that some regions clearly lack them.
In the framework of the ESA project Arctic+Salinity (AO/1-9158/18/I-BG), 9 years (2011-2019) of an enhanced SMOS SSS product have been produced. BEC distributes Level 2 maps and Level 3 maps of 3, 9, and 18 days on an EASE 2.0 grid of 25 km. The major changes in the algorithms have focused on improving the effective spatial and temporal resolution of the product, allowing better monitoring of mesoscale structures and river discharges. The algorithms used for the generation of this new product are detailed in the Algorithm Theoretical Baseline Document (ATBD) of the Arctic+Salinity product. This paper describes the data sets used for the generation of the product and its validation (section 1.1), the algorithms developed to generate the new product (section 2), and the validation performed to assess the quality of the product, namely: i) comparison with in situ measurements, ii) spectral analysis, and iii) error characterization using triple collocation with SMAP data (section 3).

SMOS Brightness Temperatures
The computation of TB starts from the ESA Level 1B v620 product. This data set is freely available with prior registration at https://earth.esa.int/eogateway/missions/smos/data. The L1B product contains the TB Fourier components arranged in a time-ordered way according to the integration time.
1.1.2 Auxiliary data used in the salinity retrieval

Geophysical variables from the European Centre for Medium-Range Weather Forecasts (ECMWF)

The auxiliary information is provided by ECMWF (Sabater & De Rosnay, 2010), collocated in time and space with each SMOS orbit. The data are provided on the Icosahedral Snyder Equal Area 4H9 (ISEA 4H9) grid (Matos et al., 2004). We use nearest-neighbor interpolation to get the values on the custom TB grid.

The sea ice concentration (SIC) product from OSISAF is used to discard the grid points contaminated by sea ice. We use the SIC product versions OSI-450 and OSI-430-b (Lavergne et al., 2019), developed and processed in the context of the OSISAF (http://www.osi-saf.org/) of the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) and of the Climate Change Initiative (CCI) Programme of the European Space Agency (ESA). The grid of these products is the EASE-Grid 2.0 with a spatial resolution of 25 km.

Annual SSS and SST climatology
The World Ocean Atlas 2018 (WOA 2018-A5B7) annual SSS and SST climatologies at 0.25° × 0.25° spatial resolution (Zweng et al., 2018) are used in the inversion algorithm. The SSS and SST are used as inputs to the Meissner and Wentz dielectric constant model to obtain TB (Meissner & Wentz, 2004). The obtained TB value is taken as the reference to perform the spatial bias correction of the measured TB. WOA 2018-A5B7 (generated from measurements of the 2005-2017 period) has been used to be as consistent as possible with the SMOS lifetime.

Global Ocean Forecasting System
The Global Ocean Forecasting System (GOFS) 3.1 (HYbrid Coordinate Ocean Model (HYCOM) + Navy Coupled Ocean Data Assimilation (NCODA)) sea surface salinity product is used as the reference to perform the temporal correction. The spatial resolution of this product is 0.08° in longitude and 0.04° in latitude for polar regions and corresponds to the GLBv0.08 grid (Cummings, 2005; Cummings & Smedstad, 2013). The data can be downloaded from https://www.hycom.org/data/glbv0pt08.

ARGO dataset
Argo float data (Argo, 2018) are commonly used for the validation of SSS from SMOS and SMAP data (Tang et al., 2017; Olmedo et al., 2021). However, this is complicated in the Arctic region. Argo data are very scarce and Argo profilers are concentrated in the Atlantic region (due to bathymetry and geographical restrictions of the ocean circulation), providing a biased sample of the mean SSS in the Arctic. There is a lack of Argo profilers in the Bering, Beaufort, East Siberian, Laptev, Kara, Barents, and North seas, and also in Hudson Bay and Baffin Bay (see figure 7).
The following quality control is applied to the Argo SSS values:
- The cut-off depth for Argo profiles is taken between 5 and 10 m.
- Profiles included in the grey list (i.e., floats which may have problems with one or more sensors) are discarded.
- Argo float profiles with anomalies larger than 10 °C in temperature or 5 psu in salinity with respect to WOA 2013 are discarded.

Finally, only profiles having a temperature between -2.5 and 40 °C and a salinity between 2 and 41 psu close to the surface are used.

The generation of the improved, higher spatial resolution salinity maps follows several steps, which are described below. A scheme of the whole algorithm is shown in figure 1 (for more details, the reader is referred to the ATBD of the Arctic+Salinity project).
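As an illustration, the Argo quality-control criteria above can be collected into a single filter. This is only a sketch with hypothetical field names (`float_id`, `depth_m`, etc.), not the operational BEC code:

```python
def argo_profile_is_valid(profile, woa_temp, woa_salt, grey_list):
    """Return True if a near-surface Argo sample passes the QC criteria."""
    if profile["float_id"] in grey_list:          # grey-listed sensors
        return False
    if not (5.0 <= profile["depth_m"] <= 10.0):   # cut-off depth 5-10 m
        return False
    # anomalies with respect to the WOA 2013 climatology
    if abs(profile["temp_c"] - woa_temp) > 10.0:
        return False
    if abs(profile["salinity_psu"] - woa_salt) > 5.0:
        return False
    # physically plausible near-surface ranges
    if not (-2.5 <= profile["temp_c"] <= 40.0):
        return False
    if not (2.0 <= profile["salinity_psu"] <= 41.0):
        return False
    return True
```

A profile is kept only if it passes every criterion; any single failure discards it.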

Geolocation and projection of the Brightness Temperatures
As in the standard procedure (Anterrieu et al., 2002), we apply a Blackman window to the Fourier components in order to reduce the Gibbs-like contamination. The TB is obtained by applying an inverse Fourier transform to the resulting components. Once the measured TB(ξi, ηi) are known at all 64 × 64 FOV points, the Earth grid is generated and the points of this grid are retroprojected to the SMOS antenna coordinate reference system for each SMOS snapshot. In this v3.1 product, we have computed the TBs directly on the final grid of the salinity map products. This choice has been made to keep the maximum information on the salinity gradients, without losing resolution due to interpolation errors.

The galactic and sun glint contributions are computed following the models described in Tenerelli et al. (2008) and Reul et al. (2007), respectively. We use the roughness model developed by BEC (Guimbard et al., 2012). The auxiliary data provided by ECMWF are collocated with the SMOS measurements and used to evaluate the models. Finally, we compute the TB corresponding to the flat-sea contribution (TB_flat^H and TB_flat^V) by subtracting the rest of the contributions. The polarizations of the SMOS TBs are affected by the ionosphere, and they can be corrected following Zine et al. (2008). Nevertheless, the ionospheric effect is a rotation between the TB polarizations that leaves unaltered the first Stokes parameter (I = TB_x + TB_y), which is the parameter used to perform the TB inversion.

Debiasing
The debiased non-Bayesian retrieval method was introduced in Olmedo et al. (2017) to retrieve salinity from SMOS TB.
Salinity retrieval is performed by means of a minimization of the difference between the measured first Stokes parameter and the modelled one. This minimization follows a non-Bayesian scheme, i.e., an SSS value is retrieved for each TB measurement.

The new Arctic salinity product also retrieves salinity using a non-Bayesian scheme, but introduces important changes.
The SSS debiasing approach employed in the previous version of the BEC Arctic SMOS SSS product (v2.0; Olmedo et al., 2018) started from a long-term time series (8 years) of SSS retrievals. These SSS retrievals were grouped according to their geographical location, incidence angle, distance to the center of the swath (across-track distance), and the satellite overpass direction. The salinity values were accumulated in classes for each group, obtaining a discrete salinity distribution function for each one. The characteristic value of these distributions can be considered an SMOS-based salinity climatology; the mean around the mode of each distribution was chosen as its characteristic value. To mitigate the local time-independent biases, the corresponding SMOS climatology is subtracted from each measured SSS value and replaced by an annual SSS reference.
The World Ocean Atlas 2013 (WOA) (Zweng et al., 2013) was the reference of choice for v2.0. This method mitigates biases like those caused by land-sea contamination or permanent Radio Frequency Interference (RFI) sources.

In this work, we aim at improving the algorithm in two specific points. First, the non-homogeneous division of the antenna FOV into groups of incidence angles and across-track distances leads to a different statistical representativeness for different points of the antenna, resulting in a non-optimal resolution of the final climatological salinity product. This has been improved by introducing a homogeneous discretization of the Extended Alias-Free Field of View (EAF-FOV) in ξη coordinates. Secondly, the non-linearity of the L-band dielectric models at very low salinities amplifies the errors of the retrievals at low salinity values. Since the retrieval procedure propagates systematic errors from the TB value to the resulting SSS, the debiasing is applied at TB level and not at SSS level, so as to mitigate these effects as much as possible (more details can be found in Martínez et al. (2020)).
The interferometric nature of SMOS divides the SMOS antenna FOV into a hexagonal grid. Accordingly, the antenna FOV has been homogeneously divided into hexagonal cells that cover the same antenna area. To ensure that a sufficiently large number of measurements is accumulated in each hexagonal cell to compute a reliable SMOS-based climatology, each cell contains 7 points of the original 64 × 64 FOV grid.
To perform the spatial bias correction, the measured first Stokes parameter is grouped into discrete distributions according to the antenna cell in which it has been acquired, its geographical location on the Earth, and the satellite flight direction. Then, the SMOS-based climatology is subtracted from the individual measurements of the first Stokes parameter, and the annual reference is added. Since no brightness temperature reference exists from the World Ocean Atlas, it is computed starting from the WOA 2018 SSS and SST (Zweng et al., 2018) using the Meissner and Wentz dielectric model.

SMOS-based climatology
The SMOS-based climatology has been computed using the SMOS TB data from the 2013-2019 period (both years included). We discarded the years 2011 and 2012 because of the strong RFI contamination in the early years of the mission.

The SMOS-based climatology is built by computing the distribution of the first Stokes parameter separately for ascending and descending orbits. The histograms are created by accumulating valid measurements in bins of 1 K for each 25 km EASE-Grid 2.0 North grid point (x and y coordinates) and FOV coordinates (ξ and η). Only latitudes beyond 50° N are considered.
We apply the following filtering criteria over I before computing the climatology:
- Only measurements in the range 75 K < I_flat^meas < 165 K are considered.
- Measurements acquired with ECMWF SIC values larger than 0.05 are discarded.
- The Tukey (1977) rule is used to detect outliers: measurements ∆ satisfying one of the conditions

  ∆ < Q1 − 1.5 · IQR   or   ∆ > Q3 + 1.5 · IQR

  are considered outliers. Here IQR = Q3 − Q1, and Q1 and Q3 are the first and third quartiles, respectively. Outlier detection is implemented in two stages:
  - We compute the linear regression from all the TB(θ) measurements for a given geographical point obtained at different incidence angles (θ) in a given orbit. Outliers are detected by computing the difference between this linear regression and each individual measurement (∆).
  - The rule is also applied to each I_flat^meas(x, y, ξ, η) distribution.
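The two-stage outlier detection can be sketched as follows; `outliers_vs_incidence_fit` corresponds to the first stage (Tukey rule on the residuals of a linear fit of TB versus incidence angle), and `tukey_outliers` alone corresponds to the second stage. Function names are illustrative, not from the BEC processor:

```python
import numpy as np

def tukey_outliers(delta):
    """Tukey (1977) rule: flag values outside [Q1 - 1.5 IQR, Q3 + 1.5 IQR]."""
    q1, q3 = np.percentile(delta, [25, 75])
    iqr = q3 - q1
    return (delta < q1 - 1.5 * iqr) | (delta > q3 + 1.5 * iqr)

def outliers_vs_incidence_fit(theta, tb):
    """Stage 1: residuals of a linear fit TB(theta), then the Tukey rule."""
    slope, intercept = np.polyfit(theta, tb, 1)
    residuals = tb - (slope * theta + intercept)
    return tukey_outliers(residuals)
```

A single strongly deviant measurement along an otherwise linear TB(θ) sequence is flagged, while the regular incidence-angle dependence is not.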
Once the valid measurements are selected, for each acquisition condition γ = (x, y, ξ, η, d), with d the orbit direction, we compute the histogram of I_flat^meas. The histograms are created by adding only valid measurements in bins of 1 K for each 25 km EASE-Grid 2.0 North grid point (x and y coordinates), including the measurements collected in a 75 × 75 km square. The measurements are grouped for each FOV cell composed of seven unshared ξ and η coordinates. Then, for each γ, we compute the following statistical parameters: frequency, mean, median, interquartile range, and the second, third, and fourth central moments. The representative value used afterwards for the debiasing, namely the SMOS-based climatology I_γ^c, is taken, as in the previous version of the debiasing method, as the mean around one standard deviation from the mode of I_flat(γ).
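The characteristic value just described (the mean of the samples within one standard deviation of the histogram mode) could be computed along these lines. This is a sketch; the exact binning and trimming of the operational processor may differ:

```python
import numpy as np

def climatology_value(samples, bin_width=1.0):
    """Mean of the samples lying within one standard deviation of the
    histogram mode (1 K bins, as in the text)."""
    edges = np.arange(samples.min(), samples.max() + bin_width, bin_width)
    hist, edges = np.histogram(samples, bins=edges)
    imax = np.argmax(hist)
    mode = 0.5 * (edges[imax] + edges[imax + 1])  # center of the peak bin
    std = samples.std()
    near_mode = samples[np.abs(samples - mode) <= std]
    return near_mode.mean()
```

For a roughly symmetric, unimodal distribution this value is close to the mean, but it is much more robust to asymmetric tails (e.g., residual RFI) than the plain mean.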

Inversion
Once the systematic errors of the measured flat-sea I_flat^meas are corrected, the SSS retrieval can be performed by using a dielectric constant model. The flat-sea emissivity is described by the Fresnel reflection law, which is a function of the incidence angle of the radiation θ and of the dielectric coefficient ε, which depends on the SST, the frequency, and the conductivity, which in turn depends on the salinity.
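The flat-sea forward model reduces to the Fresnel reflection coefficients once the dielectric constant is known. The sketch below assumes ε is supplied by an external dielectric model; the value in the test is only an order-of-magnitude L-band seawater number, not a Meissner and Wentz output, and roughness, atmospheric, and sky contributions are ignored:

```python
import numpy as np

def fresnel_first_stokes(eps, theta_deg, sst_k):
    """Specular (flat-sea) first Stokes parameter I = TB_H + TB_V from the
    Fresnel reflection coefficients, given a complex dielectric constant
    eps, the incidence angle (deg) and the surface temperature (K)."""
    theta = np.radians(theta_deg)
    cos_t = np.cos(theta)
    root = np.sqrt(eps - np.sin(theta) ** 2)
    r_h = (cos_t - root) / (cos_t + root)              # H-pol reflection
    r_v = (eps * cos_t - root) / (eps * cos_t + root)  # V-pol reflection
    e_h = 1.0 - np.abs(r_h) ** 2                       # emissivities
    e_v = 1.0 - np.abs(r_v) ** 2
    return sst_k * (e_h + e_v)                         # TB_H + TB_V
```

Because salinity changes the conductivity and hence ε, this forward model is what links the first Stokes parameter to SSS in the inversion described next.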

In the L-band range, the most commonly used dielectric constant models are the Klein and Swift (KS) model (Klein & Swift, 1977) and the Meissner and Wentz (MW) model (Meissner & Wentz, 2004). These dielectric models are based on a Debye relaxation law (Debye, 1929) with a conductivity term. The MW model interpolates the dielectric constant as a function of salinity between 0 and 40 psu and provides values for ocean surface temperatures between −2 °C and 29 °C. The KS model was adjusted using a discrete set of measurements at 5 °C, 10 °C, 20 °C, and 30 °C, and is reported by its authors to be valid in the range of 4-35 psu. However, the KS model has a very problematic behaviour below 5 °C (Zhou et al., 2017; Dinnat et al., 2019), so we have used the MW model to derive the high-latitude SSS. It has been reported that the SST-dependent bias in the retrieved SSS in cold waters could be mainly due to deficiencies in the sea water dielectric constant models (Dinnat et al., 2019). The MW model is not free from this pathology, but the large errors that the KS model produces below 5 °C reinforce the choice of MW as the sea water dielectric model.

We obtain the salinity value by minimizing the following cost function:

F(SSS) = (I_flat^meas − I_flat^mod(SSS))²,    (1)

with I_flat^mod the first Stokes parameter described by the models. We use the Newton-Raphson method (Press et al., 1992) to find the SSS value that minimizes F. We apply the following filtering criteria before the inversion:

TB filtering approach
- Values of I_flat^meas too close to the belts and suspenders (closer than 0.025 antenna units) are discarded (see figure 2).
-Points affected by Sun tails or reflected Sun circle are also discarded.
- The I_flat^meas values considered as outliers during the SMOS-based climatology computation (section 2.3.1) are discarded as well.

Additionally, TB values are discarded if the SMOS-based climatology used for their bias correction is a moderately non-normal distribution (kurtosis and skewness conditions according to West et al. (1995)), if the climatology has been computed using a low number of measurements, or if the standard deviation of the climatological distribution is too high close to the coast (suspicion of residual land-sea or persistent RFI contamination). These conditions are summarized in the following rules:

- The minimum number of measurements to create the SMOS-based climatology must be 100.
- The absolute value of the kurtosis must be less than 7.
- The absolute value of the skewness must be less than 2.
-For points located less than 100 km from the coast, only those having a standard deviation less than 8 K are taken into account.

To ensure minimum ice-sea contamination, all points having SIC > 0 according to the Sea Ice Climate Change Initiative products OSI-450 and OSI-430-b (Lavergne et al., 2019) are excluded from the minimization process. The distance to the ice edge (defined by the line SIC = 0) is also stored for the same purpose (minimizing the ice-sea contamination by avoiding points too close to the ice in the L3 map generation).

Minimization and error propagation
The I_flat^meas values accepted after the filtering process are debiased by subtracting the SMOS climatological value and adding the WOA-derived climatological value, and are then introduced in the minimization function. We apply the following convergence criteria in the iterative scheme:
- The change in salinity values between two consecutive iterations is less than 0.001.
-The percentage of variation in the cost function between consecutive steps is less than 1.

- The above two conditions are fulfilled during 5 consecutive iterations, to avoid oscillatory solutions.
- The above conditions are met in fewer than 150 iterations.
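The minimization can be sketched as a numerical Newton-Raphson iteration on the cost function. The forward model in the test is a toy linear TB(SSS) relation with a hypothetical -0.3 K/psu sensitivity, not a real dielectric model, and the 5-consecutive-iterations check is omitted for brevity:

```python
def retrieve_sss(i_meas, forward_model, sss0=33.0, tol=1e-3, max_iter=150):
    """Newton-Raphson minimisation of F(SSS) = (I_meas - I_mod(SSS))**2.

    tol and max_iter follow the convergence criteria in the text (0.001 psu
    change, at most 150 iterations).  forward_model(sss) returns the
    modelled flat-sea first Stokes parameter in kelvin.
    """
    h = 1e-4  # step for numerical derivatives
    f = lambda s: (i_meas - forward_model(s)) ** 2
    sss = sss0
    for _ in range(max_iter):
        d1 = (f(sss + h) - f(sss - h)) / (2 * h)            # F'(SSS)
        d2 = (f(sss + h) - 2 * f(sss) + f(sss - h)) / h**2  # F''(SSS)
        step = d1 / d2
        sss -= step
        if abs(step) < tol:  # change in SSS below 0.001
            break
    return sss
```

With a nearly linear forward model the cost function is close to quadratic and the scheme converges in very few iterations.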
The propagation of the TB radiometric error to salinity is made by performing the minimization of two additional quantities:

I_flat^meas± = I_flat^meas ± sqrt(σ_H² + σ_V²),

where σ_[H,V] is the radiometric accuracy of TB_flat^[H,V]. Then, the salinity error is computed from:

ε_SSS = |SSS^meas+ − SSS^meas−| / 2,

where SSS^meas+ and SSS^meas− are obtained by the inversion of I_flat^meas+ and I_flat^meas−, respectively.
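This error propagation can be sketched as follows; the perturbed quantities I ± sqrt(σ_H² + σ_V²) and the half-spread error are our reading of the truncated equations in the source, and `retrieve` stands for any SSS inversion routine:

```python
import math

def sss_error(i_meas, sigma_h, sigma_v, retrieve):
    """Propagate the radiometric accuracies of TB_flat^H and TB_flat^V to
    salinity: invert the perturbed first Stokes parameter and take half
    the spread of the two solutions."""
    sigma_i = math.sqrt(sigma_h ** 2 + sigma_v ** 2)
    sss_plus = retrieve(i_meas + sigma_i)
    sss_minus = retrieve(i_meas - sigma_i)
    return 0.5 * abs(sss_plus - sss_minus)
```

Because the sensitivity of I to SSS is only a few tenths of K/psu in cold waters, even a sub-kelvin radiometric error translates into a salinity error of several psu for an individual measurement, which is why the L2B and L3 averaging steps below weight by this error.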

SSS filtering approach
Once each orbit has been processed, we perform the inversion using the aforementioned MW dielectric model to create orbits that contain one salinity value for each measured TB. At this level, we obtain for each location as many salinity values as measurements have been inverted; thus, the salinity is a function of the incidence angle. To create this product we apply the following filtering criteria:
- SSS(x, y, ξ, η, d) < 0 and SSS(x, y, ξ, η, d) > 50 are discarded.
- In the case of SSS(x, y, ξ, η, d) < 25, we discard the retrieval when |SSS(x, y, ξ, η, d) − SSS_woa2018| > 21. We relax the criterion in this case to better capture the river discharges and melting episodes, which are not well described in WOA 2018.

Generation of SSS satellite overpasses
Prior to the L3 map generation, we create the L2B product. The L2B orbits provide salinity values independent of the antenna acquisition point and are generated from L2A snapshots by a weighted average of all the measurements obtained for a given grid point. Outlier detection is performed in a similar way as for the TB: a linear regression is computed from all the incidence-angle-sorted SSS values of the same orbit obtained for a given geographical point, and the outliers are detected, according to the Tukey (1977) rule, from the difference between this linear regression and each individual measurement. L2B SSS values are only computed for those grid points containing more than 12 unfiltered L2A retrieved SSS values.
Taking the weight function as the inverse of the squared error of each L2A SSS measurement, we ensure that measurements coming from TB with a high radiometric error have a small influence on the obtained L2B SSS value.
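The inverse-squared-error weighting can be written compactly; a sketch, with the more-than-12-retrievals threshold from the text:

```python
import numpy as np

def l2b_sss(sss_values, sss_errors, min_count=12):
    """Inverse-squared-error weighted L2B salinity for one grid point.
    Returns None unless more than min_count unfiltered L2A values exist."""
    sss = np.asarray(sss_values, dtype=float)
    err = np.asarray(sss_errors, dtype=float)
    if sss.size <= min_count:
        return None
    w = 1.0 / err ** 2          # noisy retrievals get small weights
    return float(np.sum(w * sss) / np.sum(w))
```

A single retrieval with a very large propagated error barely moves the weighted mean, while well-constrained retrievals dominate it.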

Temporal Bias correction
The TB debiasing procedure only accounts for the spatial biases, so a temporal correction is also required. The correction implemented in the previous version operates over L3 maps and is based on Argo profiles: the median of the differences between the collocated L3 SSS and the Argo measurements available for the 9-day period of each map is removed from the corresponding 9-day SSS map. Since we also need to perform a time correction on L2 SSS orbits, and the amount of Argo data available for each orbit is insufficient, we propose to perform the temporal bias correction using the Global Ocean Forecasting System (GOFS) 3.1 (HYCOM + NCODA) as reference. The correction is based on the iterative scheme presented in figure 3 and detailed as follows:
- Step 1: We start by subtracting 12 psu from all SSS retrievals (at grid-point level). This intends to reduce the number of iterations in the loop and to improve the convergence. The value of 12 psu was found to be the best one after several retrieval processings.
- Step 2: We apply the filtering criteria described in section 2.4.3 and average the corresponding filtered SSS retrievals in each grid point to generate the satellite overpass (as described in section 2.5).
- Step 3: We update the temporal correction value with the mean difference between each value of the SSS (at orbital level) and the collocated HYCOM salinity.
- Step 4: We add this temporal correction to each SSS(x, y, ξ, η, d) at snapshot level.
- We repeat Steps 2-4 until the difference between two consecutive corrections is lower than 0.01 psu.

Only orbits providing at least 50 common grid points with HYCOM are considered. Because HYCOM provides excessively salty values in the river mouths, only grid points having a retrieved salinity value above 25 psu and an error below 2.5 psu are considered to compute the temporal correction.

Figure 3. Scheme of the iterative procedure used to correct the temporal salinity bias at level 2.
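The iterative scheme can be sketched as a simple fixed-point loop. `filter_and_grid` is a hypothetical helper standing for the filtering and overpass-gridding steps (sections 2.4.3 and 2.5):

```python
import numpy as np

def temporal_correction(sss_snapshots, hycom_sss, filter_and_grid,
                        tol=0.01, first_guess=-12.0):
    """Iterative temporal bias correction (sketch of Steps 1-4).

    filter_and_grid maps corrected snapshot SSS values to gridded overpass
    values collocated with HYCOM; the loop stops when two consecutive
    corrections differ by less than 0.01 psu."""
    correction = first_guess                  # Step 1: -12 psu start value
    while True:
        gridded = filter_and_grid(sss_snapshots + correction)   # Step 2
        # Step 3: update with the mean SMOS-minus-HYCOM difference
        new_correction = correction - np.nanmean(gridded - hycom_sss)
        if abs(new_correction - correction) < tol:
            return new_correction
        correction = new_correction           # Step 4: apply and repeat
```

When `filter_and_grid` is close to linear in the correction, the loop converges in very few iterations, which is the point of the -12 psu first guess.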

Correction of the residual spatial bias
As described in section 2.3, the debiasing method is based on the substitution of the SMOS-based TB climatology by the SSS reference from the WOA 2018 (Zweng et al., 2018). With that method, a first-order spatial correction is performed.
After the debiasing, the average salinity obtained for the period used to compute the SMOS-based climatology still presents a residual spatial bias. In order to mitigate this residual spatial bias, we compute an anomaly spatial map by applying the following algorithm to all the L2B orbits in the 2013-2019 period (shown in figure 4):

- We compute the difference between the SMOS overpass salinities and the ones provided by WOA 2018.
- We generate two anomaly files: one for ascending and one for descending passes (they show differences of up to ±1.2 psu).
After the creation of these two anomaly maps, every L2B salinity value is corrected by subtracting this spatial anomaly. The salinity values corresponding to geographical points where the anomaly was computed using fewer than 50 collocations are discarded.
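The residual spatial correction amounts to a per-grid-point average of the SMOS-minus-WOA differences, computed separately for each pass direction and masked where fewer than 50 collocations exist. A minimal sketch, assuming a stack of per-orbit difference maps with NaN where there is no retrieval (the array layout is hypothetical):

```python
import numpy as np

def residual_spatial_anomaly(diff_stack, min_collocs=50):
    """Anomaly map from a stack of per-orbit SMOS-minus-WOA difference
    maps for one pass direction.  diff_stack has shape (n_orbits, ny, nx);
    poorly sampled points are masked with NaN."""
    counts = np.sum(~np.isnan(diff_stack), axis=0)
    anomaly = np.nanmean(diff_stack, axis=0)
    anomaly[counts < min_collocs] = np.nan
    return anomaly
```

Each L2B salinity is then corrected by subtracting the anomaly of its grid point; where the anomaly is NaN, the value is discarded, as stated above.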

Generation of L3 SSS maps
L3 maps are generated daily for 9-day periods. Each map is obtained by a weighted average of all the SSS values in all the overpasses of the 9-day period. Each L2B salinity value is weighted according to its salinity error, as described in section 2.4.2.

Quality Assessment
The validation of the product is done using different in situ measurements; the results are described in detail in the Product Validation Report (Arias et al., 2020), which can be found on the project web page (https://arcticsalinity.argans.co.uk/documentation/).
We recall some important aspects of this product and its validation procedure:
- The validation in the Arctic region is rather complex due to the heterogeneity of the in situ datasets, with a lack of temporal or spatial synopticity. Some of the sources only represent specific regions, with the risk of inducing a bias by spatial selection when assessing the product as a whole. Others have a larger spatial representation, but still lack proper temporal variability. None of the datasets used can describe both aspects simultaneously; thus, the validation requires an exhaustive analysis of the results to assess the quality of the product.
- The Arctic+ SSS v3.1 product introduces an improvement in the number of SSS retrievals obtained from SMOS TB.
- The new product benefits from a polar grid in EASE v2.0 format, which is a standard for research and operations in the region, improving its usability.
- The Arctic+ SSS v3.1 product has been built using only WOA 2018 and HYCOM model output, without using Argo data.
- The validation of the product using Argo floats in the Arctic is only valid for the Greenland and Norwegian seas regions, where the Argo floats are present. However, a comparison with point measurements cannot be used to evaluate the improved data coverage or the improved spatial resolution. Thus, the Argo analysis cannot be used alone to describe the quality of this product.
- It is also important to highlight that, when inter-comparing satellite-based SSS products, there is a need to focus on the selected projections and grids. A fair set of metrics for inter-comparison is only possible over common points resolved at the same spatial scales. This means that the quality control for these products requires setting the products onto the same spatial grids and projections. Otherwise, significant errors may be introduced artificially in the metrics, products may be penalized because of differences in the sampling strategies, and the match-up databases do not yield the same points and hence the same information.

Even though the Argo floats are very scarce in the Arctic and are mainly in the Atlantic and Pacific regions (see figure 7), we have used Argo SSS data to assess the biases and the standard deviations of the errors of the new SSS products, with the caveat that this analysis is only valid for the regions where Argo floats are present, and assuming that Argo values represent the ground truth for the whole pixel.
The methodology followed to perform the temporal and spatial collocation between the Argo SSS and the SSS maps is the following: for a given in situ point, the closest satellite point is searched both in time and in space, within a radius of 25 km from the in situ measurement and a maximum period of 9 days in time. This strategy leads to some repetition in the use of the in situ data points for different maps, but never over the same daily product. This has been deemed the most solid strategy, as it maximizes the quantity of in situ information available to validate the satellite products. The comparison gives RMSD = 0.29 and correlation R = 0.97. The standard deviation is larger than that of the previous BEC v2.0 product, but this is expected, since the BEC Arctic v2.0 product was generated by performing an objective analysis with correlation radii of 321 km, 267 km, and 175 km (Olmedo et al., 2016). The large correlation radii produce a large smoothing effect, reducing the noise.
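The collocation criterion (closest satellite grid point within 25 km and 9 days) can be sketched as follows, with hypothetical inputs (times in days, great-circle distances via the haversine formula):

```python
import numpy as np

def collocate(argo_lat, argo_lon, argo_time, grid_lat, grid_lon, map_time,
              radius_km=25.0, max_days=9.0):
    """Index of the nearest satellite grid point within 25 km and 9 days
    of an Argo measurement, or None if no point qualifies."""
    if abs(argo_time - map_time) > max_days:
        return None
    r_earth = 6371.0
    lat1, lon1 = np.radians(argo_lat), np.radians(argo_lon)
    lat2, lon2 = np.radians(grid_lat), np.radians(grid_lon)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = np.sin(dlat / 2) ** 2 + np.cos(lat1) * np.cos(lat2) * np.sin(dlon / 2) ** 2
    dist = 2.0 * r_earth * np.arcsin(np.sqrt(a))     # haversine distance (km)
    i = int(np.argmin(dist))
    return i if dist.flat[i] <= radius_km else None
```

At high latitudes the haversine distance (rather than a plain lon/lat box) matters, because a fixed longitude interval shrinks rapidly towards the pole.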

However, the smoothing results in a reduction of the spatial resolution, which is significantly improved in the v3.1 product (see section 3.3). The BEC v3.1 product thus contains more dynamic information than v2.0.
The degraded quality of the data in the first two years (2011 and 2012) is due to a strong RFI source located in Greenland, which highly contaminated the measurements; these could not be corrected during the processing step, so we recommend users to avoid the data from those two years. Later on, by the end of 2012, the main source of RFI was shut down, permitting higher quality measurements (Oliva et al., 2016).

Comparison with TARA dataset
TARA salinity data present a large range of spatial variability of salinity, between 26 and 35 psu, in the Arctic Ocean (see figure 9). The mean, the standard deviation (STDD), the root mean square difference (RMSD), and the correlation (R) are computed for all the residuals of the collocated points between TARA and the Arctic+ v3.1 product (Table 1); further details are given in the Product Validation Report (Arias et al., 2020).
Nevertheless, the metrics with TARA are not simple to interpret due to the lack of synopticity of the dataset (acquired over a relatively long period of time, i.e., representing different observational conditions) and the lack of spatial homogeneity of the sampling, which explains the relatively large variability observed in the metrics for the different sub-basins.

Error characterization by Correlated Triple collocation
Triple Collocation (TC) is a method originally introduced by Stoffelen (1998) to provide estimates of the measurement error variances of three systems measuring the same variable at the same time. TC is based on the statistical relations between the measurement variances and covariances to deduce the error variances of each measurement. TC requires a long enough series of collocated triplets of measurements to obtain reasonable estimates of their second-order moments.
Besides, it is usually required that the three measurement systems are completely independent, with different space-time acquisition scales, and thus the so-called representativity error must be properly accounted for (Stoffelen, 1998; Hoareau et al., 2018).
Recently, a variant of TC especially adapted to deal with remote sensing measurements has been introduced: the Correlated Triple Collocation (CTC) (González-Gambau et al., 2020). When applying CTC, the data are assumed to have the same space-time sampling, that is, they represent the same spatial and temporal scales. In contrast with standard TC, it is assumed that two of the datasets can have correlated errors (for instance, because they are derived from the same basic measurement system). Besides, considering that remote sensing series are typically not too long, CTC is optimized to provide reasonably good estimates of the error variances even with a limited number of samples. Under those conditions, CTC can be used to obtain maps of error variances of triplets of remote sensing SSS maps and to obtain, for example, a different map for every year.
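For reference, the classical TC estimator (Stoffelen, 1998) can be written directly from the sample covariances, as below. Note that this is the standard, uncorrelated-errors variant; the CTC estimator of González-Gambau et al. (2020) modifies these expressions and is not reproduced here:

```python
import numpy as np

def tc_error_variances(a, b, c):
    """Classical triple collocation: error variances of three collocated
    series measuring the same signal, assuming mutually uncorrelated
    errors (and, for simplicity, no relative calibration scaling)."""
    cov = np.cov(np.vstack([a, b, c]))
    var_a = cov[0, 0] - cov[0, 1] * cov[0, 2] / cov[1, 2]
    var_b = cov[1, 1] - cov[0, 1] * cov[1, 2] / cov[0, 2]
    var_c = cov[2, 2] - cov[0, 2] * cov[1, 2] / cov[0, 1]
    return var_a, var_b, var_c
```

Each cross-covariance estimates the variance of the common signal, so subtracting the appropriate ratio from each dataset's total variance isolates its error variance.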

The correlated triple collocation analysis helps to properly assess the differences between the derived satellite products. Figure 10 shows the estimated error standard deviation for each of the three datasets. The differences between the products are shown in Figure 11. Over the majority of the Arctic, BEC v3.1 has the smallest error, except in some specific regions where BEC v2.0 is better (Hudson Bay, the east coast of Greenland, and the Kara Sea). JPL 4.2 is in all cases the product with the largest error.

Spectral Analysis
The analysis of spectral slopes provides information about the effective spatial resolution of the different remote sensing datasets. Theoretical studies have reported that Power Density Spectra (PDS) slopes are expected to range between -1 and -3, depending on the dynamical regime that drives the ocean (Blumen, 1978; Charney, 1978). Moreover, the presence of white noise makes the spectral slope tend to 0 (the log PDS vs. log wavenumber curve bends and becomes horizontal at high wavenumbers). In contrast, when the data are spatially over-smoothed, a systematic lack of energy appears at high wavenumbers and a faster decay of the spectral slope is observed for wavenumbers larger than the effective resolution threshold. In this analysis, we use a spectral slope of k^-2 (Blumen, 1978; Charney, 1978) as reference.
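To make the PDS-slope diagnostic concrete, the sketch below estimates the slope of a 1-D transect by a least-squares fit of log PDS vs. log wavenumber. The fit band, the 25 km grid spacing, and the synthetic k^-2 field are illustrative assumptions rather than the configuration used in this work:

```python
import numpy as np

def spectral_slope(transect, dx_km, band_km=(100.0, 500.0)):
    """Fit the log-log slope of the power density spectrum of a 1-D
    transect over a wavelength band (band_km, in km)."""
    sss = transect - transect.mean()
    pds = np.abs(np.fft.rfft(sss))**2 / sss.size
    k = np.fft.rfftfreq(sss.size, d=dx_km)           # cycles per km
    sel = (k >= 1.0 / band_km[1]) & (k <= 1.0 / band_km[0])
    slope, _ = np.polyfit(np.log10(k[sel]), np.log10(pds[sel]), 1)
    return slope

# Synthetic field with a k^-2 spectrum: amplitudes ~ k^-1, random phases
rng = np.random.default_rng(1)
n, dx = 2048, 25.0                                   # 25 km grid spacing
k = np.fft.rfftfreq(n, d=dx)
amp = np.zeros_like(k)
amp[1:] = k[1:]**-1.0
spectrum = amp * np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, k.size))
field = np.fft.irfft(spectrum, n=n)
slope = spectral_slope(field, dx)
print(f"fitted PDS slope: {slope:.2f}")              # close to -2
```

A noisier or over-smoothed field would bend the fitted slope toward 0 or below -3 at high wavenumbers, which is the signature used here to gauge effective resolution.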
The spectral analysis approach has been applied as in Hoareau et al. (2018), over three regions:

In contrast, the BEC SSS v2.0 PDS (magenta line) is able to consistently describe the geophysical structures down to a 250 km wavelength (PDS slope close to -2). At smaller scales there is a faster decay of the PDS slope, indicating a loss of signal, especially in the Nordic Seas and the Bering Strait, probably due to over-smoothing in the optimal interpolation algorithm.
Therefore, the Arctic+ v3.1 data contains the most consistent spatial representation at smaller scales, allowing a more accurate description of Arctic SSS processes. The SSS maps are produced here by averaging only in time (9 days), but not in space (keeping the same TB resolution).
Therefore, there is no loss of effective spatial resolution as compared to TB. This finer spatial resolution is one of the main advantages of this product, as shown by the spatial spectral analysis (Section 3.4). This new product is preferable for studies of Arctic Ocean SSS processes and dynamics. The Arctic+ SSS v3.1 product spans from 2011 to 2019 and consists of daily maps of 9-day averages on an equal-area grid at 25 km (EASE 2.0 grid). Maps with 3-day and 18-day averaging windows for the same period and grid are also served on the webpage, but those products do not have specific validation results. Furthermore, the swath SSS product (L2) used to generate the maps is also available on the BEC webpage.

The conclusions of the validation procedure are summarized in the following points:
- Arctic+ v3.1 has, in general, better skill in describing horizontal SSS gradients than BEC v2.0, with a better effective spatial resolution. This comes, however, at the price of an increased bias and a larger uncertainty in some regions.
- In the comparison with the TARA datasets, Arctic+ v3.1 shows good agreement with in situ data in some key areas, like the Beaufort Sea, which is one of the areas in the focus of the Arctic scientific community.

- The comparison with Argo shows that v3.1 has a slightly larger RMSD but a higher correlation with in situ data. The higher spatial resolution of v3.1 produces this larger RMSD when compared with point measurements.
- The introduction of correlated triple collocation also helps to properly assess the differences between the current (as of 2021) derived satellite products. The metrics show that the Arctic+ v3.1 dataset is, of the three products, the one with the lowest errors in general, except in Hudson Bay, the east coast of Greenland, and the Kara Sea. In particular, the triple collocation shows that the SMAP data yields the largest errors.
- The results of the spatial spectral analysis confirm that the Arctic+ v3.1 data contains a more consistent spatial representation at small scales than SMAP and BEC v2.0, allowing a more accurate description of Arctic SSS processes.
The maps are distributed on the standard EASE-Grid 2.0, which has a spatial resolution of 25 km. In addition to the product validated in this work (L3 with a temporal resolution of 9 days), L3 products with temporal resolutions of 3 and 18 days and an L2 product are available. These Arctic SSS products cover the period from 2011 to 2019.