Heuristic Approach to Multidimensional Temporal Assignment of Spatial Grid Points for Effective Vegetation Monitoring and Land Use in East Africa

Earth Syst. Sci. Data Discuss., https://doi.org/10.5194/essd-2018-18. Manuscript under review for journal Earth Syst. Sci. Data. Discussion started: 24 April 2018. © Author(s) 2018. CC BY 4.0 License.

Abstract. In this research, vegetation trends are studied to provide valuable information toward effective land use in the East African region, based on the Normalized Difference Vegetation Index (NDVI). Previously, testing procedures controlling the rate of false discoveries were used to detect areas with significant changes based on square regions of land. This paper improves the assignment of grid points (pixels) to regions by formulating the spatial problem as a multidimensional temporal assignment problem. Lagrangian relaxation is applied to the problem, allowing reformulation as a dynamic programming problem. A recursive heuristic approach with a penalty/reward function for pixel reassignment is proposed. This combined methodology not only controls an overall measure of combined directional and nondirectional false discoveries, but also makes the tests as powerful as possible by adequately capturing the spatial dependency present in the data. A larger number of regions is detected, while maintaining control of the mdFDR under certain assumptions. Data Link: https://figshare.com/s/ed0ba3a1b24c3cb31ebf DOI: https://figshare.com/articles/NDVI_and_Statistical_Data_for_Generating_Homogeneous_Land_Use_Recommendations/5897581


Introduction
Analysis of vegetation life cycles is fundamental in monitoring and planning agricultural endeavors and optimizing land use. In particular, gaining knowledge of current vegetation trends and using them to make accurate predictions is essential to minimize times of food scarcity and manage the consumption of natural resources in underdeveloped countries. Understanding the Earth's ecology and land cover is increasingly important as the impacts of climate change start to affect animal, plant, and human life. Vegetation trends are also closely related to sustainability issues, such as management of conservation areas and wildlife habitats, precipitation and drought monitoring, improving land usage for livestock, and finding optimum agricultural seeding and harvest dates for crops.
For this reason, there are many agencies and organizations that focus on the study of land use and land cover trends, linking them to climate change and the socioeconomic consequences of these changes. Clements et al. (2014) divided the study area into arbitrarily chosen square sub-regions of land. After creating such sub-regions, two-sided monotonic trend tests from Brillinger (1989) were used to identify significant increasing or decreasing monotonic vegetation changes based on these square regions of land. They demonstrated that this screening procedure controlled the mixed directional false discovery rate (mdFDR), which is defined as the expected proportion of Type I errors (false positives) and Type III errors (directional errors) among all rejected null hypotheses, introduced by Benjamini & Yekutieli (2005).
In this article, we utilize the same historic NDVI time series for East Africa from 1982 to 2006. Since real-time monitoring for change is not within scope, we focused on improving the methodologies previously used to identify significant changes in land cover in the region. We do this by first framing the research question as an NP-hard temporal multi-objective assignment problem. Using heuristics to solve this problem, we find improved sub-regions relative to the previous arbitrarily chosen square grids. This approach allows us to adequately capture the specific data structure and answer questions in the present context. Second, we reapply the multiple testing procedures in Clements et al. (2014) and demonstrate that the testing procedure becomes more powerful while still maintaining control of an error rate, the mdFDR. In summary, our methods aim to incorporate local spatial dependencies using a multidimensional assignment problem formulation to improve sub-region formation, which in turn improves the multiple testing results.
We organize the paper as follows. In the next section, we give a review of the literature, followed by a detailed description of the historical data set. We then describe the temporal assignment problem formulation used to create more homogeneous sub-regions and explain the heuristic procedure using dynamic programming. Next, we apply the multiple testing procedures to the improved sub-regions. Finally, we reveal the results of the model implementation, followed by a discussion, conclusions, and final remarks.

Multiple Testing Overview
To control false vegetation trend detections, multiple testing procedures can be employed. An overview of multiple testing notation and procedures is given next. When testing a single null hypothesis against a two-sided alternative, two types of error can occur when a directional decision is made following rejection of the null hypothesis. These are the Type I error and the Type III (or directional) error. The Type I error occurs when the null hypothesis is falsely rejected, while the Type III error occurs when the null hypothesis is correctly rejected but a wrong directional decision is made about the alternative.
Consider testing n hypotheses simultaneously, such as testing for trend changes in n pixels over the East African region. One of the most commonly used measures of overall Type I error is the Familywise Error Rate (FWER), the probability of making one or more Type I errors. In other words, out of n simultaneously tested hypotheses, where V is the number of Type I errors made out of n decisions (recall: V is an unknown quantity), FWER = Prob(V > 0). In the case of multiple hypothesis testing, the FWER should be controlled at a desired overall level, called α. The Bonferroni procedure is the most popular method to control the FWER, but there are other techniques, such as those in Holland & Copenhaver (1987), Hochberg & Tamhane (1987), Šidák (1967), Holm (1979), Hochberg (1988), Sarkar (1998), and Sarkar & Chang (1997).
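As a concrete illustration (with made-up p-values, not from the East Africa data), the Bonferroni adjustment rejects each hypothesis whose p-value falls at or below α/n:

```python
# Bonferroni FWER control: reject H_i when p_i <= alpha / n.
# Illustrative p-values only; not from the East Africa data set.
def bonferroni_reject(pvals, alpha=0.05):
    n = len(pvals)
    return [p <= alpha / n for p in pvals]

pvals = [0.001, 0.012, 0.030, 0.200]
print(bonferroni_reject(pvals))  # threshold is 0.05/4 = 0.0125
```

Only the first two p-values survive the adjusted threshold of 0.0125, illustrating how Bonferroni trades power for strict FWER control.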
The False Discovery Rate (FDR), proposed by Benjamini and Hochberg (1995), is the second most common measure of Type I errors. The FDR is the expected proportion of Type I errors among all the rejected null hypotheses; if there are no rejected hypotheses, the FDR is defined to be zero. In terms of Table 1, FDR = E[V / max(R, 1)]. Comparatively, the FDR is less conservative than the FWER, meaning FWER control ensures FDR control. However, a multiple testing procedure with FDR control will not necessarily maintain control of the FWER. The FDR is a widely accepted and utilized notion of Type I error in large-scale multiple testing investigations. The literature proposes several methods to control the FDR, including Benjamini and Hochberg (1995), Benjamini and Yekutieli (2001), Sarkar (2002), Blanchard and Roquain (2009), Storey, Taylor, and Siegmund (2004), and Benjamini, Krieger, and Yekutieli (2006).
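The Benjamini-Hochberg (BH) step-up procedure used later in this paper can be sketched as follows; the p-values are illustrative only:

```python
# Benjamini-Hochberg step-up procedure for FDR control.
# Sort the p-values, find the largest rank k with p_(k) <= k*alpha/n,
# and reject every hypothesis whose p-value is at or below that cutoff.
def bh_reject(pvals, alpha=0.05):
    n = len(pvals)
    order = sorted(range(n), key=lambda i: pvals[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * alpha / n:
            k_max = rank
    rejected = [False] * n
    for i in order[:k_max]:
        rejected[i] = True
    return rejected

pvals = [0.001, 0.008, 0.039, 0.041, 0.60]
print(bh_reject(pvals, alpha=0.05))
```

With n = 5 and α = 0.05, the step-up thresholds are 0.01, 0.02, 0.03, 0.04, 0.05, so the two smallest p-values are rejected.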
Often, it becomes essential for researchers to determine the direction of significance, rather than significance alone, when testing multiple null hypotheses against two-sided alternatives. In other words, for each test, researchers have to decide whether or not the null hypothesis should be rejected and, if rejected, determine the direction of the alternative. Typically, this direction is determined based on the test statistic falling in the right- or left-hand side of the rejection region. Such decisions can potentially lead to one of two types of error for each test resulting in rejection of the null hypothesis: the Type I error if the null hypothesis is true, or the directional error, also known as the Type III error, if the null hypothesis is not true but the direction of the alternative is falsely declared (i.e., a rejection of a false null using a two-sided alternative, but where the sign of the true parameter, say θ, is opposite to that of its estimate θ̂).
Two variants for dealing with Type I and Type III errors have been introduced in the literature. The first is the pure directional FDR (dFDR), the expected proportion of directional errors among rejected hypotheses. The second is the mixed directional FDR (mdFDR), the expected proportion of Type I and Type III errors among rejected hypotheses, introduced in an FDR framework by Benjamini et al. (1993). Since then, other methods to control directional errors have been introduced, including Benjamini and Yekutieli (2005), Benjamini and Hochberg (2000), Shaffer (2002), Williams et al. (1999), Guo et al. (2009), and Sarkar and Zhou (2008).
Controlling both false discoveries (V, from Table 1) and directional false discoveries (U, from Table 1) is important in this application. For instance, when declaring a particular 8,000 m × 8,000 m grid of land as 'significantly' changing in terms of vegetation, a Type I error is made if the area is not truly changing, and a Type III error is made if the area is truly changing but in the opposite direction of what is determined from the data. When such decisions are made simultaneously based on testing multiple hypotheses, one should adjust for multiplicity and control an overall measure of Type I and Type III errors. Without such a multiplicity adjustment, more Type I and Type III errors can occur than the desired α level. It is particularly important to avoid these errors as much as possible in the present application: land use managers, governments, and local farmers are looking to relocate East African populations of people, livestock, and crops to areas of promising vegetation changes and to avoid regions with decreasing changes.
Since these migrations can be risky and costly, a careful consideration of the multiplicity issue seems essential when making declarations of significant vegetation changes.
In this article, p-values generated using the monotonic trend test in Brillinger (1989) are computed for each site (8,000 m × 8,000 m grid of land) and provide evidence of vegetation change occurring over the years; the smaller the p-value, the higher the evidence of a significant vegetation change. For each site, a decision must be made regarding the significance of vegetation change that might have occurred over the years at that site and, if vegetation change is found significant, the direction in which this change has taken place. This must be done simultaneously for all sites (≈50,000) in the East African region in a multiple testing framework designed to ensure control over a meaningful combined measure of statistical Type I and Type III errors.
In this paper, we first frame the research question as a heuristic multi-objective temporal assignment problem, in which better sub-regions are created than the arbitrarily chosen square grids in Clements et al. (2014).
By using temporal assignments to create sub-regions, we will demonstrate that the testing procedure becomes more powerful. We also provide a theoretical proof that the mdFDR is still controlled under sub-region independence. There is a wealth of research on assignment problems and on specialized assignment problems that display complicating constraints. Though the classical two-dimensional assignment problem is solvable in polynomial time, once the number of dimensions reaches three, as in the formulation presented in this paper, this is no longer the case.

Temporal Assignment Problem Overview
The multidimensional assignment problem was introduced by Pierskalla (1968), and a bibliography of multidimensional assignment problems was prepared by Gilbert & Hofstra (1988). Miori (2008, 2011, 2014) used assignment problems to model truckload routing problems and the Pollyanna gift exchange problem. Scheduling medical residents with a temporal component was addressed by Franz & Miller (1993). Bandelt et al. (1994, 2004) addressed multidimensional assignment problems with decomposable costs. The three-dimensional assignment problem was applied to teaching schedules by Frieze & Yadegar (1981) and Balas & Saltman (1991).
Multidimensional approximation was applied to capacity expansion problems by Truong & Roundy (2011).
Lagrangian relaxation was applied to a multidimensional assignment problem arising from multi-target tracking by Poore & Rijavec (1993). Multi-target tracking data was also addressed by Robertson (2001).
Approximations to the multidimensional assignment problem were generated by Kuroki & Matsui (2007), Gutin et al. (2008), Krokhmal et al. (2007), and Karapetyan & Gutin (2011). The multi-objective assignment problem, seeking solutions to the assignment problem in the face of additional objectives using efficient sets, was posed by White (1984). A weighting function approach has also been applied to multi-objective (multicriteria) problems with conflicting objectives by Phillips (1987).

Land Use Optimization Overview
The most basic methods in land use optimization involve limited enumeration of alternatives and the development of metrics to assess these alternatives directly. Landscape metrics addressing various land use goals were used by Kuchma et al. (2013) to evaluate enumerated options for land use. A similar approach was proposed by Wang & Guldmann (2015) to mitigate seismic damage in Taichung, Taiwan.

Data Description
East Africa spans a wide variety of climate types and precipitation regimes, which are reflected in its vegetation cover. To capture this, satellite imagery was collected over a sub-Saharan region of East Africa that includes five countries in their entirety (Kenya, Uganda, Tanzania, Burundi, and Rwanda) and portions of seven others (Somalia, Ethiopia, South Sudan, Democratic Republic of Congo, Malawi, Mozambique, and Zimbabwe). This roughly 'rectangular' region extends from 27.8°E to 42.0°E longitude and 15.0°S to 6.2°N latitude. Also included in the region are several East African Great Lakes, such as Lake Victoria, Lake Malawi, and Lake Tanganyika. We consolidated all negative NDVI values to zero, as is commonly done in vegetation monitoring, and re-scaled the remaining values by 1,000. Negative NDVI values indicate non-vegetation areas, and so they are of no use in our statistical analysis. Prior to the analysis, we examined the data for quality assurance and eliminated a small number of pixels that were found to have several consecutive years with identical data values, which may be due to data entry errors or machine malfunction.
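The preprocessing just described can be sketched as follows; the series values and the run-length threshold used to flag a suspect pixel are assumptions for illustration, while the clamp-to-zero and re-scale-by-1,000 steps come from the text:

```python
# Preprocessing sketch: clamp negative NDVI to zero, re-scale by 1,000,
# and flag pixels whose series contains a long run of identical values
# (a possible data-entry or sensor error). The run-length threshold of 4
# is an assumption; the paper says only "several consecutive" values.
def preprocess_series(ndvi, repeat_limit=4):
    cleaned = [max(v, 0.0) * 1000 for v in ndvi]
    run, longest = 1, 1
    for a, b in zip(cleaned, cleaned[1:]):
        run = run + 1 if a == b else 1
        longest = max(longest, run)
    suspect = longest >= repeat_limit
    return cleaned, suspect

series = [-0.05, 0.21, 0.21, 0.21, 0.21, 0.34]
cleaned, suspect = preprocess_series(series)
print(cleaned[0], suspect)  # negative value clamped to 0.0; run of 4 flagged
```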
When this data was first examined in Vrieling, de Beurs and Brown (2008), the percentage of pixels with the trend test p-value less than α = 0.10 was reported separately for positive and negative slopes. The reported results indicate that much of the region has 'significant' vegetation change. For example, the cumulative NDVI indicator detected 44.2% of sites with p-values less than 0.10. However, this result fails to address the important statistical issue of multiplicity when making these claims about significant vegetation changes and their directions simultaneously for all the regions based on hypothesis testing. Later, Clements et al. (2014) addressed the multiplicity issue by proposing a 3-stage multiple testing procedure to control the mixed directional False Discovery Rate (mdFDR), but did so on sub-regions of East Africa that were not optimally formed.
The associated csv file for this analysis contains the information generated from Clements et al. (2014), which was the initial starting point for this analysis. It contains the following fields:
• site: Consecutive ID number, acting as a unique identifier
• xcoord: pixel longitude
• ycoord: pixel latitude
• ndvi.avg: Overall pixel average NDVI from 1982 to 2006 (observations taken twice monthly)
• pval: Resulting p-value from the Brillinger trend test (Brillinger, 1989)
• slp: Resulting slope from the Brillinger trend test (Brillinger, 1989)
• block: Block number; the initial assignment was arbitrary
Using the algorithm below, followed by the multiple testing procedure, users may generate the revised and improved block assignments.

Assignment Problem Formulation
We propose an assignment formulation to this problem, using these analysis results, with the goal of an improved solution. The object of the geographic assignment problem is to map each pixel within the satellite images to an appropriate block based upon a target value for each block. The block target values represent equal-size ranges within the overall range of the objective function values. The objective function for the pixel assignment is the sum of absolute differences between the pixel NDVI and the block target NDVI. The number of blocks is set objectively and may be reset for each assignment problem solution generated. Note that some pixels are formed entirely of water; these pixels have been assigned arbitrarily high NDVI values to effectively eliminate them from consideration in the block assignments. A 'water block' with an arbitrarily high target value ensures that all of these pixels may be assigned to blocks.
The objective of the pixel assignment problem is to minimize the NDVI difference function. Let m be the number of pixels, n the number of blocks, and T the number of time periods. The decision variable x is a binary variable that represents the assignment, or lack of assignment, of pixel i to block j at time k. The constraints ensure that each pixel is assigned to a block during each period of time. The formulation is given in Eq. (1)–(3). Because the binary decision variables utilize three indices, the problem is NP-hard. We therefore propose and employ a heuristic approach that relies heavily on dynamic programming.
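Equations (1)–(3) did not survive in the source text. Based on the verbal description above (minimize the absolute difference between pixel NDVI and block target NDVI, with every pixel assigned to exactly one block in each time period), one plausible reconstruction is the following, where NDVI_{ik} is the NDVI of pixel i at time k and τ_{jk} (our notation) is the target value of block j at time k:

```latex
\begin{aligned}
\min \quad & \sum_{i=1}^{m}\sum_{j=1}^{n}\sum_{k=1}^{T}
  \left| \mathrm{NDVI}_{ik} - \tau_{jk} \right| \, x_{ijk} && (1)\\
\text{s.t.} \quad & \sum_{j=1}^{n} x_{ijk} = 1,
  \qquad i = 1,\dots,m,\; k = 1,\dots,T && (2)\\
& x_{ijk} \in \{0,1\}, \qquad \forall\, i, j, k && (3)
\end{aligned}
```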

Lagrangian Relaxation
Restatement of the pixel assignment problem as a Markov Process will facilitate alternative solution methodologies.
We present a Lagrangian relaxation of the formulation and introduce a Lagrangian multiplier φ for the single constraint to be relaxed in each time period k = 1, ⋯, T. We include a simplifying assumption that the penalty is constant over all time periods and is denoted φ. The revised formulation is presented in Eq. (4)–(5).
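Equations (4)–(5) also did not survive extraction. Under the relaxation just described, with the per-period assignment constraints moved into the objective under a common multiplier φ (τ_{jk} is our notation for the target value of block j at time k), a plausible reconstruction is:

```latex
\begin{aligned}
\min \quad & \sum_{i=1}^{m}\sum_{j=1}^{n}\sum_{k=1}^{T}
  \left| \mathrm{NDVI}_{ik} - \tau_{jk} \right| x_{ijk}
  \;+\; \varphi \sum_{i=1}^{m}\sum_{k=1}^{T}
  \Bigl(1 - \sum_{j=1}^{n} x_{ijk}\Bigr) && (4)\\
\text{s.t.} \quad & x_{ijk} \in \{0,1\}, \qquad \forall\, i, j, k && (5)
\end{aligned}
```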

Dynamic Programming Formulation
The pixel assignment decisions may be made in stages, and while the outcome of each decision is not fully predictable, it can be observed before the next decision is made. We begin the dynamic programming formulation by organizing the problem into a tree structure (Fig. 2) reflecting pixels and levels (time increments). Each level of the tree corresponds to a time increment, beginning with time 0, which represents the first satellite images retrieved within the data set, and ending at the final images at time T-1; the pixels in each level are numbered from 1 to m. The tree provides a discrete-time dynamic system. To calculate the expected cost-to-go, we must also identify and calculate transition probabilities. In doing so, we consider only the current level (time period). The Markov property (6) allows us to omit consideration of the probabilities of the path leading to the current level. The tree may now be viewed as a finite nonhomogeneous Markov process with transition probability matrix P representing transitions at any level.

P(X_{k+1} | X_k, X_{k-1}, …, X_0) = P(X_{k+1} | X_k)    (6)
The objective of the dynamic programming formulation is the minimization of the sum of the cost at the current stage and the cost-to-go (the best cost to be expected from future stages). The notation required for the formulation follows. With tens of thousands of pixels and over 100 blocks, the transition probabilities would have a very small order of magnitude and a high expected level of inaccuracy, resulting in an inability to detect meaningful differences. We therefore present a heuristic, rooted in dynamic programming principles, to render an efficient and useful solution to the pixel assignment problem.
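To illustrate the cost-to-go recursion underlying this formulation, a toy backward induction on a two-state chain is sketched below; the states, costs, and transition probabilities are invented for illustration and hint at why an exact solution over tens of thousands of pixels and over 100 blocks is impractical:

```python
# Toy backward induction for the cost-to-go recursion
#   J_k(s) = min_u [ g(s,u) + sum_{s'} P(s' | s,u) * J_{k+1}(s') ]
# Two states, two actions, three stages; all numbers illustrative.
def backward_induction(T, states, actions, cost, trans):
    J = {s: 0.0 for s in states}  # terminal cost-to-go
    for _ in range(T):            # stages T-1, ..., 0
        J = {s: min(cost[s][a] + sum(p * J[s2] for s2, p in trans[s][a].items())
                    for a in actions)
             for s in states}
    return J

states, actions = ["wet", "dry"], ["keep", "move"]
cost = {"wet": {"keep": 1.0, "move": 2.0}, "dry": {"keep": 3.0, "move": 2.5}}
trans = {"wet": {"keep": {"wet": 0.9, "dry": 0.1}, "move": {"wet": 0.5, "dry": 0.5}},
         "dry": {"keep": {"dry": 1.0}, "move": {"wet": 0.7, "dry": 0.3}}}
J = backward_induction(3, states, actions, cost, trans)
print(J["wet"] < J["dry"])  # the wet state has the lower expected cost
```

Each stage requires a minimization over all actions for every state; with a state space the size of pixel-to-block assignments, the table J becomes astronomically large, motivating the heuristic that follows.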

Recursive Heuristic Procedure
Due to the original assignment problem being NP-hard, and the dynamic programming approach resulting in extreme computational and structural complexity, we introduce a heuristic method that leverages knowledge gained in the assignment and dynamic programming approaches. This heuristic also leverages the previous research completed in controlling the mdFDR.
The heuristic procedure was initialized with the 150 blocks used in Clements et al. (2014) and 56,355 total pixels, and utilized the previously calculated slopes and resulting p-values from the monotonic trend tests. Rather than assigning the pixels to blocks over the duration of the 25-year span of the data collection as the assignment formulation would, this approach focused on assignment at the final observations in the 25th year; the use of slope and p-value allowed the approach to reflect the trends that occurred over time. This same approach could be used at any time during the study, reflecting all previous data.
The heuristic performance metric, like the objective function in the pixel assignment problem, required the calculation of block values corresponding to the pixel values. The metric leverages the initial random blocks by including the block average NDVI, the block average slope, the block average p-value, and the slope change indicator variable. Notation is introduced in Table 3, followed by the formulation of the performance metric.
Let f = pixel number, let g = block number, and let x_{fg} = 1 if pixel f is assigned to block g, and x_{fg} = 0 if pixel f is not assigned to block g.
The minimum value of the performance metric in Eq. (13) determines the highest quality heuristic solution. Pixels whose current assignment leaves them on the border between blocks are evaluated. The metric is calculated for their incumbent (current) assignment and their prospective assignment(s). The pixel is then assigned to the block yielding the lowest value of the metric. As pixels are reassigned, newly exposed border pixels are evaluated in the same fashion. This procedure continues until all border pixels belong to the block with the best fit.
The dynamic programming concept of forward and backward passes has been adapted for the heuristic to compensate for directional bias in the results. In this way, all border pixel assignments may be evaluated in all directions. Four starting points and starting directions are identified in Fig. 3.
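A minimal sketch of the border-reassignment idea, assuming a simplified metric that uses only the distance to each block's mean NDVI (the paper's Eq. (13) also weighs slope and p-value), on a hypothetical one-dimensional strip of pixels:

```python
# Border-reassignment sketch on a 1-D strip (hypothetical NDVI values).
# A border pixel moves to a neighboring block whose mean NDVI it is
# closer to; sweeps repeat until no border pixel moves. Block means are
# refreshed once per sweep, a simplification of the paper's procedure.
def mean(vals):
    return sum(vals) / len(vals)

def reassign(ndvi, block):
    changed = True
    while changed:
        changed = False
        means = {b: mean([v for v, g in zip(ndvi, block) if g == b])
                 for b in set(block)}
        for i in range(len(ndvi)):
            neighbors = {block[j] for j in (i - 1, i + 1) if 0 <= j < len(ndvi)}
            for b in neighbors:
                if abs(ndvi[i] - means[b]) < abs(ndvi[i] - means[block[i]]):
                    block[i] = b
                    changed = True
    return block

ndvi = [0.8, 0.8, 0.7, 0.2, 0.1, 0.1]
print(reassign(ndvi, [0, 0, 1, 1, 1, 1]))  # the 0.7 pixel joins block 0
```

Here the high-NDVI pixel initially assigned to the low-NDVI block migrates across the border, producing two more homogeneous blocks.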

Reassignment Model and Implementation
An approach inspired by dynamic programming was utilized to find the best solution to the heuristic problem based on weight factors that varied between 0 and 1, under the condition that ∑_d w_d = 1. Table 4 shows a subset of the factor weight combinations that were examined. As seen in Table 4, selecting the solution with factor weights w_1 = 1, w_2 = 0, w_3 = 0, and w_4 = 0 generates the smallest value of the performance metric in Eq. (13). Since factor 1 represents the average NDVI value at the final observation, this solution suggests performing pixel reassignment based solely on NDVI information, with no weight applied to factors such as slope and p-value. The average score of the initial arbitrary square-grid solution (calculated to be 0.1339) was compared to that of the proposed reassignment solution (calculated to be 0.0998), yielding an improvement of 25.5%.
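The weight search described above can be sketched as a grid enumeration over weight vectors summing to one; the factor scores below are a toy stand-in for the metric in Eq. (13), chosen so that factor 1 (NDVI) dominates as in Table 4:

```python
# Enumerate weight combinations (w1..w4) on a 0.25 grid with sum(w) == 1
# and keep the combination minimizing a stand-in performance metric.
from itertools import product

def best_weights(metric, step=0.25):
    ticks = [i * step for i in range(int(1 / step) + 1)]
    best, best_val = None, float("inf")
    for w in product(ticks, repeat=4):
        if abs(sum(w) - 1.0) > 1e-9:  # keep only weights summing to one
            continue
        val = metric(w)
        if val < best_val:
            best, best_val = w, val
    return best, best_val

# Toy factor scores; factor 1 (NDVI) is cheapest, mirroring Table 4.
scores = (0.0998, 0.21, 0.35, 0.47)
w, val = best_weights(lambda w: sum(wi * s for wi, s in zip(w, scores)))
print(w)  # all weight lands on factor 1
```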
[Table 4 near here] The spatial map in Fig. 5 visualizes the initial arbitrary block assignment using square grids (left) compared to the final solution (right) that gave the minimum value of the performance metric in Eq. (13). The contrast between the maps reveals how the solution to the pixel assignment problem created natural-looking clusters of differing sizes. For example, along some coastline areas, clusters are long and narrow. This is intuitive because NDVI values tend to be similar along the coast, where many areas are comprised of sand and rock. In other areas, clusters became circular and cover vast areas of known deserts in the East African region. Small clusters also exist in the solution and, after investigating, we found that many of these clusters comprise cities and urban areas that have little vegetation; it is logical that such pixels should be reassigned into the same cluster. An unbiased validation of the reassignment solution can be obtained by calculating the average coefficient of variation for the final pixel assignment and comparing it to that of the initial square block assignment. The coefficient of variation (CV) is a unit-less measure of spread that describes the amount of variability relative to the mean; it is defined as the ratio of the standard deviation to the mean. Smaller values of CV indicate higher homogeneity of the clusters. The average of the clusters' coefficients of variation for our final pixel assignment solution is 11.762. This is a 27.4% decrease compared to the average coefficient of variation for the original square blocks, which was 16.205. This is a statistically significant difference in CV averages (p = 0.000529), providing further evidence that the pixel reassignment solution was able to increase the level of homogeneity within clusters. Having homogeneous clusters is important when making large-scale decisions about regions in East Africa that have experienced significant vegetation trend changes.
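The CV-based validation can be sketched as follows; the cluster values are hypothetical, not the paper's 11.762 and 16.205 figures:

```python
# Average coefficient of variation (CV = std/mean) across clusters;
# a smaller average CV means more homogeneous clusters.
from statistics import mean, pstdev

def avg_cv(clusters):
    cvs = [pstdev(c) / mean(c) for c in clusters if mean(c) > 0]
    return mean(cvs)

compact = [[10, 11, 10], [5, 5, 6]]   # tight, homogeneous clusters
spread = [[2, 11, 18], [1, 5, 10]]    # loose, heterogeneous clusters
print(avg_cv(compact) < avg_cv(spread))  # True
```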

Multiple Testing Implementation and Results
Now we can assume that the pixels in the East African region are divided into homogeneous subregions using temporal assignments, as described above. Next, we summarize and apply the multiple testing procedure given in Clements et al. (2014).
For notation, let m be the number of such subregions and n_i be the number of pixels/locations in the i-th subregion.
P-values were calculated using the two-sided monotonic trend test of Brillinger (1989) at each location. Specifically, we denote β_ij as the monotonic trend parameter, as defined in the Brillinger test, for the i-th subregion and j-th location, where i = 1, 2, …, m and j = 1, 2, …, n_i. We also let T_ij and P_ij be, respectively, the test statistic and the corresponding p-value for testing the null hypothesis H_ij: β_ij = 0 against its two-sided alternative β_ij ≠ 0.
We apply the suggestion of Clements et al. (2014) of using a Bonferroni correction within each subregion, which combines the p-values by calculating P_i = n_i min_{1≤j≤n_i} P_ij. With H_ij representing the null hypothesis corresponding to P_ij, consider H_i = ⋂_{j=1}^{n_i} H_ij as the null hypothesis corresponding to the i-th subregion. We will test the H_i's against their respective two-sided alternatives and detect the direction of the alternatives for the rejected hypotheses.
Specifically, we apply the procedure using α = 0.05 in the following three steps: Multiple Testing Procedure Applied to Homogeneous Sub-regions:
1) Apply the BH method to test H_i, i = 1, 2, …, m, based on their respective p-values P_1, P_2, …, P_m, as follows: consider the increasingly ordered versions of the P_i's, P_(1) ≤ P_(2) ≤ … ≤ P_(m). Find R = max{i : P_(i) ≤ iα/m}. Reject the H_i's for which the p-values are less than or equal to P_(R), provided this maximum exists; otherwise, accept all H_i.
2) For each subregion i whose H_i is rejected in step 1, reject the individual H_ij's for which P_ij ≤ Rα/(m n_i), where R is the number of subregions rejected in step 1.
3) For each H_ij rejected in step 2, decide the direction of the monotonic trend to be the same as the sign of the corresponding test statistic T_ij (equivalently, the sign of the estimated slope).
Steps 1 and 2 identify, first, the subregions and, second, the locations with significant vegetation changes. The third step allows one to make a more detailed analysis by identifying the directions in which these significant changes have occurred. Importantly, this procedure controls the mdFDR at level α if the subregions are independent. A mathematical proof of this is given in the Appendix.
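The three steps above can be sketched in code as follows; the subregion p-values and slopes are hypothetical, not the East Africa results:

```python
# Sketch of the three-step procedure: combine p-values within each
# subregion (Bonferroni: P_i = n_i * min_j P_ij), run BH across the m
# combined p-values, follow up within rejected subregions at the
# adjusted level R*alpha/(m*n_i), then take direction from the slope.
def three_step(subregions, alpha=0.05):
    m = len(subregions)
    combined = [min(1.0, len(ps) * min(p for p, _ in ps)) for ps in subregions]
    order = sorted(range(m), key=lambda i: combined[i])
    R = 0
    for rank, i in enumerate(order, start=1):  # BH step-up across subregions
        if combined[i] <= rank * alpha / m:
            R = rank
    decisions = {}
    for i in order[:R]:
        for j, (p, slope) in enumerate(subregions[i]):
            if p <= R * alpha / (m * len(subregions[i])):
                decisions[(i, j)] = "up" if slope > 0 else "down"
    return decisions

subs = [[(0.001, 0.8), (0.2, -0.1)],   # strong increasing site
        [(0.0005, -0.5), (0.3, 0.2)],  # strong decreasing site
        [(0.4, 0.1), (0.6, -0.2)]]     # nothing significant
print(three_step(subs))
```

Each detected location carries both a rejection and a direction, which is exactly the pair of decisions whose errors the mdFDR jointly measures.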
The results of implementing this procedure on our homogeneous subregions are shown in Fig. 6.
Vegetative analysis in this region is of interest for a variety of reasons, including the importance of the region for global biodiversity and the vulnerability of the region to climate change, deforestation of the Congo, urban development, civil conflict, and agricultural practices.

Figure 1. The study area, as indicated by the box.

Figure 2. General Tree Structure.

The notation in Table 3 includes:
• Pixel i p-value over time: i = 1, ⋯, m
• p-value range for block g
• Weight for scoring factor d: d = 1, ⋯, 4
Development of the performance metric required definition of the block average values for NDVI, slope, and p-value. Fig. 4 shows the four passes to be completed for the first starting direction (upper left-hand corner). The first two passes are the forward-direction evaluation and the second two passes are the backward-direction evaluation. These same four passes are adapted for each starting point/direction, with the first pass always corresponding to the starting position.

Figure 3. Starting Directions for Evaluation of Pixel Assignments.

Figure 5. Initial arbitrary block assignment (left) compared to final solution (right).

Fig. 6. Sites with a significant increasing change in vegetation are plotted in green. Sites with a significant decreasing vegetation change are plotted in red. Nonsignificant sites are represented by white.

Appendix
… of true null hypotheses among the total null hypotheses in the subregion. We now work with the dFDR. Let sign(β_ij) represent the true sign of Brillinger's monotonic trend parameter for the j-th location in the i-th subregion, and let T_ij be the corresponding test statistic. U can then be expressed in terms of Φ, the cumulative distribution function of the standard normal. Therefore, assuming without any loss of generality that β_ij > 0 when sign(β_ij) = 1, we have, for such (i, j):

Table 1: Multiple testing outcomes from testing n hypotheses
Table 1 gives the various outcomes of these tests, where H_i: θ_i = θ_{i0} is the null hypothesis and θ_i ≠ θ_{i0} is the two-sided alternative, for i = 1, 2, …, n. Of the quantities in Table 1, only n, A, and R (where R = R⁺ + R⁻) are known after applying a particular testing procedure. The numbers of Type I, Type II, and Type III errors are V = V⁺ + V⁻, T = T⁺ + T⁻, and U = U⁺ + U⁻, respectively. All three quantities are unknown but desirably small. Most multiple testing procedures focus on controlling V in some capacity. In this paper, we utilize a procedure that controls both V and U.