1. Introduction
Since the 1970s, advancements in the computer models and associated tools and techniques used to simulate groundwater systems have enhanced hydrogeologists’ understanding of the uncertainty in important simulated outcomes and, ultimately, of the risks of decisions related to water supply planning, groundwater remediation, and other commercial or academic pursuits. These models, tools, and techniques collectively make up an important facet of water resource management decision support, one that aspires to simulate groundwater flow over space and time along with the associated uncertainty in the simulation results. Appropriately representing the salient aspects of a complex natural groundwater system with a necessarily simple numerical representation, and then combining that simplified representation with formal inverse and data assimilation techniques to extract meaningful information from available state observations, has been one of the drivers behind the advancements in predictive groundwater modeling. The transformation from data to decisions has therefore been the underlying motivation for research in the area of decision support systems. In this regard, Ghislain de Marsily’s contribution to the advancement of inverse methods, a key component of decision support systems, cannot be overstated.
Prior to de Marsily’s doctoral research, published in French in 1978, modelers had resorted to very simplistic representations of the hydrogeologic data obtained from field investigations because techniques to appropriately leverage the information embedded in those data were lacking. These simplistic approaches typically relied on zones of pre-determined shape and large spatial extent, which were assigned constant values for hydrogeologic properties such as transmissivity. The zones could not represent the expected (and necessarily stochastic) patterns of spatial heterogeneity often observed in hydrogeologic system properties at many scales. Therefore, the solutions from these early models were plagued by large and unknown errors originating from the blocky, unrealistic representation of hydrogeologic properties used in the models and propagated to the decisions these models were meant to inform. The earliest attempts to use formal optimization methods to invert the solution of the groundwater flow equation were thwarted when it was discovered that errors in the hydraulic head measurements, and moreover in the hydraulic gradient, could lead to instabilities in the inverse solution [Emsellem and De Marsily 1971; Kleinecke 1971; De Marsily et al. 2000].
de Marsily’s doctoral research presented an alternative inverse method which overcame these challenges. He proposed coupling the inversion with a then-new field of research referred to as geostatistics, or spatial statistics, to represent the groundwater system property fields as continuously varying surfaces, thereby overcoming the known limitations of zonation-based schemes. He parameterized the inversion using an indirect inversion technique, which enabled him to overcome the numerical instabilities caused by errors in the measured head data. Lastly, he adopted the recently published adjoint technique [Chavent 1974] to efficiently compute sensitivities of the computed head field to changes in the parameter values, enabling him to quickly obtain an inverse solution in high dimensions. de Marsily called his technique the “Pilot Point Method” (PPM) because of the way in which he used specific locations, which he called “pilot points”, to impose changes in the parameter field. The pilot points would be added to the geostatistical kriging equations and their values assigned optimally to change or “warp” the modeled transmissivity field so as to reduce the error between the simulated groundwater levels and their measured counterparts. The PPM fundamentally changed the way groundwater modelers approached the inverse problem and is still ubiquitous in groundwater modeling today.
Herein, we provide a brief overview of the theory and history of the PPM, focusing on the important contributions of de Marsily. We finish with brief remarks on some of the more recent advances in the PPM and how it continues to serve the decision-support groundwater modeling community.
2. Concepts and theory of the PPM
In the PPM, the Bayesian viewpoint of the groundwater model inverse problem is adopted. Originally, the PPM focused on the estimation of the modeled transmissivity field. The log of measured transmissivity is used so that the statistical distribution of transmissivity (considered to be log-normal; [Law 1944; Sánchez-Vila et al. 1996]) becomes Gaussian. The unknown log transmissivities between measurements are viewed as variables of a multi-Gaussian random function (RF). Log transmissivity at a point in space is considered a Gaussian random variable (RV) whose distribution may or may not be statistically dependent on, or correlated to, its neighboring transmissivities. A transmissivity measurement at location (x,y,z) is conceptually considered a sample from the RV distribution at point (x,y,z). As presented by Matheron and Blondel [1962] and Matheron [1963], kriging provides the best unbiased estimate of the RV at any point in space given observed data and the correlation between the data.
In de Marsily’s work, the RV was log transmissivity and the estimation points were the grid-block centroids of his finite-difference groundwater flow model. He employed ordinary kriging (OK) to estimate the initial log transmissivity field. OK provides estimates of an RV at any point (or area) in space through the use of the measurements of the RV already available and their covariance [see Chilès and Desassis 2018]. The OK estimator is given by:
$$Z^{*}(x_0) = \sum_{i=1}^{n} \lambda_i \, Z(x_i) \qquad (1)$$

where $Z^{*}(x_0)$ is the estimated log transmissivity at location $x_0$, the $Z(x_i)$ are the $n$ available log transmissivity measurements, and the weights $\lambda_i$ are obtained, together with a Lagrange multiplier $\mu$, from the ordinary kriging system

$$\sum_{j=1}^{n} \lambda_j \, C(x_i, x_j) + \mu = C(x_i, x_0), \quad i = 1, \ldots, n; \qquad \sum_{j=1}^{n} \lambda_j = 1 \qquad (2)$$

in which $C(\cdot,\cdot)$ is the covariance of the log transmissivity RF.
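To make Equations (1) and (2) concrete, the short Python sketch below (not part of de Marsily’s original work) solves the ordinary kriging system for the weights and Lagrange multiplier at a single estimation point; the exponential covariance model, its range and sill, and the data values are purely illustrative assumptions.

```python
import numpy as np

def exp_cov(h, sill=1.0, a=500.0):
    """Illustrative exponential covariance model, C(h) = sill * exp(-h / a)."""
    return sill * np.exp(-h / a)

def ok_weights(data_xy, x0, cov=exp_cov):
    """Solve the ordinary kriging system (Equation (2)) for the weights lambda_i
    and the Lagrange multiplier mu at estimation point x0."""
    n = data_xy.shape[0]
    d_dd = np.linalg.norm(data_xy[:, None, :] - data_xy[None, :, :], axis=2)
    d_d0 = np.linalg.norm(data_xy - x0, axis=1)
    A = np.ones((n + 1, n + 1))      # last row/column carry the unbiasedness constraint
    A[:n, :n] = cov(d_dd)
    A[n, n] = 0.0
    b = np.append(cov(d_d0), 1.0)
    sol = np.linalg.solve(A, b)
    return sol[:n], sol[n]

# three hypothetical log transmissivity measurements and one estimation point
data_xy = np.array([[0.0, 0.0], [800.0, 200.0], [300.0, 900.0]])
log_T = np.array([-3.1, -2.4, -2.8])
lam, mu = ok_weights(data_xy, np.array([400.0, 400.0]))
z_star = lam @ log_T                 # Equation (1): the kriged estimate at x0
```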
The solution of the inverse problem using the PPM relies upon an expanded form of (1):
$$Z^{*}(x_0) = \sum_{i=1}^{n} \lambda_i \, Z(x_i) + \sum_{p=1}^{n_p} \lambda_p \, Z(x_p) \qquad (3)$$

where the $x_p$ are the $n_p$ pilot-point locations and the $Z(x_p)$ are adjustable pilot-point log transmissivities that are treated as additional conditioning data in the kriging system (2). The pilot-point values are assigned by minimizing a weighted least-squares measure of the misfit between simulated and observed heads,

$$\phi = \sum_{k=1}^{N} w_k \left( h_k^{\mathrm{obs}} - h_k^{\mathrm{sim}} \right)^{2} \qquad (4)$$

where the $w_k$ are observation weights and $N$ is the number of head observations.
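The following continuation of the sketch above (reusing its ok_weights, data_xy, and log_T) illustrates Equation (3): pilot points are appended to the conditioning data set, so that the kriged log transmissivity field at every grid-block centroid is “warped” whenever the optimizer changes the pilot-point values. The locations and values are hypothetical.

```python
# Pilot points (hypothetical locations/values) are appended to the conditioning data
# (Equation (3)); only their values are changed by the optimizer during inversion.
pp_xy = np.array([[600.0, 600.0], [150.0, 700.0]])
pp_logT = np.array([-2.6, -3.0])

aug_xy = np.vstack([data_xy, pp_xy])
aug_logT = np.concatenate([log_T, pp_logT])

def kriged_field(grid_xy, xy, z):
    """Krige log transmissivity onto every grid-block centroid from the augmented data."""
    field = np.empty(len(grid_xy))
    for m, x0 in enumerate(grid_xy):
        w, _ = ok_weights(xy, x0)
        field[m] = w @ z
    return field

grid_xy = np.array([[x, y] for x in range(0, 1000, 200) for y in range(0, 1000, 200)],
                   dtype=float)
warped = kriged_field(grid_xy, aug_xy, aug_logT)   # changes whenever pp_logT changes
```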
The sensitivity derivatives required to assign the optimal value of pilot-point transmissivities are computed with the adjoint technique [Chavent 1974] and the kriging equations (Equation (2)). The overall equation solved by the PPM may be presented as follows.
Let Zm represent the log transmissivity value assigned to grid block m. Using the chain rule, the sensitivity of the weighted least-squares objective function, 𝜙, to the log transmissivity value assigned to each pilot point may be expressed by:
$$\frac{\mathrm{d}\phi}{\mathrm{d}Z_p} = \sum_{m=1}^{M} \frac{\partial \phi}{\partial Z_m} \, \frac{\partial Z_m}{\partial Z_p} \qquad (5)$$

where $M$ is the number of grid blocks and $Z_p$ is the log transmissivity assigned to pilot point $p$. The first factor, $\partial \phi / \partial Z_m$, is supplied by the adjoint solution of the flow equation, while the second factor follows directly from the kriging system (2),

$$\frac{\partial Z_m}{\partial Z_p} = \lambda_p(x_m) \qquad (6)$$

that is, the kriging weight assigned to pilot point $p$ in the estimate at grid block $m$, so that

$$\frac{\mathrm{d}\phi}{\mathrm{d}Z_p} = \sum_{m=1}^{M} \frac{\partial \phi}{\partial Z_m} \, \lambda_p(x_m) \qquad (7)$$
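The arithmetic of Equations (5)–(7) amounts to a single matrix-vector product, as the brief sketch below illustrates; the adjoint-derived sensitivities and the kriging-weight matrix are filled with placeholder random numbers rather than values from an actual flow model.

```python
import numpy as np

# Placeholder arrays standing in for adjoint and kriging quantities:
#   dphi_dZm[m] : adjoint-derived sensitivity of phi to the log T of grid block m
#   W[m, p]     : kriging weight of pilot point p in the estimate at grid block m
rng = np.random.default_rng(0)
n_blocks, n_pp = 500, 10
dphi_dZm = rng.normal(size=n_blocks)
W = rng.uniform(0.0, 0.2, size=(n_blocks, n_pp))

# Equations (5)-(7): the chain rule collapses to a matrix-vector product
dphi_dZp = W.T @ dphi_dZm

# dphi_dZp would then be handed to the optimizer (e.g. conjugate gradients)
# to update the pilot-point log transmissivities
```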
As mentioned earlier, de Marsily’s PPM was the first to solve many of the problems inherent in the earlier inverse techniques. By incorporating geostatistics and logarithms of transmissivity, he ensured a smooth but spatially varying transmissivity field even in the presence of significant errors in the head field. In addition, he stabilized the inverse solution by:
- adding prior information, e.g. regularization, in the form of initial kriged estimates at the pilot point locations,
- properly posing the inverse problem by reducing the number of unknowns to a small number of pilot points,
- implementing the transmissivity changes in the neighborhood of a pilot point in accord with the variogram, and
- utilizing efficient adjoint sensitivity techniques to obtain the necessary derivatives for the optimization routine.
3. Early PPM advances
The first mention of the PPM was in de Marsily’s PhD dissertation [De Marsily 1978], which describes using “pilot points” at discrete locations. One of the first applications of the PPM published in English was presented in De Marsily et al. [1984]. Here, the PPM was used to obtain a permeability field that reproduced interference tests in a well field.
In the late 1980s and early 1990s, research on geostatistical methods by Ahmed and De Marsily [1987] and on the inverse problem by Certes and De Marsily [1991] resulted in improvements to the PPM. Lavenue and Pickens [1992] coupled ordinary kriging and the PPM to history match a regional groundwater model at the Waste Isolation Pilot Plant (WIPP) site in Carlsbad, New Mexico. They also adopted an adjoint sensitivity method to optimally locate pilot points, which were sequentially added to the model during an iterative inversion process. One of the significant features added to the model transmissivities during inversion was a high-transmissivity fracture zone, which provided a fast offsite pathway to the WIPP-site boundary, potentially challenging the WIPP safety assessment.
A technical review panel comprising the leading groundwater inversion researchers of the 1990s was convened by Sandia National Laboratories to review the WIPP model construction, inversion, and travel time calculations. This panel, called the Geostatistics Expert Group (GXG), was headed by Ghislain de Marsily. The GXG recommended that the PPM approach be expanded to calibrate an ensemble of conditionally simulated transmissivity fields in order to investigate the uncertainty in groundwater travel time to the WIPP-site boundary. de Marsily, Lavenue, and RamaRao did so by linking a conditional simulation front-end routine (using turning bands), a flow model (SWIFT II), an adjoint sensitivity routine to optimally locate pilot points, and a conjugate gradient optimization routine to assign the pilot-point transmissivity values. In 1995, the resulting new PPM code, referred to as GRASP-INV, was applied to the WIPP site to produce an ensemble of 100 conditionally simulated transmissivity fields [RamaRao et al. 1995; Lavenue et al. 1995].
This extension of the PPM to produce an ensemble of solutions, all equally calibrated or “history matched” to the observed data, was a significant step forward in the quantification of posterior uncertainty, which ultimately allowed for a more complete risk assessment to support decision making. It also laid a foundation for the use of inversion and sensitivity techniques to assess data worth, which guided future data acquisition campaigns at the WIPP site to further reduce uncertainties in our understanding of the hydrogeologic system.
Shortly afterward, Gomez-Hernandez and his fellow researchers at the University of Valencia, Spain, developed a non-linear pilot-point technique (or a “pilot-point-like” method, as one reviewer pointed out) that differs from the original approach of De Marsily [1978] and Lavenue et al. [1995]. These works include Sahuquillo et al. [1992], Gómez-Hernández et al. [1997], and Capilla et al. [1997]. The solution of the inverse problem using Gomez-Hernandez’s approach is significantly faster than that of Lavenue and Pickens [1992] or Certes and De Marsily [1991] for several possible reasons. First, Gomez-Hernandez did not update the sensitivity derivatives as frequently as the other pilot-point approaches. Second, he assumed that the conductance between two finite-difference grid blocks is expressed by the geometric mean (as opposed to the harmonic mean) of the associated grid-block transmissivities, which significantly reduced the time required to compute sensitivities. The combined effect of these differences significantly sped up the inversion process using Gomez-Hernandez’s technique, which he called the sequential self-calibration (SSC) pilot point method.
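To illustrate the practical difference between the two interblock-conductance conventions mentioned above, the brief sketch below compares the harmonic and geometric means for a pair of hypothetical grid-block transmissivities; the values are illustrative only.

```python
import numpy as np

def harmonic_mean(t1, t2):
    """Interblock transmissivity as the harmonic mean (the conventional
    finite-difference choice, dominated by the lower value)."""
    return 2.0 * t1 * t2 / (t1 + t2)

def geometric_mean(t1, t2):
    """Interblock transmissivity as the geometric mean (the convention adopted
    in the SSC approach described above)."""
    return np.sqrt(t1 * t2)

t1, t2 = 1.0e-3, 1.0e-5        # hypothetical grid-block transmissivities (m2/s)
print(harmonic_mean(t1, t2))   # ~2.0e-05
print(geometric_mean(t1, t2))  # 1.0e-04
```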
Sandia National Laboratories decided to benchmark GRASP-INV against the other inversion techniques represented by the GXG members and invited Gomez-Hernandez to participate. This unique benchmarking exercise, described in Zimmerman et al. [1998], evaluated the performance of virtually every inversion technique used by industry and academics at that time. The ultimate objective of the study was to determine which of several geostatistical inverse techniques was best suited for making probabilistic forecasts of the potential transport of solutes in an aquifer where spatial variability and uncertainty in hydrogeologic properties are significant. Seven geostatistical methods were compared on four synthetic data sets. The inverse methods tested were categorized as being either linearized or nonlinear [Carrera and Glorioso 1991]. The linearized approaches are generally based upon simplifying assumptions about the flow field (e.g., a uniform hydraulic head gradient, a small log(T) variance, etc.) that lead to a linearized relation between T and head via a perturbation expansion of the head and transmissivity fields. The resulting equation can then be solved analytically or numerically. The nonlinear approaches have no such restrictions placed on them and can, in principle, handle more complex flow fields or larger log(T) variances. The linearized methods included the Fast Fourier Transform inverse method by Gutjahr et al. [1994], the Linearized Cokriging Method by Kitanidis and Lane [1985], and the Linearized Semianalytical method by Rubin et al. [1992], while the nonlinear methods included the Fractal Simulation inverse method by Grindrod and Impey [1991], the Maximum Likelihood inverse method by Carrera [1994], the Pilot Point Method [RamaRao et al. 1995], and the SSC pilot point method by Gomez-Hernandez [Gómez-Hernández et al. 1997]. While the nonlinear inverse methods were superior to the linearized inverse methods, the most robust results came from the inverse methods that were able to incorporate rigorous geostatistics into the inversion process; that is, inversion suffered when a technique could not adequately represent or modify the expected spatial (i.e., prior) patterns of variability of the aquifer properties.
Lessons learned in Zimmerman et al. [1998] regarding the importance of robust geostatistics led de Marsily, Lavenue, and RamaRao to enhance their PPM by adopting the GSLIB [Deutsch and Journel 1992] routines for conditional simulation. They collaborated with Gomez-Hernandez in this effort, which resulted in a new conditional simulator for the PPM that could produce an ensemble of multi-facies geostatistical fields with unique covariance structures, using categorical indicator simulation followed by sequential Gaussian simulation to overlay spatial variability within each facies. de Marsily, Lavenue, and RamaRao linked this new capability to their PPM and later applied it to the WIPP-site regional model [Lavenue 1998]. Unique to this enhancement was the ability of the PPM to separately optimize the spatial variability within each facies (fractured and unfractured zones) to match transient water levels. It also enabled the delineation of a sharp fractured/unfractured zone interface within the regional aquifer, which reduced travel time uncertainty. Lavenue and de Marsily then extended this version of the PPM to three dimensions and demonstrated its utility by matching a series of three-dimensional interference tests [Lavenue and De Marsily 2001].
Other research motivated by Zimmerman et al. [1998] included methods to enforce prior plausibility in a Bayesian sense: the work of Alcolea et al. [2006], Riva et al. [2010], and later Jiménez et al. [2016] sought to include expert knowledge in the PPM solution process so that the resulting property fields were adjusted in harmony with the prior estimates.
4. The tipping point
Notwithstanding the developments and advancements of the PPM cited above, its use was not widespread because the PPM was largely coupled to groundwater modeling software not widely used by industry. This changed in the late 1990s, when John Doherty adopted the PPM for his PEST software suite and produced a full suite of advanced software tools that allowed the PPM to be applied at scale in an unlimited number of settings. Unlike the existing adjoint-based PPM approaches, the PEST software suite relies on non-intrusive, model-independent approaches, such as finite-difference derivatives, to estimate the first-order relation between parameters and simulation results, so that it can be applied to practically any forward numerical groundwater model. Doherty realized that, like PEST itself, the PPM could be implemented in this same non-intrusive, model-independent fashion. This shifted the use of the PPM from the bespoke research and development projects cited above to the broader groundwater modeling community, a tipping point in scaling the PPM to industry at large.
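The essence of the non-intrusive, model-independent approach is that first-order sensitivities are obtained by repeatedly running the forward model with perturbed parameter values rather than by modifying the model’s source code. The sketch below shows this idea in schematic form; the toy forward model and step-size choices are assumptions for illustration, not the PEST implementation.

```python
import numpy as np

def run_model(pars):
    """Stand-in for a complete forward run: in practice this writes model input
    files, executes the groundwater model, and reads the simulated observations."""
    return np.array([pars[0] ** 2 + pars[1],
                     pars[0] * np.exp(-pars[1])])

def forward_difference_jacobian(pars, rel_step=0.01):
    """Model-independent first-order sensitivities: one extra forward run per parameter."""
    sim0 = run_model(pars)
    jac = np.empty((sim0.size, pars.size))
    for j in range(pars.size):
        perturbed = pars.copy()
        step = rel_step * max(abs(pars[j]), 1.0e-10)
        perturbed[j] += step
        jac[:, j] = (run_model(perturbed) - sim0) / step
    return jac

J = forward_difference_jacobian(np.array([2.0, 0.5]))
```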
Within the PEST framework, the PPM serves as an indispensable parameterization scheme, one that balances the need to express uncertain spatial heterogeneity in model inputs against the requirement to maintain a computationally tractable inverse problem that still represents broad-scale heterogeneity. We generalize from transmissivity to “model inputs” here because, with the non-intrusive PEST framework, pilot points are commonly used to represent realistic spatial heterogeneity in a wide range of uncertain properties and boundary condition elements that numerical models require. For example, Knowling and Werner [2016] and later Knowling and Werner [2017] used pilot points to represent uncertain spatial patterns of recharge, while McKenna et al. [2020] and later White et al. [2020a] used pilot points in a multi-scale parameterization scheme to explicitly represent different scales of heterogeneity and associated uncertainty.
The issues of inverse-problem stability and well-posedness in the PPM were both overcome through the adoption of formal Tikhonov regularization [Tikhonov 1963; Tikhonov and Arsenin 1977] [see Tonkin and Doherty 2005; Fienen et al. 2009; Doherty et al. 2010; Doherty 2015], in combination with truncated subspace solution techniques [see Oliver et al. 2008; Aster et al. 2018; Doherty 2015]. These two advancements, implemented at scale and functioning in concert with each other for the first time in the PEST software suite, allowed practitioners to forgo the need to reduce the number of pilot points a priori to stabilize the inverse problem: Tikhonov regularization and truncated subspace techniques optimally and unconditionally stabilize it. Practitioners were now free to express expected property heterogeneity (and associated uncertainty) at scales that are relevant to the predictive outcomes of the model.
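As a schematic illustration (not the PEST implementation) of how the two devices work together, the sketch below forms one Gauss-Newton-style parameter upgrade in which a Tikhonov penalty pulls parameters toward their prior (e.g. kriged) values and a truncated singular value decomposition discards poorly constrained parameter-space directions; the weighting and truncation settings are illustrative assumptions.

```python
import numpy as np

def regularized_upgrade(jac, residuals, par, par_prior, beta=1.0, sv_cutoff=1.0e-6):
    """One Gauss-Newton-style upgrade combining a Tikhonov penalty toward the prior
    parameter values with a truncated-SVD solve of the augmented system."""
    npar = par.size
    # augmented least-squares system: data misfit plus weighted departure from the prior
    A = np.vstack([jac, np.sqrt(beta) * np.eye(npar)])
    b = np.concatenate([residuals, np.sqrt(beta) * (par_prior - par)])
    # truncated SVD: discard poorly constrained directions in parameter space
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    keep = s > sv_cutoff * s[0]
    delta = Vt[keep].T @ ((U[:, keep].T @ b) / s[keep])
    return par + delta

# a toy call with random arrays, purely to show the expected shapes
rng = np.random.default_rng(1)
new_par = regularized_upgrade(rng.normal(size=(20, 8)), rng.normal(size=20),
                              np.zeros(8), np.zeros(8))
```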
5. Où en sommes-nous maintenant?
With the recent advancements in the PPM mentioned above, the combined use of the PPM and PEST has led to widespread investigation of predictive uncertainties in applied groundwater modeling, facilitated by the PPM’s allowance for a more complete and robust expression of expected patterns of spatial heterogeneity in subsurface properties [see for example Moore and Doherty 2005; James et al. 2009; Dausman et al. 2010; Herckenrath et al. 2011; Tonkin and Doherty 2009; Christensen and Doherty 2008; Keller et al. 2021].
The PPM, as embodied within Doherty’s PEST software suite, continues to serve an important role in decision-support environmental simulation around the world, and new developments continue. Several recent advancements have focused on expanding the use of pilot points from two-point to multi-point geostatistics [see Li et al. 2013; Mariethoz and Caers 2014; Ma and Jafarpour 2018; Khambhammettu et al. 2020; Liu et al. 2021]. The capability of these pilot point methods to facilitate the use of multi-point geostatistical property representations within formal data assimilation is a major advancement in settings where such high-order representations matter, for example where connected permeability features control the fate and transport of dissolved-phase constituents.
So “Where are we now?” Groundwater modelers have more computational power than ever before to build highly complex models and invert them with PEST and, more recently, the PEST++ software suite [White et al. 2020b]. With the related advancements in history matching using iterative ensemble smoother approaches [e.g. Chen and Oliver 2013; White 2018], practitioners are able to employ high-dimensional inversion and uncertainty quantification more efficiently, in non-intrusive ways, and at scales previously accessible only to those with high-performance computing. This “gilded age” of inversion, however, comes with the responsibility of parsimony. It cannot be overstated that sound groundwater model analyses are built on the foundation of a solid hydrogeologic conceptual model that is focused on the predictive purpose of the modeling, and on a firm grasp of the sources and magnitudes of model input (i.e. parameter) and conceptual model uncertainty. Only then can decisions based on the associated numerical model take full advantage of the decision-support simulation tools discussed herein.
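For readers unfamiliar with the ensemble approach, the sketch below shows a single, heavily simplified ensemble-smoother analysis step of the kind that the iterative schemes cited above repeat with damping and localization; parameter and simulated-output ensembles are stored column-wise by realization, and the array shapes and noise model are illustrative assumptions rather than the PEST++ implementation.

```python
import numpy as np

def es_update(par_ens, sim_ens, obs, obs_std, seed=0):
    """One simplified (non-iterative) ensemble-smoother analysis step; iterative
    schemes repeat a damped version of this update with successive forward runs."""
    ne = par_ens.shape[1]
    dP = par_ens - par_ens.mean(axis=1, keepdims=True)      # parameter anomalies
    dS = sim_ens - sim_ens.mean(axis=1, keepdims=True)      # simulated-output anomalies
    C_ss = dS @ dS.T / (ne - 1) + np.diag(obs_std ** 2)     # output-space covariance
    C_ps = dP @ dS.T / (ne - 1)                             # cross covariance
    K = C_ps @ np.linalg.inv(C_ss)                          # Kalman-style gain
    noise = np.random.default_rng(seed).normal(scale=obs_std[:, None], size=sim_ens.shape)
    return par_ens + K @ (obs[:, None] + noise - sim_ens)
```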
A final comment: When asked about his upcoming conference presentation in the early 1980s, de Marsily mentioned that his talk was on applying geostatistics in hydrogeology, which at the time was virtually unknown in the US. He went on to say that he felt it was important to explain to the audience how powerful geostatistical tools are to hydrogeologists and that it “was never too late to learn new things”. He was right on both points. Countless hydrogeologists have been positively impacted by the “new things” Ghislain de Marsily taught us and by his example of leading with humility, style and grace. Thank you, Ghislain.
Conflicts of interest
The authors have no conflicts of interest to declare.