A critical component of the State Water Resources Control Board's Groundwater Ambient Monitoring and Assessment (GAMA) Program is to assess the major threats to groundwater resources that supply drinking water to Californians (Belitz et al., 2004). Nitrate concentrations approaching and exceeding the maximum contaminant level (MCL) are impairing the viability of many groundwater basins as drinking water sources. Source attribution and nitrate fate and transport are therefore the focus of special studies under the GAMA program. This report presents results of a study of nitrate contamination in the aquifer beneath the City of Livermore, where high nitrate levels affect both public supply and private domestic wells. Nitrate isotope data are effective in determining contaminant sources, especially when combined with other isotopic tracers such as stable isotopes of water and tritium-helium ages, which give insight into the routes and timing of nitrate inputs to the flow system. This combination of techniques is demonstrated in Livermore, where low-nitrate reclaimed wastewater predominates in the northwest, while two flowpaths with distinct nitrate sources originate in the southeast. Along the eastern flowpath, δ¹⁵N values greater than 10‰ indicate that animal waste is the primary source, and diminishing concentrations over time suggest that the contamination results from historical land use practices. The other flowpath begins in an area where rapid recharge, primarily of low-nitrate imported water (identified by stable isotopes of water and a tritium-helium residence time of less than 1 year), mobilizes a significant local nitrate source, bringing groundwater concentrations above the MCL of 45 mg NO₃ L⁻¹. In this area, artificial recharge of imported water via local arroyos induces flux of the contaminant to the regional aquifer. The low δ¹⁵N value (3.1‰) at this location implicates synthetic fertilizer. Geochemical modeling supports the hypothesis of separate sources, one including organic carbon, as from animal waste, and one not. In addition to these anthropogenic sources, natural background nitrate levels between 15 and 20 mg NO₃ L⁻¹ are found in deep wells with residence times greater than 50 years.
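
As a rough illustration of the source-attribution logic described in this abstract, the sketch below classifies a single groundwater sample using the δ¹⁵N ranges and the 45 mg NO₃ L⁻¹ MCL quoted above. It is a minimal sketch only: the function name, the 5‰ lower cutoff used to flag fertilizer-like values, and the example inputs are illustrative assumptions, not methods or data taken from the report.

```python
# Illustrative sketch (not from the report): classify a nitrate sample using the
# delta-15N ranges and the 45 mg/L MCL cited in the abstract. The 5 per mil cutoff
# and the example inputs are assumptions for demonstration only.

MCL_NITRATE_MG_L = 45.0  # California MCL for nitrate (as NO3), mg/L

def classify_nitrate_source(delta15n_permil, nitrate_mg_l):
    """Return a rough source label and MCL status for one groundwater sample."""
    if delta15n_permil > 10.0:
        source = "animal waste (delta-15N > 10 per mil)"
    elif delta15n_permil < 5.0:
        source = "synthetic fertilizer (low delta-15N, e.g. ~3 per mil)"
    else:
        source = "mixed or indeterminate source"
    status = "exceeds MCL" if nitrate_mg_l > MCL_NITRATE_MG_L else "below MCL"
    return source, status

# Hypothetical samples loosely consistent with the two flowpaths described above
print(classify_nitrate_source(12.4, 38.0))  # high delta-15N: animal waste, below MCL
print(classify_nitrate_source(3.1, 52.0))   # low delta-15N: fertilizer, exceeds MCL
```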

Nitrate is the number one drinking water contaminant in the United States. It is pervasive in surface and groundwater systems, and its principal anthropogenic sources have increased dramatically in the last 50 years. In California alone, one third of public drinking-water wells have been lost since 1988, and nitrate contamination is the most common reason for abandonment. Effective nitrate management in groundwater is complicated by uncertainties related to multiple point and non-point sources, hydrogeologic complexity, geochemical reactivity, and quantification of denitrification processes. In this paper, we review an integrated experimental and simulation-based framework being developed to study the fate of nitrate in a 25-km-long groundwater subbasin south of San Jose, California, a historically agricultural area now undergoing rapid urbanization with increasing demands for groundwater. The modeling approach is driven by the need to integrate new and archival data supporting the hypothesis that nitrate fate and transport at the basin scale are intricately related to hydrostratigraphic complexity, variability of flow paths and groundwater residence times, microbial activity, and multiple geochemical reaction mechanisms. This study synthesizes these disparate, multi-scale data into a three-dimensional, highly resolved reactive transport modeling framework.
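
For readers unfamiliar with reactive transport models, the following minimal sketch shows the kind of one-dimensional advection-dispersion calculation with first-order denitrification that such a framework generalizes to three dimensions and many coupled reactions. All grid, velocity, dispersion, and rate parameters are placeholders chosen for illustration, not values from the study described above.

```python
# Minimal 1-D sketch of advection-dispersion with first-order denitrification,
# the kind of building block a basin-scale reactive transport model generalizes
# to three dimensions. All parameter values are illustrative placeholders.
import numpy as np

nx, dx, dt, nsteps = 200, 10.0, 1.0, 500   # cells, cell size (m), time step (d), steps
v, D, k = 0.5, 1.0, 1e-3                   # velocity (m/d), dispersion (m2/d), decay (1/d)

c = np.zeros(nx)        # nitrate concentration along the flowpath, mg/L
c_inflow = 60.0         # boundary concentration at the recharge end, mg/L

for _ in range(nsteps):
    c[0] = c_inflow
    adv = -v * (c[1:-1] - c[:-2]) / dx                  # upwind advection
    disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2   # hydrodynamic dispersion
    rxn = -k * c[1:-1]                                  # first-order denitrification
    c[1:-1] += dt * (adv + disp + rxn)

print(f"concentration 250 m downgradient after {nsteps} days: {c[25]:.1f} mg/L")
```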

The California Water Resources Control Board, in collaboration with the US Geological Survey and Lawrence Livermore National Laboratory, has implemented a program to assess the susceptibility of groundwater resources. Advanced techniques such as groundwater age dating using the tritium-helium method, extensive use of oxygen isotopes of the water molecule (δ¹⁸O) to establish the provenance of recharge water, and analysis of common volatile organic compounds (VOCs) at ultra-low levels are applied with the goal of assessing the contamination vulnerability of deep aquifers, which are frequently used for public drinking water supply. Over 1200 public drinking water wells have been tested to date, resulting in a very large, tightly spaced collection of groundwater ages in some of the heavily exploited groundwater basins of California. Smaller-scale field studies that include shallow monitoring wells are aimed at assessing the probability that nitrate will be transported to deep drinking water aquifers. When employed on a basin scale, groundwater ages are an effective tool for identifying recharge areas, defining flowpaths, and determining the rate of transport of water and entrained contaminants. Deconvolution of mixed ages, using ancillary dissolved noble gas data, gives insight into the distribution of water ages drawn at a well and into the effective dilution of contaminants such as nitrate at long-screened production wells. In combination with groundwater ages, low-level VOCs are used to assess the impact of vertical transport. Special studies are focused on the fate and transport of nitrate with respect to the vulnerability of aquifers in agricultural and formerly agricultural areas.
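
The apparent age from the tritium-helium method mentioned above follows from the radioactive decay of tritium and the accumulation of its tritiogenic ³He daughter. The sketch below shows the standard apparent-age relation; the sample values are hypothetical, and in practice the dissolved noble gas data mentioned in the abstract are needed to isolate the tritiogenic ³He component before applying it.

```python
# Sketch of the standard tritium/helium-3 apparent-age calculation that underlies
# the age dating described above. Input values are hypothetical; real samples
# require noble-gas corrections to isolate the tritiogenic helium-3 component.
import math

TRITIUM_HALF_LIFE_YR = 12.32

def tritium_helium_age(tritium_tu, tritiogenic_he3_tu):
    """Apparent age (years) from tritium and tritiogenic 3He, both in tritium units."""
    decay_const = math.log(2) / TRITIUM_HALF_LIFE_YR
    return (1.0 / decay_const) * math.log(1.0 + tritiogenic_he3_tu / tritium_tu)

# Hypothetical sample: 5 TU tritium, 15 TU tritiogenic helium-3
print(f"apparent age: {tritium_helium_age(5.0, 15.0):.1f} years")  # about 25 years
```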

For the last several years, the Underground Test Area (UGTA) program has funded a series of studies carried out by scientists to investigate the role of colloids in facilitating the transport of low-solubility radionuclides in groundwater, specifically plutonium (Pu). Although the studies were carried out independently, their overarching goals have been to determine whether colloids in groundwater at the NTS can and will transport low-solubility radionuclides such as Pu, to define the geochemical mechanisms under which this may or may not occur, to determine the hydrologic parameters that may or may not enhance transport through fractures, and to provide recommendations for incorporating this information into future modeling efforts. The initial motivation for this work came from the observation in 1997 and 1998 by scientists from Lawrence Livermore National Laboratory (LLNL) and Los Alamos National Laboratory (LANL) that low levels of Pu originally from the Benham underground nuclear test were detected in groundwater from two different aquifers collected from wells 1.3 km downgradient (Kersting et al., 1999). Greater than 90% of the Pu and other radionuclides were associated with the naturally occurring colloidal fraction (

The Department of Toxic Substances Control (DTSC) requested that Lawrence Livermore National Laboratory (LLNL) evaluate the treatment process currently employed at the Department's Stringfellow Superfund Site Pretreatment Plant (PTP) to determine whether wastes originating from the site were properly managed with regard to their radioactivity. In order to evaluate the current management strategy, LLNL suggested that DTSC characterize the effluents from the waste treatment system for radionuclide content. A sampling plan was developed, and samples were collected and analyzed for radioactive constituents. The following is a brief summary of those results and the implications that may be drawn for waste characterization. (1) The sampling and analysis provide strong evidence that the radionuclides present are Naturally Occurring Radioactive Material (NORM). (2) The greatest source of radioactivity in the samples was naturally occurring uranium. The sample results indicate that the uranium concentration in the filter cake (11-14 ppm) is higher than in the Granular Activated Carbon (GAC) samples (2-6 ppm). (3) No radiologic background for geologic materials has been established for the Stringfellow site, and comprehensive testing of the process stream has not been conducted. Without site-specific testing of geologic materials and waste process streams, it is not possible to conclude whether the filter cake and spent GAC samples contain radioactivity concentrated above natural background levels, or whether radionuclides are being concentrated by the waste treatment process. Recommendation: The regulation of Technologically Enhanced, Naturally Occurring Radioactive Materials (T-NORM) is complex. Since the results of this study do not conclusively demonstrate that natural radioactive materials have not been concentrated by the treatment process, it is recommended that DTSC consult with the Department of Health Services (DHS) Radiological Health Branch to determine whether any further action is warranted. If it were deemed desirable to establish a background for the Stringfellow setting, LLNL would recommend that additional samples be taken and analyzed by LLNL using the same methods presented in this report.
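
To put the reported ppm-level uranium concentrations in activity terms, the following back-of-envelope sketch converts a uranium mass fraction to a U-238 activity concentration using the U-238 half-life. This calculation is not part of the report; it counts only U-238, whereas natural uranium in secular equilibrium with U-234 carries roughly twice this activity, and it says nothing about whether the material exceeds local background.

```python
# Back-of-envelope sketch (not from the report): convert a uranium mass fraction
# in ppm to a U-238 activity concentration. Only U-238 is counted here; natural
# uranium in secular equilibrium with U-234 carries roughly twice this activity.
import math

AVOGADRO = 6.022e23
U238_HALF_LIFE_S = 4.468e9 * 3.156e7    # 4.468 billion years, in seconds
U238_MOLAR_MASS = 238.0                 # g/mol

def u238_activity_bq_per_g(ppm_uranium):
    """U-238 activity (Bq per gram of sample) for a given uranium mass fraction in ppm."""
    decay_const = math.log(2) / U238_HALF_LIFE_S               # 1/s
    atoms_per_g_sample = (ppm_uranium * 1e-6 / U238_MOLAR_MASS) * AVOGADRO
    return decay_const * atoms_per_g_sample

for ppm in (2, 6, 11, 14):   # GAC and filter-cake concentrations cited above
    print(f"{ppm:>2} ppm U  ->  {u238_activity_bq_per_g(ppm):.3f} Bq/g (U-238 only)")
```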

The purpose of this report is to assess the decay and in-growth of radionuclides from the radionuclide source term (RST) deposited by underground nuclear weapons tests conducted at the NTS from 1951 through 1992. A priority of the Underground Test Area (UGTA) project, administered by the Environmental Restoration Division of NNSA/NV, was to determine as accurately as possible the total radionuclide inventory for calculation of the RST deposited in the subsurface at the Nevada Test Site (NTS). The motivation for developing a total radionuclide inventory is the need to calculate the amount of radioactivity that will move away from the nuclear test cavities over time, referred to as the hydrologic source term (HST). The HST is a subset of the RST and must be calculated using knowledge of the geochemistry and hydrology of the subsurface environment. This will serve the regulatory process designed to protect human health from exposure to contaminated groundwater. Following the detonation of an underground nuclear test, and depending on the presence of water at the location of the detonation, the residual radionuclides may be found in aqueous or gaseous states, precipitated or chemically sorbed states, or incorporated in melt glass produced by the nuclear test. The decay and in-growth of radionuclides may have geochemical implications for the migration of radionuclides away from underground nuclear test cavities. For example, in the case of a long-lived mobile parent decaying to a shorter-lived and less mobile daughter, the geochemical properties of the parent element may control the migration potential of the daughter nuclide. It is therefore important to understand the evolution of the RST in terms of the effects that decay and in-growth processes have on the mobility, solubility, and abundance of radionuclides in the HST. The total radionuclide inventory, and thus the RST, changes with time due to radioactive decay. The abundance of a specific radionuclide at any given time is a function of the initial amount of radioactivity, the decay rate, and in-growth from parent radionuclides; in-growth is the additional amount of a given radionuclide that comes from the decay of its parent isotopes. In this report, decay and in-growth of radionuclides from the RST are evaluated over a 1000-year time frame to determine whether coupled in-growth and decay affect the relative abundance of any RST radionuclide. It is also necessary to identify whether any new derivative radionuclides, not initially produced by the nuclear tests, now exist as a result of in-growth from a parent radionuclide. One of the major goals of this report is to simplify the transport modeler's task by pointing out where in-growth is unimportant and where it must be considered. The specific goals of this document are to evaluate radionuclide decay chains and to provide specific recommendations for incorporating radionuclide daughters of concern in the calculation of the radionuclide inventory.
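
The parent-daughter in-growth described in this abstract is governed by the Bateman equations; the sketch below evaluates the simple two-member case over the 1000-year assessment window. The half-lives and initial inventory are illustrative placeholders, not values from the NTS radionuclide inventory.

```python
# Sketch of the two-member Bateman solution governing the decay and in-growth
# described above. Half-lives and the initial inventory are illustrative
# placeholders, not values from the NTS radionuclide inventory.
import math

def bateman_pair(n1_0, n2_0, half_life1_yr, half_life2_yr, t_yr):
    """Return (parent, daughter) atom counts at time t for a parent -> daughter chain."""
    l1 = math.log(2) / half_life1_yr
    l2 = math.log(2) / half_life2_yr
    n1 = n1_0 * math.exp(-l1 * t_yr)
    n2 = (n1_0 * l1 / (l2 - l1)) * (math.exp(-l1 * t_yr) - math.exp(-l2 * t_yr)) \
         + n2_0 * math.exp(-l2 * t_yr)
    return n1, n2

# Illustrative long-lived parent (1e6 yr half-life) feeding a shorter-lived
# daughter (30 yr half-life), evaluated over the 1000-year time frame
for t in (0, 100, 1000):
    parent, daughter = bateman_pair(1e24, 0.0, 1.0e6, 30.0, t)
    print(f"t = {t:>4} yr: parent = {parent:.3e} atoms, daughter = {daughter:.3e} atoms")
```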