Summary
The combination of advances in hydraulic-fracturing and horizontal-drilling technologies has led to a resurgence in oil and gas activity in multiple regions across the United States. To minimize the impact of the increased water use, many oil and gas companies are pursuing recycling of fracture-flowback water and produced water for subsequent drilling and fracturing operations. Common processes in a recycling strategy include metal precipitation to minimize scaling potential and electrocoagulation to remove solids that could foul a well when the water is reused.
In this study, the precipitation of calcium, magnesium, barium, and strontium was examined experimentally by adding target ligands, followed by solids separation with electrocoagulation. In addition, removal efficacy was modeled with commercially available chemical-equilibrium software (OLI Systems). The experimental results were compared with the predicted data at pH values of 9.5 and 10.2. The differences between the modeled and experimental data indicated a deficiency in the solid/liquid-separation process in the laboratory. Results also showed that pH did not affect treatment efficiency, except in the case of magnesium; however, the sequencing of softening relative to coagulation was important. An additional finding was that, depending on the target metal, either sulfate or carbonate must exceed a threshold concentration to achieve the precipitation goals. Chemical consumption differed significantly between pH values of 9.5 and 10.2. Chemical-equilibrium modeling predicted that operating at a pH of 9.5 rather than 10.2 reduces average base usage by 30% and acid usage by 34%; the reductions measured experimentally were 27% for base and 43% for acid.
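As a point of reference, the threshold-concentration finding follows from solubility-product reasoning. For a generic divalent metal \(\mathrm{M^{2+}}\) and precipitating ligand \(\mathrm{X^{2-}}\) (notation introduced here only for illustration), a solid phase forms only when the ion activity product exceeds the solubility product,

\[ \{\mathrm{M^{2+}}\}\,\{\mathrm{X^{2-}}\} > K_{\mathrm{sp}}, \]

so the added carbonate or sulfate must reach an activity of approximately \(K_{\mathrm{sp}}/\{\mathrm{M^{2+}}\}\) or greater before precipitation of the target metal proceeds.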
Introduction
Unconventional oil and gas development has been increasing rapidly throughout the United States, largely because of advances in directional-drilling and hydraulic-fracturing techniques. Shale-oil and -gas production (one form of unconventional development) requires large volumes of water for hydraulic fracturing, and much of this activity occurs in areas of the country that are prone to drought and water shortages (Gregory et al. 2011).
The concurrence of large water requirements and water-stressed regions has led to significant interest in reuse of the water that is returned during oil and gas production, commonly referred to as fracture-flowback water and produced water (Fakhru’l-Razi et al. 2009). Historically, water coproduced with oil and gas (produced water) has been disposed of through evaporation or deep-well injection, approaches that do not conserve the resource for beneficial reuse. Reuse of flowback and produced water currently varies significantly from region to region and even within the same oil and gas basin. For example, recycled produced water accounts for less than 10% of the total water used to drill and fracture in the Barnett, Fayetteville, and Haynesville shale plays, whereas the recycled fraction in the Marcellus play is significantly higher, exceeding 90% of the total water used (Mantell 2011).
Although treatment methods have been developed to recycle produced water for subsequent fracturing operations, widespread adoption of these methods is often limited by cost. Important treatment objectives for fracturing-water reuse include particle removal, reduction of scale-forming metals, and disinfection. Removal of total dissolved solids (TDS) is expensive and is therefore avoided when possible. Fracturing fluids have been developed that are compatible with high TDS concentrations, but the other objectives (solids reduction, scale control, and bacterial control) almost always need to be satisfied. The focus of this study was to examine the metal-removal processes associated with reducing scaling potential by use of laboratory-scale data and chemical-equilibrium modeling, with the goal of optimizing chemical use and minimizing cost.
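Scaling potential is commonly quantified with a saturation index of the form

\[ \mathrm{SI} = \log_{10}\!\left(\frac{\mathrm{IAP}}{K_{\mathrm{sp}}}\right), \]

where IAP is the ion activity product of a candidate mineral and \(K_{\mathrm{sp}}\) its solubility product; SI > 0 indicates supersaturation and a tendency to form scale. For illustration, with carbonate, hydroxide, and sulfate taken as the precipitating ligands (a representative assumption rather than a statement of the dosing used in this work), the relevant softening reactions are

\[ \mathrm{Ca^{2+} + CO_3^{2-} \rightarrow CaCO_3(s)}, \qquad \mathrm{Mg^{2+} + 2\,OH^- \rightarrow Mg(OH)_2(s)}, \qquad \mathrm{Ba^{2+} + SO_4^{2-} \rightarrow BaSO_4(s)}, \]

with strontium behaving analogously to barium. The strong pH dependence of Mg(OH)₂ solubility is consistent with the observation in the Summary that magnesium was the only metal whose removal depended on pH.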