Misinterpretations in the Modeling of Contaminant Desorption from Environmental Solids When Equilibrium Conditions Are Not Fully Understood

Abstract
For systems of sediments, soils, and subsurface solids that involve aggregations of fine-grained materials and zones of immobile water, the desorption of organic contaminants is often controlled by aqueous diffusion within the immobile water of a sorption domain. Accurate modeling of such systems requires not only a rational conceptual model for rates, but also a good understanding of the equilibrium condition and of the initial conditions that existed at the onset of desorption. In this work, numerical modeling was used to generate synthetic experimental results for a hypothetical (yet realistic) system in which rates were controlled by sorption-retarded diffusion. Batch sorption and desorption experiments were simulated at a variety of time scales, and the results were interpreted through modeling. The results show that even 50 days can be far too short to obtain equilibrium isotherms, and that the isotherm misinterpretations arising from such short experiments in turn produce incorrect kinetic interpretations and predictions. In particular, 4-week batch sorption/desorption rate experiments could be described well using any of the presumed isotherms, yet these interpretations greatly overpredicted rates of desorption under the longer-term conditions more relevant to field remediation. Additional results were also generated to illustrate quantitatively how misunderstanding of sorption equilibrium and diffusion rate leads to incorrect suppositions of desorption hysteresis.
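
For orientation, a commonly used form of the sorption-retarded diffusion model is sketched below. The abstract does not specify the authors' exact formulation, so the spherical-aggregate geometry and the symbols (aggregate radius a, aqueous diffusivity D_aq, tortuosity tau, intra-aggregate porosity n, solid density rho_s, and a locally linear distribution coefficient K_d) are illustrative assumptions, not the paper's stated model:

\[
\frac{\partial C}{\partial t}
  = \underbrace{\frac{D_{\mathrm{aq}}/\tau}{\,1 + \rho_s K_d (1-n)/n\,}}_{D_{\mathrm{app}} \,=\, D_{\mathrm{aq}}/(\tau R)}
    \,\frac{1}{r^{2}}\frac{\partial}{\partial r}\!\left(r^{2}\frac{\partial C}{\partial r}\right),
  \qquad 0 \le r \le a,
\]

where C(r,t) is the pore-water concentration within the immobile-water domain and the denominator is the retardation factor R produced by local sorption equilibrium. In this kind of formulation, any error in the assumed isotherm (and hence in R) propagates directly into the apparent diffusivity, which is why misjudged equilibrium conditions distort the inferred desorption kinetics.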