Abstract
A numerical model for simulating flows in either looped or dendritic channel networks is presented. The solution procedure solves the full nonlinear, gradually varied, unsteady flow equations using the generalized Newton-Raphson technique. At each iteration, the set of linear equations that yields the flow corrections is stored and solved with a sparse matrix technique, which substantially reduces the required computer storage. The partial derivative terms of the linear flow-correction equations are evaluated by analytic differentiation, which produces significant improvements in the computational efficiency of the model.
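The abstract's core procedure can be illustrated with a minimal sketch: a generalized Newton-Raphson iteration in which the residuals of a nonlinear system are driven to zero, the Jacobian's partial derivatives are supplied analytically, and each set of linear correction equations is solved with a sparse matrix routine. The toy 2x2 system below is a hypothetical stand-in for the actual unsteady flow equations, which the abstract does not reproduce; `scipy` is used here purely for illustration and is not mentioned in the paper.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import spsolve

def newton_sparse(residual, jacobian, x0, tol=1e-10, max_iter=50):
    """Generalized Newton-Raphson: solve F(x) = 0 using an
    analytically supplied Jacobian stored as a sparse matrix."""
    x = x0.astype(float)
    for _ in range(max_iter):
        F = residual(x)
        if np.linalg.norm(F, np.inf) < tol:
            break
        J = csr_matrix(jacobian(x))   # analytic partial derivatives, sparse storage
        dx = spsolve(J, -F)           # sparse linear solve for the flow corrections
        x += dx                       # apply the correction
    return x

# Toy nonlinear system (stand-in for the network flow equations):
#   x0^2 + x1 - 3 = 0
#   x0 + x1^2 - 5 = 0
def F(x):
    return np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])

def J(x):
    # Analytic Jacobian: each entry is a hand-derived partial derivative,
    # avoiding the extra residual evaluations a finite-difference Jacobian needs.
    return np.array([[2.0 * x[0], 1.0],
                     [1.0, 2.0 * x[1]]])

sol = newton_sparse(F, J, np.array([1.0, 1.0]))  # converges to (1, 2)
```

For a real channel network the Jacobian is large but mostly zeros (each node couples only to its neighbors), which is why sparse storage cuts memory so sharply; the analytic derivatives likewise avoid the repeated residual evaluations that numerical differentiation would require.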