A new criterion is introduced for comparing the convergence properties of variable metric algorithms, focusing on stepwise descent properties. This criterion is a bound on the rate of decrease in the function value at each iterative step (single-step convergence rate). Using this criterion as a basis for algorithm development leads to the introduction of variable coefficients to rescale the objective function at each iteration, and, correspondingly, to a new class of variable metric algorithms. Effective scaling can be implemented by restricting the parameters in a two-parameter family of variable metric algorithms. Conditions are derived for these parameters that guarantee monotonic improvement in the single-step convergence rate. These conditions are obtained by analyzing the eigenvalue structure of the associated inverse Hessian approximations.
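To make the class of algorithms concrete, the following is a minimal illustrative sketch of a scaled variable metric update applied to a convex quadratic. The specific formulas are assumptions, not taken from the abstract: the update shown is a scaled inverse-Hessian update from the Broyden family, with the scaling coefficient chosen as γ = sᵀy / yᵀHy and the parameter `phi` selecting the family member (`phi=0` gives a DFP-like update, `phi=1` a BFGS-like one). All variable names are hypothetical.

```python
import numpy as np

def scaled_vm_update(H, s, y, phi=1.0):
    """One scaled Broyden-family inverse-Hessian update (illustrative sketch).

    H   : current inverse-Hessian approximation (SPD)
    s   : step, x_{k+1} - x_k
    y   : gradient change, g_{k+1} - g_k  (curvature s^T y assumed > 0)
    phi : family parameter (0: DFP-like, 1: BFGS-like)
    """
    sy = s @ y                      # s^T y
    Hy = H @ y
    yHy = y @ Hy                    # y^T H y
    gamma = sy / yHy                # scaling coefficient (assumed form)
    v = s / sy - Hy / yHy
    # Scale the "memory" part of the update by gamma; the rank-one
    # correction s s^T / (s^T y) enforces the secant condition H_{k+1} y = s.
    Hbar = H - np.outer(Hy, Hy) / yHy + phi * yHy * np.outer(v, v)
    return gamma * Hbar + np.outer(s, s) / sy

# Usage: minimize f(x) = 0.5 x^T A x - b^T x with exact line searches.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = np.zeros(2)
H = np.eye(2)
for _ in range(20):
    g = A @ x - b
    if np.linalg.norm(g) < 1e-10:
        break
    d = -H @ g
    alpha = -(g @ d) / (d @ A @ d)  # exact line search on the quadratic
    s = alpha * d
    x_new = x + s
    y = (A @ x_new - b) - g
    H = scaled_vm_update(H, s, y)
    x = x_new

print(np.allclose(x, np.linalg.solve(A, b)))
```

Because the scaling multiplies the inverse-Hessian approximation by a positive coefficient, it changes the eigenvalue spectrum of H (the quantity the abstract's conditions constrain) without changing the search direction's sign, so positive definiteness is preserved whenever sᵀy > 0.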