Two Adaptive Techniques Let Progressive Refinement Outperform the Traditional Radiosity Algorithm
- 1 January 1989
- report
- Published by Defense Technical Information Center (DTIC)
Abstract
Two adaptive techniques applied to form factor calculation in the progressive refinement version of radiosity allow it to compute the final converged result more quickly than the traditional, matrix-solution-based radiosity algorithm. The adaptive components of our algorithm are: 1. adaptive subdivision of the hemi-cube/sphere as a function of the delta form factor distribution; 2. adaptive reduction of sampling resolution in proportion to the unshot radiosity in progressive refinement iterations; and 3. switching from a hemi-cube Z-buffer to ray sampling at the appropriate point in the computation to optimize efficiency. The key idea is importance sampling. The reasoning is similar to that used to eliminate unnecessary samples in traditional ray tracing and obtain the path tracing algorithm. The subdivision pattern and resolution of the hemi-cube are determined adaptively to keep the energy carried by each hemi-cube sample constant. This also reduces the error variance of the hemi-cube samples. Experimental evidence suggests that the progressive refinement approach must calculate twice as many form factors as the traditional method before the process has converged. If we assume that form factor calculations consume over 90% of the computation, this means the progressive refinement approach takes 2x as long to calculate the fully converged solution. Use of the above adaptive techniques can yield a 7-fold speedup of the progressive refinement approach, with the same error restrictions on the image, on data sets ranging from a few hundred patches to thousands of patches. Thus the adaptive progressive refinement approach can compute a fully converged image 3-4x faster than the traditional radiosity algorithm.
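To make the first technique concrete, the sketch below illustrates adaptive subdivision driven by the delta form factor distribution. It is not the report's code: it assumes a quadtree split of the hemi-cube top face (the face at height z = 1 spanning [-1, 1] x [-1, 1]), uses the standard top-face delta form factor dF = dA / (pi * (x^2 + y^2 + 1)^2), and introduces a hypothetical threshold `max_df` that caps the energy any single sample may carry.

```c
/* Minimal sketch (not from the report): subdivide the hemi-cube top
 * face until every sample cell carries a delta form factor below a
 * chosen bound, approximating the paper's "constant energy per
 * sample" criterion.  `max_df` and the depth limit are assumptions. */
#include <stdio.h>

#define PI 3.14159265358979323846

/* Delta form factor of a top-face cell of area `da` centred at (x, y). */
static double delta_ff_top(double x, double y, double da)
{
    double r2 = x * x + y * y + 1.0;
    return da / (PI * r2 * r2);
}

/* Recursively split a square cell (centre (cx, cy), half-width `half`)
 * until its delta form factor drops below `max_df` or a depth limit is
 * hit, then emit the resulting sample location. */
static void subdivide(double cx, double cy, double half,
                      double max_df, int depth)
{
    double da = 4.0 * half * half;
    double df = delta_ff_top(cx, cy, da);
    if (df <= max_df || depth >= 6) {
        printf("sample (%+.3f, %+.3f)  dF = %.6f\n", cx, cy, df);
        return;
    }
    half *= 0.5;                     /* quadtree split into 4 children */
    subdivide(cx - half, cy - half, half, max_df, depth + 1);
    subdivide(cx + half, cy - half, half, max_df, depth + 1);
    subdivide(cx - half, cy + half, half, max_df, depth + 1);
    subdivide(cx + half, cy + half, half, max_df, depth + 1);
}

int main(void)
{
    /* Start from a coarse 2x2 grid over the top face and refine. */
    subdivide(-0.5, -0.5, 0.5, 0.002, 0);
    subdivide( 0.5, -0.5, 0.5, 0.002, 0);
    subdivide(-0.5,  0.5, 0.5, 0.002, 0);
    subdivide( 0.5,  0.5, 0.5, 0.002, 0);
    return 0;
}
```

Because dF peaks directly above the hemi-cube apex (x = y = 0), cells near the centre split more deeply than cells near the rim, which is exactly the nonuniform subdivision pattern the abstract describes; equalizing the energy per sample in this way is also what reduces the variance of the hemi-cube estimate.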