Time-Varying Surface Appearance: Acquisition, Modeling and Rendering

In this project, we take a significant step towards measuring, modeling and rendering time-varying surface appearance. Traditional computer graphics rendering generally assumes that the appearance of surfaces remains static over time. Yet, a number of natural processes cause surface appearance to vary dramatically, such as the burning of wood, the wetting and drying of rock and fabric, the decay of fruit skins, or the corrosion and rusting of steel and copper. Our research focuses on these time-varying surface appearance phenomena. For acquisition, we built the first time-varying surface appearance database, consisting of 26 samples that cover a variety of natural processes such as burning, drying on smooth and rough surfaces, decay, and corrosion. We also proposed a novel Space-Time Appearance Factorization (STAF) model, which factors apart space- and time-varying effects and thus gives us much greater control over, and ability to edit, the original data. The STAF model consists of an overall temporal appearance variation characteristic of the specific process, together with space-dependent textures, rates, and offsets that control how quickly different spatial locations evolve, giving rise to spatial patterns on the surface over time. Experimental results show that the model represents a variety of phenomena accurately. Moreover, it enables a number of novel rendering applications, such as transfer of the time-varying effect to a new static surface, control to accelerate time evolution in certain areas, extrapolation beyond the acquired sequence, and texture synthesis of time-varying appearance.
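
As a concrete sketch of this factorization, the Python snippet below evaluates a STAF-style model in which each pixel's appearance parameter is an affine transform, in both value and time, of a single temporal characteristic curve. The warped-time form R*(t - O), the function name, and the array conventions are our illustrative assumptions, not the paper's exact notation.

import numpy as np

def staf_evaluate(f, A, D, R, O, t):
    """Evaluate a STAF-style model at a global time t.

    The per-pixel appearance parameter is an affine transform, in both value
    and time, of one overall temporal characteristic curve f:

        p(x, y, t) = A(x, y) * f( R(x, y) * (t - O(x, y)) ) + D(x, y)

    The warped-time form R * (t - O) is an assumption made for illustration;
    consult the paper for the exact parameterization.

    f          : vectorized callable, the temporal characteristic curve
    A, D, R, O : 2D arrays (texture, offset texture, rate map, time-offset map)
    t          : scalar global time
    """
    t_warped = R * (t - O)        # per-pixel warped time t'
    return A * f(t_warped) + D    # affine transform in value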


Paper

"Time-varying Surface Appearance: Acquisition, Modeling, and Rendering"
J. Gu, C. Tu, R. Ramamoorthi, P. Belhumeur, W. Matusik and S. K. Nayar,
To appear in ACM SIGGRAPH 2006. PDF


Images

     DoTSAF: Database of Time-Varying Surface Appearance and Factorization
The database includes 26 samples, covering a range of phenomena including burning, corrosion, drying on smooth and rough surfaces, and decay. We also include the source code used to process the captured data and estimate the STAF model. For each sample, there are three parts of data to download: 6 of the 1280 measurements of the raw data at each time step, the fitted BRDF parameters, and the estimated STAF results. The total size of this database is around 3 GB. The complete raw data (all 1280 measurements at each time step) is about 1000 GB, which is too large for us to release in its entirety. If you really need it, please send an e-mail to staf@cs.columbia.edu
     STAF: Space-Time Appearance Factorization
This image shows the STAF representation of the drying wood example. The top panel compares STAF to the original sample for one light and view direction in the acquired dataset, showing the accuracy of the model. The STAF model can also be used for time normalization, wherein we keep the overall appearance changes but eliminate the spatial drying patterns. The bottom panel shows the estimated "temporal characteristic curves" for both diffuse and specular parameters. We also show visualizations of the spatial diffuse "textures" A, D, R, and O, along with the normalized initial frame A*f(0)+D and final frame A*f(1)+D. Please refer to the paper for more details.
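
The time normalization mentioned above can be sketched as follows: render the sequence with the spatial textures A and D kept, but with the rate and offset maps replaced by constants, so that every pixel follows the characteristic curve in lockstep. The function below is a minimal illustration under the same assumed model form as the sketch above.

import numpy as np

def time_normalized_sequence(f, A, D, times):
    """Render a time-normalized sequence: keep the spatial textures A and D,
    but drive every pixel with the same (unwarped) characteristic curve f,
    i.e. rate R = 1 and offset O = 0 everywhere.  This keeps the overall
    appearance change while removing the spatial drying pattern.

    f     : vectorized callable, the temporal characteristic curve
    A, D  : 2D arrays, the spatial texture and offset maps
    times : 1D array of time samples
    """
    return np.stack([A * f(t) + D for t in times], axis=0)
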
     STAF for More Samples
This image shows the estimated STAF for 5 samples in the database, covering a range of different phenomena; each sample has 3 spatial locations marked. The middle row shows the time-varying curves p(x,y,t) (for the red diffuse component) at spatial locations A, B and C. The curves are quite different for the different points A, B and C. In the bottom row, we align these time-varying curves using the STAF model: the aligned data accurately matches the temporal characteristic curve f(t') computed from all the points on the sample.
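
To illustrate what "aligning" a per-pixel curve to the temporal characteristic curve means, the snippet below performs a naive brute-force search over candidate rates and offsets, solving for the affine coefficients A and D in closed form at each candidate. This is only an illustrative sketch under the assumed model form; it is not the estimation procedure used in the paper.

import numpy as np

def align_pixel_curve(p, f, times, rate_grid, offset_grid):
    """Illustrative brute-force alignment of one pixel's observed curve p(t)
    to a temporal characteristic curve f.  For each candidate rate R and
    offset O on the given grids, the affine coefficients (A, D) are solved
    by least squares; the best-fitting (A, D, R, O) is kept.

    This is NOT the estimation algorithm of the paper -- only a naive sketch
    of what aligning the curves to f(t') means.

    p           : 1D array of observed parameter values at `times`
    f           : vectorized callable, the temporal characteristic curve
    times       : 1D array of acquisition times
    rate_grid   : 1D array of candidate rates R (assumed search grid)
    offset_grid : 1D array of candidate offsets O (assumed search grid)
    """
    best, best_err = None, np.inf
    for R in rate_grid:
        for O in offset_grid:
            g = f(R * (times - O))                     # warped curve f(t')
            X = np.column_stack([g, np.ones_like(g)])  # design matrix for A, D
            coef, _, _, _ = np.linalg.lstsq(X, p, rcond=None)
            err = np.sum((X @ coef - p) ** 2)
            if err < best_err:
                best_err, best = err, (coef[0], coef[1], R, O)
    return best   # (A, D, R, O)
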
     Rendering Examples
This image shows various rendering examples from the paper; videos of these examples can be found in the Video section below. The STAF model lets us manipulate the acquired data with much more flexibility. We can synthesize time-varying textures by synthesizing the initial and final frames with standard 2D texture synthesis methods. We can extrapolate in time somewhat beyond the original data. We can control the rate and offset terms according to physical principles or other predefined rules. We can also transfer the A and D terms to new materials to create time-varying effects.
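
As a very rough sketch of the transfer application, the snippet below reuses a learned characteristic curve and rate/offset maps while substituting spatial textures taken from a new surface. How those textures are actually derived from a single image of the target material is not shown here and would follow the paper's transfer procedure; everything in this snippet is an illustrative assumption.

import numpy as np

def transfer_effect(f, R, O, A_new, D_new, times):
    """Very simplified sketch of 'transfer': reuse the temporal characteristic
    curve f and the spatial rate/offset maps (R, O) estimated from an acquired
    sample, but substitute spatial textures (A_new, D_new) derived from a new,
    static surface.  Obtaining A_new and D_new from a single image of the
    target surface is not shown here; see the paper for the actual procedure.

    f            : vectorized callable, temporal characteristic curve
    R, O         : 2D arrays, per-pixel rate and offset maps
    A_new, D_new : 2D arrays, textures for the new surface
    times        : 1D array of time samples to render
    """
    return np.stack([A_new * f(R * (t - O)) + D_new for t in times], axis=0)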


Video

     SIGGRAPH 2006 Video (*.avi, 45 MB)
This video shows the pipeline of our work on time-varying surface appearance, including acquisition, modeling, and rendering. A higher quality *.mov version of this video can be downloaded here (64 MB).
     Some Samples in the Database (*.avi, 20 MB)
This video shows the acquired data of 6 samples in our database under a fixed lighting and viewing direction (light 80, view 07).
     Rendering Examples (*.avi, 9 MB)
This video shows 5 rendering examples from the paper, including texture synthesis, time extrapolation, control, and transfer. A higher quality *.mov version of the video can be downloaded here (48 MB).
     More Synthesis Examples (*.avi, 4 MB)
This video shows texture synthesis examples for the drying rock and the drying wood, which show the drying patterns more clearly.