There has been some discussion recently regarding how id Software's Quake achieves the wavy "underwater" effect. This page illustrates a number of ways to make wavy images.
For starters, here is an example of the underwater effect:
This image is a brightened version of a Quake screen shot. Quake did the warping, I did the brightening. You can find this image, and more like it, at http://www.stomped.com/, the site where I got the two Quake screen shots for this web page.
Lots of people have offered their best guesses at how Quake handles the "underwater" effect. Some people have talked about shifting scanlines in a sinusoidal pattern, either vertically, horizontally, or both. Others have discussed using lookup tables to do the mapping, with the mapping varying depending on who you talk to. Below, I illustrate a number of mappings, some of which correspond to shifting scanlines, some of which would require two-dimensional lookup tables to be efficient. I start with two source images and then I warp them in a number of ways. You be the judge of which looks closest to the actual Quake image, above. At the end, I explain how I think Quake creates the wave effect.
Here are the two source images which I use as test cases for image warping. The left image is a simple checkerboard pattern, and the right image is a brightened version of an un-warped Quake screen shot:
I call the first method of warping the images "Horizontal and Vertical Stretching". The name is descriptive of the kind of warping visible in the resulting image. This first method of warping, which was recently posted to net news, uses the following source-to-destination mapping:
    xo = round(8.0 * sin(2.0 * PI * x / 128.0));
    yo = round(8.0 * cos(2.0 * PI * y / 128.0));
    dest[y][x] = src[(y + yo + rows) % rows][(x + xo + cols) % cols];
With this method, the x offset xo depends on the x source coordinate x, and the y offset yo depends on the y source coordinate y. This has the effect of stretching and squashing the source images both horizontally and vertically. Note, however, that vertical and horizontal lines remain straight. The stretched and squashed areas in the warped images are arranged on a checkered grid with a spacing of 128 pixels.
Here are the results for the first method:
In retrospect, I should have used a finer checkerboard pattern for the left image. The vertical stretching is subtle, and the horizontal stretching can't be seen because stretching and squashing is done at the boundaries between checkerboard squares.
I call the second method of warping the images "Two-Way Sinusoidal Warping". This method uses the following source-to-destination mapping:
    xo = round(8.0 * sin(2.0 * PI * y / 128.0));
    yo = round(8.0 * cos(2.0 * PI * x / 128.0));
    dest[y][x] = src[(y + yo + rows) % rows][(x + xo + cols) % cols];
This method differs from the previous one in that the x offset xo depends on the y source coordinate y, not the x source coordinate x, and likewise the y offset yo depends on the x source coordinate x. This has the effect of warping the images horizontally and vertically in a sinusoidal pattern, with horizontal and vertical lines becoming sinusoids with periods of 128 pixels.
Here are the results of the second method:
Note that these images look suspiciously similar to actual warped Quake images...
A third method of warping images, which I call "One-Way Sinusoidal Warping", is similar to the second method; however, this third method only tweaks the images in one direction. This method can be carried out by shifting scanlines by varying amounts. Here is the mapping which only shifts images horizontally:
    xo = round(8.0 * sin(2.0 * PI * y / 128.0));
    dest[y][x] = src[y][(x + xo + cols) % cols];
This mapping corresponds to shifting horizontal scanlines left and right. Using it, I generated the following images:
Here is the mapping which only shifts images vertically:
    yo = round(8.0 * cos(2.0 * PI * x / 128.0));
    dest[y][x] = src[(y + yo + rows) % rows][x];
This mapping corresponds to shifting vertical scanlines up and down. The following images were generated:
Quake definitely shifts the image both horizontally and vertically, similar to Method II, above. I presume that Quake uses sinusoidal offsets stored in a lookup table. I haven't been able to determine whether Quake does the warping in one pass or in two; either would be possible. My best guess is that the process uses two steps. Using two steps is simpler, and it doesn't take much extra time if the engine draws to an off-screen buffer before it dumps the pixels to the screen. Offsetting in one direction is performed when drawing scanlines to the off-screen buffer; offsetting in the other direction is performed when dumping lines from the off-screen buffer to the on-screen buffer.
The total amount of offset used by Quake is small, and thus the extra number of pixels required is small. I estimate that, for a 320 by 200 screen, the width of the border of "extra pixels" required at the edges of the screen is around two pixels. At most, a border of four extra pixels is required.
The animation of the wave effect is done by adding a time term to the sinusoids. Here are the resulting equations for the offsets:
    xo = round(H_AMPL * sin(2.0 * PI * (y / V_PERIOD + t / T_PERIOD)));
    yo = round(V_AMPL * cos(2.0 * PI * (x / H_PERIOD + t / T_PERIOD)));
In the above equations, I have labelled the horizontal and vertical amplitudes as H_AMPL and V_AMPL, the horizontal and vertical (spatial) repetition periods as H_PERIOD and V_PERIOD, and the time (temporal) repetition period as T_PERIOD.
Note that applying a time-varying (increasing or decreasing) offset to an index used to reference a lookup table of sinusoidal offsets gives the same effect as including a time term in the actual sinusoidal equations.
If you have any comments on this stuff, or if you want to discuss other techniques used in 3D game engines, feel free to send me mail at the address below...
Last updated by Kekoa Proudfoot on March 25, 1996.