When writing computer programs of various kinds, a naive implementation sometimes repeats the same calculations many times, which can have a dramatic effect on speed. One frequently used approach is therefore to trade extra memory for faster runtimes: certain values are calculated ahead of time and the results are stored in arrays or hash tables.
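As a minimal sketch of the store-and-reuse idea, here is a memoized recursive function. The Fibonacci function is an illustrative choice not taken from the text above: a naive recursive version recomputes the same subproblems exponentially many times, while caching each result in a hash table makes the work linear.

```python
from functools import lru_cache

# lru_cache stores each (argument -> result) pair in a hash table,
# so every distinct subproblem is computed only once.
@lru_cache(maxsize=None)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))  # fast; without the cache this call is noticeably slow
```

The same trade-off can be made by hand with a plain dictionary; `lru_cache` simply automates the bookkeeping.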
The classic example of this is in computer graphics, where high-accuracy values for sine and cosine are not always needed, but calculating trig functions is generally very slow. So sine and cosine values are precomputed (either at the start of the program or, more commonly, stored in massive include files) to, say, one degree or one tenth of a degree of accuracy over the range 0 to 359 degrees. To evaluate the function, a table lookup is performed; or, if more accuracy is needed, values can be interpolated between consecutive entries in the table. Another example is lightmaps.
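The table-based sine and cosine scheme described above can be sketched as follows. This is a minimal illustration, assuming a tenth-of-a-degree resolution (3600 entries) and linear interpolation between neighbouring entries; the names are hypothetical.

```python
import math

STEP = 0.1  # table resolution in degrees
# Precompute sin for 0.0 .. 359.9 degrees (3600 entries).
SINE_TABLE = [math.sin(math.radians(i * STEP)) for i in range(3600)]

def fast_sin(degrees):
    """Nearest-entry lookup, accurate only to the table's resolution."""
    index = int(round(degrees / STEP)) % 3600
    return SINE_TABLE[index]

def fast_sin_lerp(degrees):
    """Linear interpolation between consecutive table entries."""
    pos = (degrees / STEP) % 3600
    i = int(pos)
    frac = pos - i
    a = SINE_TABLE[i]
    b = SINE_TABLE[(i + 1) % 3600]
    return a + (b - a) * frac

def fast_cos(degrees):
    # Cosine reuses the same table via cos(x) = sin(x + 90 degrees).
    return fast_sin(degrees + 90.0)
```

The table costs a few kilobytes of memory but replaces each trig call with an array index (and, for the interpolated version, one multiply and two adds), which was a decisive win on hardware without fast floating-point trig.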