In statistical or mathematical terms, the jackknife is a method of determining the behaviour of a model or a distribution through simulation. It is one of the Monte Carlo methods of simulation, along with permutation and bootstrapping.

Using a real data series, jackknifing allows a researcher to create a large number of hypothetical cases by resampling the data, eliminating one (or more) case at a time. It is used principally to determine the effect that outliers or aberrant cases have on the results of a study. The fundamental procedure is as follows:

- Identify the data series of interest. This data series will have *n* observations.
- Calculate the parameter(s) of interest on the original data.
- Eliminate the *1*^{st} observation from the data series.
- Calculate the parameter(s) of interest on the reduced data set.
- Repeat steps **3-4** for the *2*^{nd} to *n*^{th} observation.
- If desired, repeat steps **3-5**, eliminating more than one case at a time.
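The steps above can be sketched in a few lines of Python. This is a minimal illustration, not a statistics-library implementation: the data series and the choice of the sample mean as the parameter of interest are hypothetical, chosen to show how a single outlier stands out when its leave-one-out estimate is compared with the full-sample estimate.

```python
import statistics

def jackknife(data, estimator):
    """Recompute the estimator with each observation left out in turn
    (steps 3-5 above), returning the list of leave-one-out estimates."""
    return [estimator(data[:i] + data[i + 1:]) for i in range(len(data))]

# Hypothetical data series containing one aberrant case (12.5).
series = [4.1, 3.9, 4.3, 4.0, 12.5, 4.2]

full_estimate = statistics.mean(series)          # step 2: full-sample estimate
loo_estimates = jackknife(series, statistics.mean)  # steps 3-5

# The estimate that differs most from the rest was computed with the
# outlier removed, revealing how strongly that case drives the result.
for i, est in enumerate(loo_estimates, start=1):
    print(f"without observation {i}: {est:.3f} (full sample: {full_estimate:.3f})")
```

Swapping `statistics.mean` for any other function of the data (a median, a variance, a regression coefficient) applies the same procedure to a different parameter of interest.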

This method of simulation is particularly useful when the data being studied do not conform to typical distributions, or may contain a lot of uncertainty.

**Update:** ariels tells me that this is called leave-one-out cross-validation by computer scientists, but that the procedure is not quite the same. Leave-one-out cross-validation is used to estimate error probabilities, while jackknifing can be used to estimate many different parameters or even distributions.