Logistic Regression (LOGIT) is a statistical technique for the analysis of dichotomous dependent variables. It is largely analogous to Ordinary Least Squares regression (OLS), the critical difference being the distribution of the dependent variable.

Because the dependent variable is dichotomous, OLS can yield results that are outside the range of the values of the dependent variable. Example: the dependent variable is yes or no, operationalized as 1 and 0. OLS can generate predicted values greater than 1 or less than 0, which makes no sense. In order to analyze this kind of data, LOGIT converts the dependent variable into a variable which is ln(p/(1-p)), that is, the log of the odds that the variable takes the value 1. The beta coefficients, which show the relationship between the independent and dependent variables, are directly interpreted as the change in the log of the odds given a one-unit change in the independent variable. Using some rather simple mathematical transformations, one can then generate the probability of a given case achieving the value of 1.
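To make the transformation concrete, here is a minimal sketch in Python. The function names (logit, inv_logit) are my own labels, not standard library calls; the math is just the log-odds transformation described above and its inverse.

```python
import math

def logit(p):
    # the log of the odds: ln(p / (1 - p))
    return math.log(p / (1.0 - p))

def inv_logit(log_odds):
    # the "simple mathematical transformation" back to a probability:
    # p = 1 / (1 + e^(-log_odds)), always strictly between 0 and 1
    return 1.0 / (1.0 + math.exp(-log_odds))

# a probability of 0.8 corresponds to odds of 4:1, log-odds ln(4)
lo = logit(0.8)
p = inv_logit(lo)  # recovers 0.8
```

Note that inv_logit can never produce a value outside (0, 1), which is exactly why LOGIT avoids the out-of-range predictions that OLS can give.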

The formula for LOGIT looks like this:

ln(p/(1-p)) = b1x1 + b2x2 + b3x3 + ... + bnxn + e, where bi is the coefficient associated with independent variable xi, and e is the error term.
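As a rough illustration of how the coefficients in that formula might be estimated, here is a toy sketch that fits a one-predictor logit model by gradient ascent on the log-likelihood. The data, the function name fit_logit, and the learning-rate settings are all invented for the example; real analyses would use a statistics package, and this sketch also adds an intercept term (b0) as is conventional.

```python
import math

def sigmoid(z):
    # inverse of the logit: converts log-odds back to a probability
    return 1.0 / (1.0 + math.exp(-z))

def fit_logit(xs, ys, lr=0.1, epochs=2000):
    """Fit ln(p/(1-p)) = b0 + b1*x by gradient ascent on the log-likelihood."""
    b0, b1 = 0.0, 0.0
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            err = y - sigmoid(b0 + b1 * x)  # residual on the probability scale
            g0 += err
            g1 += err * x
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

# toy data: the outcome (1) becomes more likely as x grows
xs = [0, 1, 2, 3, 4, 5, 6, 7]
ys = [0, 0, 0, 1, 0, 1, 1, 1]
b0, b1 = fit_logit(xs, ys)
prob_at_7 = sigmoid(b0 + b1 * 7)  # predicted probability of a 1 when x = 7
```

Here b1 is read exactly as described above: the change in the log of the odds for a one-unit increase in x, which sigmoid then converts into a probability for any given case.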

Respectfully,

Dogboy.