A mechanical device invented by 20th-century psychologist B.F. Skinner, most often constructed by first-year psychology majors as part of a project to demonstrate the core principles of behaviorism.
In its simplest form, a skinner box is just a fully-enclosed cage, which for the duration of the experiment will serve as the happy/hellish home for a single subject (a small rodent or bird). On one side of the box, a small feeding apparatus is installed, which will provide a positive reinforcer in the form of a food pellet. Alternately, the cage floor can be wired to deliver a small electric shock, which works as a punisher (or, if the animal's behavior can switch the shock off, as a negative reinforcer), or the box may be designed to use both methods. Lastly, several simple levers and multi-colored lightbulbs are placed at one end of the cage.
Now comes the tricky part. A sly professor will make the assigned experiment as open-ended as possible, e.g. "Here is a skinner box. Here is a hamster. I'll see you next week." For the sake of this node, let's say that the box contains three levers which activate three different colored lights, and a large lamp placed above the cage.
The object of the exercise is to train the hamster to hit the blue lever every time you turn on the lamp. What do you do?
Well, you have two choices really. One of the important lessons of this whole exercise is to teach you the difference between Pavlov's "classical conditioning" and Skinner's "operant conditioning", and to prove which method is more effective*. (Here's a hint: they named the box after him for a reason.)
Using classical conditioning, the experiment breaks down into a long game of trial-and-error. With enough patience, this approach can still work. After all, if you turn on the lamp, sooner or later the hamster might bump into the right lever and get the reward, gradually creating a conditioned association between the neutral stimulus (the lamp) and the unconditioned stimulus (the food). As you can guess, though, this is time-consuming, and can lead to other complications... it's relatively easy to associate either the lamp OR the lever with the food, but trying to link those two things together can be very challenging. Students who try this method may end up with a hamster that compulsively taps the lever at all hours of the day or night, or a rather obese and sullen rodent that just lies around all day, springing into chaotic action only when the lamp flips on.
A more clever character, however, would have actually read those assigned chapters in the textbook, and discovered the joys of operant conditioning. In this case, the student would flip on the lamp, and then watch closely for a few seconds. If the hamster were to move even an inch in the direction of the right lever, a food pellet would be dropped into the center of the cage and the lamp turned back off. Allow the hamster to finish the pellet. Count to five. Flip the light back on. If he veers in the right direction again, reward him again. If not, flip the light off, count to five and try again. After a few successful runs, only provide food when the rodent goes two inches toward the right lever, and then four inches, and so on... this technique is called the Successive-Approximations Method, or "shaping", and is highly effective. In a surprisingly short period of time, your hamster will be conditioned to the operant response, and be whizzing around your skinner box like clockwork.
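If it helps to see the shaping procedure as a loop, here's a toy simulation of it in Python. Everything in it — the random-walk "hamster", the learning rate, the raise-the-bar-after-three-successes schedule — is invented for illustration and isn't part of the actual assignment; it just mirrors the reward-then-tighten-the-criterion rhythm described above.

```python
import random

def run_shaping(trials=200, target=5.0, seed=1):
    """Toy simulation of shaping (successive approximations).

    A simulated 'hamster' takes a random step each trial. Steps that
    meet the current criterion (inches toward the correct lever) earn
    a pellet, which nudges future steps in that direction. After a few
    successes at a given criterion, the criterion is raised, just as
    the write-up describes. All numbers here are made up.
    """
    rng = random.Random(seed)
    criterion = 0.5   # inches toward the lever required for a pellet
    bias = 0.0        # learned tendency to head toward the lever
    streak = 0        # consecutive successes at the current criterion
    rewards = 0
    for _ in range(trials):
        # each trial: random wandering plus whatever has been learned
        step = rng.uniform(-1.0, 1.0) + bias
        if step >= criterion:
            rewards += 1
            bias += 0.25  # operant conditioning: reinforced behavior strengthens
            streak += 1
            if streak >= 3 and criterion < target:
                criterion = min(criterion + 0.5, target)  # raise the bar
                streak = 0
    return rewards, criterion

rewards, final_criterion = run_shaping()
```

Note the key design choice, which is the whole point of shaping: the criterion starts easy enough that rewards come quickly, and only tightens once the behavior at the current level is reliable.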
To further impress your instructor, try rearranging the levers and training your rodent to recognize some other discriminative stimulus, instead of just walking in the trained direction. Trying to condition it to recognize color is probably futile; IIRC, hamsters are color-blind. However, you could wire the levers to create different noises, training the rodent by sound. There have been advanced projects where a parakeet was trained to play Mary Had A Little Lamb on a miniature piano, using only a skinner box and the methods of operant conditioning.
* to clarify: by "more effective", I simply mean "more effective at getting mice to press levers". Don't take this to mean that the operant method is more effective in every way.
As corst has just now explained to me, while Skinner's techniques prove more useful in producing completely new behaviors (that is, mice do not normally go about pressing levers), Pavlovian conditioning is still very useful in other instances; specifically, when the experiment involves rewiring a reflexive behavior that the subject already exhibits on other occasions (e.g. dogs drool, birds peck, etc.).