A mechanical device invented by 20th-century psychologist B.F. Skinner, most often constructed by first-year psychology majors as part of a project to demonstrate the core principles of behaviorism.

Construction

In its simplest form, a Skinner box is just a fully-enclosed cage, which for the duration of the experiment will serve as the happy/hellish home for a single subject (a small rodent or bird). On one side of the box, a small feeding apparatus is installed, which will provide a positive reinforcer in the form of a food pellet. Alternatively, the cage can be wired to deliver a small electric shock, which serves as a punisher (or as a negative reinforcer, if the animal can act to switch the shock off), or the box may be designed to use both methods. Lastly, several simple levers and multi-colored lightbulbs are placed at one end of the cage.

Operation

Now comes the tricky part. A sly professor will make the assigned experiment as open-ended as possible, i.e. "Here is a Skinner box. Here is a hamster. I'll see you next week." For the sake of this node, let's say that the box contains three levers which activate three different colored lights, and a large lamp placed above the cage.

The object of the exercise is to train the hamster to hit the blue lever every time you turn on the lamp. What do you do?

Well, you have two choices really. One of the important lessons of this whole exercise is to teach you the difference between Pavlov's "classical conditioning" and Skinner's "operant conditioning", and to prove which method is more effective*. (Here's a hint: they named the box after him for a reason.)

Using classical conditioning, the experiment breaks down into a long game of trial-and-error. With enough patience, this approach can still work. After all, if you turn on the lamp, sooner or later the hamster might bump into the right lever and get the reward, gradually creating a conditioned association between the neutral stimulus (the lamp) and the unconditioned stimulus (the food). As you can guess, though, this is time-consuming, and can lead to other complications... it's relatively easy to associate either the lamp OR the lever with the food, but trying to link those two things together can be very challenging. Students who try this method may end up with a hamster that compulsively taps the lever at all hours of the day or night, or a rather obese and sullen rodent that just lies around all day, springing into chaotic action only when the lamp flips on.

A more clever character, however, would have actually read those assigned chapters in the textbook, and discovered the joys of operant conditioning. In this case, the student would flip on the lamp, and then watch closely for a few seconds. If the hamster were to move even an inch in the direction of the right lever, a food pellet would be dropped into the center of the cage and the lamp turned back off. Allow the hamster to finish the pellet. Count to five. Flip the light back on. If he veers in the right direction again, reward him again. If not, flip the light off, count to five, and try again. After a few successful runs, only provide food when the rodent goes two inches toward the right lever, and then four inches, and so on... this technique is called the method of successive approximations, or "shaping", and is highly effective. In a surprisingly short period of time, your hamster will be conditioned to the operant response, and will be whizzing around your Skinner box like clockwork.
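To make the recipe concrete, here is a toy Python sketch of that shaping loop. Nothing in it models a real hamster - the 1-D track, the step probability, and the "learning bump" are all invented for illustration - but it shows the structure of shaping: reward, tighten the criterion, repeat.

    import random

    # Toy shaping loop: a simulated "hamster" wanders along a 1-D track
    # starting at position 0, with the target lever at position 10. Each
    # rewarded trial nudges up the chance that it steps toward the lever,
    # and the reward criterion then tightens by one step - the method of
    # successive approximations in miniature.

    LEVER_POS = 10

    def run_shaping(trials=200, seed=0):
        rng = random.Random(seed)
        p_toward = 0.5    # chance of stepping toward the lever
        criterion = 1     # distance that currently earns a pellet
        successes = 0
        for _ in range(trials):
            pos = 0
            for _ in range(10):  # "lamp on": a few seconds to respond
                pos = max(0, pos + (1 if rng.random() < p_toward else -1))
                if pos >= criterion:
                    successes += 1
                    p_toward = min(0.95, p_toward + 0.05)      # crude "learning"
                    criterion = min(LEVER_POS, criterion + 1)  # demand more
                    break
            # otherwise: lamp off, count to five, try again
        return successes, criterion, p_toward

    print(run_shaping())  # the criterion climbs to 10 as trials succeed

Run it and you can watch the criterion ratchet all the way out to the lever, which is exactly the point of rewarding ever-closer approximations instead of waiting for the full response to occur by accident.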

To further impress your instructor, try rearranging the levers and training your rodent to recognize some other discriminative stimulus, instead of just walking in a trained direction. Trying to condition it to recognize color is probably futile; IIRC, hamsters are color-blind. However, you could wire the levers to produce different noises, and train the rodent by sound. There have been advanced projects where a parakeet was trained to play Mary Had A Little Lamb on a miniature piano, using only a Skinner box and the methods of operant conditioning.



* To clarify: by "more effective", I simply mean "more effective at getting mice to press levers". Don't take this to mean that the operant method is more effective in every way.

As corst has just now explained to me, while Skinner's techniques prove more useful for producing completely new behaviors (that is, mice do not normally go about pressing levers), Pavlovian conditioning is still very useful in other instances; specifically, when the experiment involves rewiring a reflexive behavior that the subject already exhibits on other occasions (e.g., dogs drool, birds peck).

The name 'Skinner box' is a flippant alternative to B.F. Skinner's phrase 'operant chamber.'

Before the box, the prevailing approach to the study of animal behavior was the rat maze, which required the experimenter to move the animal many times; this usually got it excited and messed with the data. Also, the rat couldn't itself set how often it ran the maze. The box is called a free operant apparatus because, unlike those earlier methods, the animal can work the operanda (e.g., the lever) whenever it likes. This technique allows different topics to be treated, and Skinner explored a lot of them, for example pioneering the study of schedules of reinforcement (sketched below).

Another advantage is that the box is relatively standardized, which makes it easier to compare data across different boxes. The box has also been tuned, eliminating many annoying bugs. For example, in early versions the levers were not rounded, which consistently caused the rats to chew on them (and hence to close the switch over and over again). Skinner wasn't interested in studying this chewing, so he rounded the lever.

Yet another advantage of the operant box is that it doesn't require you to code behavior, which takes forever and sometimes yields dubious data - with the operant box you just record responses. (This is also a weakness - like the earlier maze methods, you don't see that much of what is going on.)
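Since schedules of reinforcement come up here, a minimal sketch of two classic ones, fixed ratio and variable ratio, written as per-press pellet decisions. The Python framing and the ratio of 5 are just illustrative assumptions, not anything from Skinner's apparatus.

    import random

    # Two classic reinforcement schedules as tiny decision functions:
    # call one per lever press and it answers "pellet or no pellet?".

    def fixed_ratio(n):
        """FR-n: reward exactly every n-th response."""
        count = 0
        def schedule():
            nonlocal count
            count += 1
            if count == n:
                count = 0
                return True
            return False
        return schedule

    def variable_ratio(mean_n, rng=None):
        """VR-mean_n: reward each response with probability 1/mean_n."""
        rng = rng or random.Random(0)
        def schedule():
            return rng.random() < 1.0 / mean_n
        return schedule

    fr5, vr5 = fixed_ratio(5), variable_ratio(5)
    print([fr5() for _ in range(12)])  # True exactly on presses 5 and 10
    print([vr5() for _ in range(12)])  # True unpredictably, about 1 in 5

The difference matters behaviorally: the unpredictable variable-ratio schedule famously sustains much more persistent responding than the predictable fixed-ratio one.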

But the most important advantage the box has is that it's really convenient. When you are actually running an experiment, you can set up your experiment to run while you're away, put the animals in the boxes and go have lunch instead of standing around and making them nervous. Before we had computers to record the responses, Skinner's invention of the cumulative recorder - a rotating drum with paper on it, and a pen which ticked up the paper every time the animal responded - made data collection an equally unattended process. If you need to collect days of data for hundreds of animals, this is a complete godsend. Too bad that not all kinds of research can be this convenient. Whatever you think of Skinner, he invented some useful lab gadgets, and many of them are part of the Skinner box.
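For the curious, here is a rough Python sketch of the record that drum-and-pen rig produces; the response times and sampling interval are invented for the example.

    # Sketch of a cumulative record: the paper feeds at constant speed,
    # so the x axis is time, and the pen steps up once per response, so
    # the slope of the trace is the response rate.

    def cumulative_record(response_times, duration, dt=1.0):
        """Sample (time, total responses so far) every dt seconds."""
        samples, n, i = [], 0, 0
        events = sorted(response_times)
        t = 0.0
        while t <= duration:
            while i < len(events) and events[i] <= t:
                n += 1  # pen ticks up one notch per press
                i += 1
            samples.append((t, n))
            t += dt
        return samples

    presses = [2, 3, 4, 5, 9, 10, 11, 30, 31, 55]  # a burst, then pauses
    print(cumulative_record(presses, duration=60)[:8])  # steep = fast pressing

Steep stretches of the trace mean rapid responding and flat stretches mean pauses, which is why a single glance at a day's paper tells you so much.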

Is the Skinner box hellish for the rat? Rats get nervous in the open, not in enclosed spaces - let one go and he will dart for shelter. The typical box is plenty big enough for the rat to walk around - it's not as if he's being kept in restraint. Besides, lots of people have pet rats which get morbidly obese in their cages anyway, doing nothing but eating and sitting around. I think playing an odd sort of game with levers is a lot better than that.

For a student not focused on animal behavior, the point of putting an animal in a Skinner box and watching it work is that you can learn something about the animal's behavior by watching and interacting with it. It's much more accurate and detailed than reading some oversimplified textbook treatment. Here you have the thing itself instead of someone's opinion. Even if you are the most skeptical person ever about behaviorism - you think everything they say is full of crap - you can put that to the test. You can see for yourself, and if you make a clever experiment then you can convince other people too. More likely you just get a taste of how much there is to be seen, how much you can do with the method, but also how little you know about why the rat does what it does.

The procedures of "classical conditioning" and "operant conditioning" do completely different things, so whether one works better than the other is not really an answerable question. It's like comparing a hammer and a screwdriver. Classical conditioning is glossed as the animal learning to predict, and is effectively defined physiologically, while operant conditioning is glossed as the animal acting on its environment to produce an effect, and is defined much more broadly, in terms of equivalence classes. (For example, we are concerned with when the rat presses the lever - but not with whether he uses the right or left paw.)

If you think the process is merely trial and error, you should try making a robot which doesn't have any notion of how to press a lever built into it, but which can learn to do so from experience. Operant conditioning is easy for the experimenter because the animal is doing all of the heavy inductive lifting.
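If you want to feel the force of that point, here is a bare-bones "robot" in the spirit of the challenge: it has a menu of motor actions, no built-in idea which one matters, and just a running value estimate per action. Every name and number here is an assumption made up for the example; this is one standard trial-and-error scheme (epsilon-greedy action-value learning), not the box's own mechanism.

    import random

    # Actions that happen to be followed by reward get chosen more often;
    # nothing tells the agent in advance that the lever is special.

    ACTIONS = ["wander", "sniff", "rear_up", "press_lever"]

    def learn(trials=500, epsilon=0.1, alpha=0.1, seed=1):
        rng = random.Random(seed)
        value = {a: 0.0 for a in ACTIONS}
        for _ in range(trials):
            if rng.random() < epsilon:
                action = rng.choice(ACTIONS)          # explore at random
            else:
                action = max(ACTIONS, key=value.get)  # exploit the best guess
            reward = 1.0 if action == "press_lever" else 0.0
            value[action] += alpha * (reward - value[action])
        return value

    print(learn())  # "press_lever" ends up with by far the highest value

Even in this cartoon version, notice how much work is hidden in handing the agent a tidy four-item action menu - the real animal has to carve "press the lever" out of the blooming, buzzing space of everything its body can do.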
