Short for "Basic Artificial Intelligence Routine". A product of Exotrope, Inc.

BAIR was among the first "artificial intelligence" programs to enter the censorware field. BAIR's idea was simple: it analyzed the images themselves instead of the surrounding text. It was supposedly based on an "active information matrix", whatever that is supposed to mean.

Of course, BAIR had no idea which images were "pornographic". Apparently it looked at the percentage of "skin tones" in the picture. It produced false alarms and failed to block the actual pr0n that testers showed it.
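For the curious, here is a minimal Python sketch of how a "percentage of skin tones" heuristic might work. This is purely illustrative: the RGB rule and the blocking threshold below are assumptions, not anything Exotrope documented.

    from PIL import Image

    def skin_fraction(path):
        """Return the fraction of pixels inside a naive 'skin tone' RGB range."""
        img = Image.open(path).convert("RGB")
        pixels = list(img.getdata())

        def looks_like_skin(rgb):
            r, g, b = rgb
            # A commonly cited rule-of-thumb range for skin in RGB space.
            return r > 95 and g > 40 and b > 20 and r > g and r > b and abs(r - g) > 15

        skin = sum(1 for p in pixels if looks_like_skin(p))
        return skin / len(pixels)

    def would_block(path, threshold=0.4):
        # The 0.4 cutoff is a guess; any fixed threshold will flag faces and
        # beach photos while missing low-light or stylized images.
        return skin_fraction(path) > threshold

Which neatly illustrates the problem: a close-up portrait trips a filter like this just as easily as anything actually "pornographic".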

The conclusion - as usual - is that this kind of image-recognition-based porn blocking is never foolproof, because machines typically can't tell whether an image is pornographic. (Don't worry, humans can't always do that either...)

(Sources: Peacefire. See also PORNSweeper.)
