Because they always know more than they should, or more than you did at their age. I'm not saying that you have to tell them about every little inappropriate thing in the book, but at least educate them on the various words and actions that are unacceptable in society. I guess my parents didn't do a very good job of this, considering what happened to me on my first day of kindergarten (see: The best way to start the school year). I had no idea that it was wrong to show someone your middle finger. A similar thing happened when I overheard a movie my parents were watching and picked up the word "fuck," as in "fucking" used as an adjective. My cousin asked me if I wanted to play soccer, and eager to employ my latest advance in vocabulary, I answered, "I don't want to play that fuckin' game!" My cousin's eyes widened as he ran to tell our moms while I sat there thinking, "I didn't say a bad word, did I?" I was soon set straight, after crying my eyes out because I thought Santa wouldn't come since I'd said a horrible word (it was Christmas Eve).

Maybe my parents just assumed that I knew all of this. Who knows? They also gave me an entire sex talk that lasted two hours when I was eleven without actually telling me what sex consisted of. I thought it was something that happened when you made out on a bed, like in a soap opera. Kids have got to know this stuff; otherwise they face alienation in childhood, which can lead to school shootings in adolescence due to the emotional scarring. Luckily I don't scar that easily. :)
