I was thinking about this tonight. I'm in my mid-20s, working on my PhD. When I'm done and get married and have a kid or three, I've been trying to reason out what I'd do about Santa, the Easter Bunny, the Tooth Fairy, etc. for my kids.

On the one hand, I would hate to lie to my children in any way, shape, or form. For the most part my parents were almost always honest and rational with me, and I think I turned out great. On the other hand, what would my children have to go through with their friends and school talking about Santa and presents and all that? Would they feel isolated and different? (They'd get presents from us, not from Santa.) Then again, kids usually find out the truth at a young enough age that it doesn't turn into a trust issue. And stupid America pushes these traditions on us so we buy more stuff; if I lived in a country where they weren't so ingrained in childhood, I wouldn't care.

So what do you guys think? Take something like Santa: they'll probably learn about it from TV, movies, and friends at school anyway. Do you tell them the truth as soon as possible, or do you play along with it? I just want to see what other people think (mainly Americans, since from what I've heard it's different here than elsewhere).