Growing up I never believed in Santa.
Why? Because my parents never told me about Santa... or the Easter Bunny, or the Tooth Fairy. Does this mean I had a deprived childhood, and that my memories of Christmas, Easter, and every other holiday are horrible?
Uh...no. It means my parents didn't lie to me.
Just like I don't plan to lie to my children.
Does that make me a bad Mom?
Will my children grow up to hate me?
Who knows? If I am a bad Mom... I don't think it's because my children will know that I'm the one who buys them the gifts they get, not some fictional guy.
So many parents force their kids to sit on Santa's lap... and the kid is SCREAMING! Then they bribe the kids at Christmas time to be good: "Or Santa won't get you anything." Ummm... hello??? Aren't YOU the parent??? Why is this Santa "helping" raise your blessing??? Only to tell the kids when they're older, "Oh yeah, that guy?? Not real."
I've always felt there's a problem if you tell your child about Santa, some guy they can't see because he lives at the North Pole... and also tell them about God, who lives in Heaven and whom you can't see with your eyes. One day your child will find out that all these years you've lied to them... there IS no Santa. So why would they believe that there is a God?
What are your thoughts? Did you believe in Santa? Will you tell your children there is a Santa?