Originally posted by ziron
religion is about fairy tales that never happened and is ultimately a lie. Education, on the other hand, teaches the real and true.
Just because you don't believe it, why should it be taken away from the people who do? What are we if we aren't a nation 'under God'? A nation under Bush, or whoever else steps in every four to eight years? Hardly. The phrase 'under God' really isn't hurting anyone. You are not forced to say it; stay seated if you feel it is just a flag. You can opt out of saying it if you disagree with it. But if they take it out, what do the people who do believe have?