I don't understand why we need symbols to live out what we believe in. Yes, I know everyone has the right to self-expression in this country, but it seems to supersede any sense of unity in these UNITED States.
Why do some need a flag to say they're proud of being from the south? The north, east and west don't have one. Won't you still be a southerner without a flag? Why insist on something that hurts fellow Americans? Won't you still be free with or without a flag? If it's still the south against the north, then we haven't made any progress in over two hundred years.
I don't need a flag, a bumper sticker, a motto, the name of God in documents or even a church to live out what I believe in - and I certainly don't need a stranger to wish me a Merry Christmas to celebrate Christmas with meaning and joy.
What would happen if everyone stopped fighting about the SYMBOLS of our freedom and beliefs and just enjoyed having them in this country? We keep saying America is so great, but all we do is defend our right to offend each other.
Other countries with much less freedom than we have probably find our petty arguments laughable.
We have the RIGHT to express ourselves - but why is that more important than proving that we are, in fact, UNITED in any way?