So I was on one of Jay's threads and it prompted me to ask the question:
What are your perceptions of the United States? So many times I watch our President on TV running off at the mouth about whatever, and listening to the decisions he has made for our country, I wonder: what does the news say in other countries? What are Canadians, the English, Japanese, Italians, Mexicans, and Spaniards thinking about our country right now?
We are one of the biggest world powers, and I wonder if it will stay that way. You hear so many Americans say they went to France and were treated like poo, and that the French snickered about them while they were visiting their country. Apparently the decisions our leaders make for us define what the world thinks about us.
Even if you are in the US, I would like to hear your thoughts.