Posted by Reid: Have you noticed, or is it just me, that as America has changed we have become a weaker country?
I took some time over the weekend examining America. I came to some not-so-shocking conclusions. First, in the decades from 1900 through 1959, this country put country first. Regardless of ideology, race, sex, etc., we were Americans. We would defend that to the last man. Compare that to now, where we live with dissidents amongst us. Many "citizens" hate this country and what it stands for! Why? As we entered the Vietnam era, our mental approach to the world changed. No longer were we a solid, focused amalgamation of people; we were divided without a way to bridge the gaps. This has done nothing but get worse. Now our ideological gaps appear bigger than the solidarity of Americanism.
Today we are even completely split on our President. I have seen many men elected to this office, and all somehow managed to rally most of the country after the election. We have always had some polar radicals against the party in power, but that fringe was always a very minor group. It appears today that the media and the "inner circle" of the leadership are intent on keeping the country in the dark and "telling" us what our choice is. I am scared to death!
Wake up, people. Look around you at the worst economic times since the Great Depression. Look at the polarization of this once great, proud country. Look at the racial issues; look at the religious issues; look at the political issues. Now stop and remember that we are all Americans. I, for one, think it is high time that we start acting like Americans. Let's get this petty crap behind us and work toward rebuilding this country on its founding principles. Quit fighting about same-sex marriage, etc., and start figuring out how to rebuild our manufacturing; how to become the world's producer again; how to become, once more, the dominant country in the world. If we don't hurry, it may be too late.