Is America really the greatest nation?

Day 1,244, 11:30 | Published in USA | by C.E.O

America is the greatest country on Earth. Or so we're repeatedly told from our very earliest consciousness. In school, by our parents, by our elders in general, by anyone and everyone, from pretty much the time we are born. We recite the Pledge of Allegiance every day in school. We celebrate the Fourth of July as something of an epic triumph. Although I’m not saying I completely disagree, I have for the last decade or so been wondering why. What is it that makes America so great? Whenever I ask this question in America, I can pretty reliably count on it threatening my health and well-being. But I seriously want to know: what is it that makes this country so much better than any other? I earnestly welcome reader responses here, but first, let me expand my thoughts a little.

Of course America is a far preferable place to live than great swathes of the globe. It’s light years beyond starving in a developing country or suffering under the iron rule of a repressive regime. So it’s easy to discount pretty much two thirds of the planet as worse places to live than America. Yet this still doesn’t make America the best in the world. Most of western Europe lives within the same confines of excessive luxury that we do. Freedom, I find, is a common answer as to what makes America shine. Sure, of course, but again, if you’ve been to western Europe, these days they’re pretty much singing the same song. There are nuances, of course; England has stricter sedition laws and therefore a slightly more restricted press, but for day-to-day living, these sorts of differences are often purely academic. We used to be able to say jobs and quality of life, but our unemployment is similar to that of the larger western European countries, and quality of life is highly debatable. My biggest fear, I suppose, is that the real reason Americans consider America great has nothing to do with the present but is wrapped up in a history in which we once were great.

For the 19th and most of the 20th century, America was a haven from the turmoils of Europe and the rest of the world. High unemployment, a rigid caste system, and constant political unrest sent people by the boatload into our harbors. America really was a truly better alternative. In the 20th century, Europe and Asia were devastated by massive wars and faced extended recoveries, so America continued to be an idyllic port in the storm. Eventually, though, Europe recovered from two World Wars and, throughout the ’90s and into the new millennium, produced a staggeringly high quality of life. The populations of countries like Norway, Switzerland, England, Germany, and France enjoyed pretty much everything that America did: governments composed of elected officials, high degrees of technological innovation, social and political stability, and a general existential sense that life is much more than mere survival. On every tangible level, America and Europe were merely different translations of the same thing. Europe tends to pay higher taxes, but America pays in the form of reckless abuse by our less regulated corporate culture. What I’m saying is that amidst all of this, most Americans currently define America as great because of its history and not because of its present.

What America still is, is a great idea. But what does that really mean anymore? Thanks, I suppose, to a combination of the human condition and the law of averages, America’s great idea of freedom and possibility has been compromised by an increasingly rigid class system in which the rich enjoy freedom and the rest of us scrap for the few remaining crumbs. Meanwhile, the rigid class system of old Europe has largely eroded, bringing the freedom and opportunity of its citizenry up to roughly equal with America’s. So in purely present-day terms, America offers no real advantage.

I often get the argument that men and women have fought and died for this country, and that this is what makes America great. If there is a person more appreciative of our armed services than me, I haven’t met them. I deeply value their service and commitment, but I think there is a philosophical misunderstanding here of what our veterans have done. They are defending our home; they aren’t necessarily defending the idea that we are just plain better. A home is absolutely a worthwhile thing to defend. We, after all, value the things we have and would prefer that they weren’t taken away. But why does the fact that we are willing to defend the things we have make those things intrinsically better? All it means is that we like our things.

In a recent survey by the Organisation for Economic Co-operation and Development, the United States didn’t even crack the top ten happiest countries. Northern Europe grabbed the first three spots, and Canada and New Zealand were the only countries in the top ten outside of Europe. Too subjective for you? How about this: America ranks 37th in quality of healthcare according to the World Health Organization, yet we spend the most on it. France, by the way, is number one. The thing is, physical health is a massive determinant of overall happiness, and Americans are physically unhealthy. How about unemployment? Within the developed world, our numbers are mediocre at best. According to the IMF, so far in 2010 we are being beaten by Austria, Canada, Australia, New Zealand, Singapore, and pretty much most industrialized countries. So as far as physical health and fiscal security go, we are clearly not the best. By most quantifiable standards, then, in 2010 America is not the best country to live in. As far as I can tell, we’re only holding on to the belief that we are because we have the most shit, although we don’t even actually have it; we merely possess the most shit, purchased on credit.

I’m not pointing all of this out to denigrate America. I really do just want the question answered: why, if we’ve been led to believe it our whole lives, is America the best? A few years ago I spent time in Southern France and loved it. I desperately wanted to move there. When I told people this, I was crucified as an America hater. Slowly I began to convince myself that I was wrong, that America was a better place to be. But as I look around and see how miserable we all are, as I battle financial setbacks born of corporate greed, a broken healthcare system, and the culture that created it, I’m starting to wonder if I was right the first time. We’re indoctrinated from our youth to believe that America is the best country on Earth, but I’ve yet to hear a good answer to support this. I really do invite any and all readers to post their comments. I want to know if, and why, you think America is the greatest country. I don’t doubt that I’m just missing something here; it wouldn’t be the first time, but I’ve yet to be swayed. So how about it, let’s get this discussion going.