Monday, August 24, 2009

Who Won the Culture Wars?

You keep hearing (a little less now) about how we are a "center-right" country. In 2004, I might have thought we were a "right-wing-nutjob country," having just re-elected a simpleton to four additional years of havoc.

But now that "right"-ness seems to have been just the veneer many of us might have suspected. With a thoroughly new kind of man in the White House and the GOP relegated to hillbilly country, we may be looking at a more accurate picture of a liberal-minded country that had been in hiding.

The culture wars were always a right-wing idea. For those on the other side, the issues were just a matter of common sense: should men and women of all colors and creeds actually be taken seriously? What about the evidence that science lays out, making no promises but delivering results over and over? How about the freedom to love and to marry as you see fit (Prop 8 notwithstanding) without fear of persecution? And then there's music. Funny how even the most reactionary talk-show hosts seem to have a thing for the Devil's Own Noise--rock and roll.

Is there really any question who won the culture wars? Even Sarah Palin felt the need to admit she had a "gay friend" or two. Would this have been remotely thinkable before the 1960s?

Religiosity, bigotry, and stupidity--always on the wrong side of the culture wars--are in retreat. The liberals won long ago but were never given the credit. The evidence is in, and we are not--and have not been for a long time--a center-right country.

Let's declare the culture wars over. We have more important things to do.

--Renaissance