Conservatism. What good has conservatism done for the U.S. in the past 100 years?
Conservatives extol the virtues of being conservative, but I simply don't see any evidence that conservatism has made this country any better. Most of the social improvements in this country are the result of progressives and liberals (women's rights, gay rights, workers' rights, rights for minorities, consumer rights, etc.). So, what has been done to benefit this country in the name of conservatism?
I don't know what the hell they think they're conserving. They're attempting to conserve wealth at the top, but they're going to fuck even that up, because there's no longer an economic situation where they can produce low and sell high. The days of cheap China are long over.