Originally Posted By: the G-man

Some liberals claim to love America, but the only thing they ever do is criticize it. They seem to think that patriotism means ONLY criticism of Bush and/or the direction of the country in general.

Isn't it possible, once in a while, to actually say something positive about the place?


 Originally Posted By: Friendly Neighborhood Ray-man

Maybe if I thought the comments made by Wright were out of line or wrong, I would care.


Guess not.