Originally Posted By: the G-man

Some liberals claim to love America, but the only thing they ever do is criticize it. They seem to think that patriotism means ONLY criticism of Bush and/or the direction of the country in general.

Isn't it possible, once in a while, to actually say something positive about the place?