Friday, August 12, 2011

Does it seem odd that, with Obama in the White House, left-wing Americans have become even more down on the US?

Their vitriol toward the US has become even more rancid; they say horrible things about it at home and abroad, and they are the ones seeking to move overseas. Anyone else find that strange? Or perhaps, considering the hypocrisy of the left, it's to be expected.
