Truth, justice and the American way.
That is what we all used to fight for.
Truth - a person used to be innocent until proven guilty by a court of law in America. Is that still the case? Do we really seek truth? Or, are we blinded by ideology and bound by political correctness?
Justice - what a joke. The head of our Justice Department is in contempt of Congress for breaking the law. The IRS torments political opponents of the president. Is that justice?
The American Way - do we even know what that is anymore? We used to know that evil existed and we could all join together and fight against it. But, we can't even call evil out anymore. It might "offend" someone. Can anyone grasp that the people that liberals and the PC police are so afraid of offending are slaughtering people all over the world?
Yes, I understand that liberals believe that we deserve it because America is inherently bad. That is untrue, of course. And the fact that liberals hate America isn't going to stop terrorists from killing them, because liberals are still Americans who are enjoying complete freedom.
There is a news clip that I keep seeing of men holding up machine guns and cheering, all of their faces covered.
Liberals call this a religion of peace.
What has happened to us?
I want Captain America back.