America as the "Force for Good"

Got this in an email. Thoughts?

Is America the force for good in our world?

Should it be the church instead?

Auschwitz was liberated by military force, not by peaceful means. Is that a good thing? Would peace ever have prevailed on its own?

What does it mean to be American in today's world?

Are we the same country with the same values that we had originally?