Which all circles back to the origin of the discussion: from where, and how, does America claim moral authority?
Not to mention the various questionable policies enacted domestically over the years.
My challenge to people is: putting environmental issues aside, name an era when America was a better place than it is right now. A scary concept, but these are the good old days.