"...a belief that the culture of white conservative Christians is the culture of America. So it follows that if they aren’t the dominant class in the United States, then America isn’t, in their opinion, really America anymore." Exactly right.
Why the Christian Right Is Obsessed With the Collapse of Civilization