I never said it did, so I don't know. It seems to me that for the United States to be thought of as a Christian country, that would have had to be written into the Constitution. I thought the founders went out of their way to separate religion from the state so no one would EVER dare to make that claim.