Christian America and the Kingdom of God

"The idea of the United States as a Christian nation is a powerful, seductive, and potentially destructive theme in American life, culture, and politics. Many fundamentalist and evangelical leaders routinely promote this notion, and millions of Americans simply assume the Christian character of the United States. And yet, as Richard T.