Religion in the United States
Religion in the United States is characterized by a diversity of religious beliefs and practices. A majority of Americans report that religion plays a very important role in their lives, a proportion unusual among developed countries.
- Kevin Kruse, in his book One Nation Under God: How Corporate America Invented Christian America, details how industrialists in the 1930s and 1940s poured money and resources into an effort to silence the social witness of the mainstream church, which was home to many radicals, socialists and proponents of the New Deal. These corporatists promoted and funded a brand of Christianity—which is today dominant—that conflates faith with free enterprise and American exceptionalism. The rich are rich, this creed goes, not because they are greedy or privileged, not because they use their power to their own advantage, not because they oppress the poor and the vulnerable, but because they are blessed. And if we have enough faith, this heretical form of Christianity claims, God will bless the rest of us too. It is an inversion of the central message of the Gospel. You don’t need to spend three years at Harvard Divinity School as I did to figure that out.