I am currently reading this book. The author is opinionated, and certainly not a fundamentalist or evangelist. My own experience is very much Eastern American. He argues that Europe and other Christian cultures do not have the loonies that America has. According to the author, religion is on the decline around the world, and mainline religions are losing their prominence to science and education. He argues that America has more than its share of ignorant religions that cater to the uneducated; the success here of Mormonism and the Jehovah's Witnesses is central to his argument. Lutheran, Presbyterian, Episcopalian, Methodist, and Reformed churches are fading, while Pentecostal and evangelical churches are soaring in numbers.
I don't know enough to evaluate his argument. He characterizes both the English Civil War and the American Civil War as primarily religious wars. He also claims that one's religion is a very strong predictor of party loyalty and, ultimately, of how one votes for public officials. I wonder what other countries would tolerate Mormon beliefs in a leader.
My gut is drawn to his argument. People who believe that women should have no reproductive rights, that the Flood literally happened, and that America is God's own exceptional country are dangerous.
He sees the rise of the Jehovah's Witnesses and the Mormons as a sad sign of the times, one having little to do with actual belief. Rational belief is suspended. Someone who believes in and wears magic underwear is running at the head of a major party's ticket.
I was raised to give great deference to everyone's religious beliefs, a product of a public school education. Reading his book, I am reminded of the many posters here who think religion is bunk.
Part of the problem with that deference and a public school education is that we know very little religious history. I'm not ready to be Bill Maher, but I was surprised by much of what I read. The English Civil War was clearly religious, but I am uncertain about the American Civil War; he states that religious denomination determined which side you were on.