It seems to me that in the U.S. almost anything religious is automatically considered a good thing. Nearly all media go out of their way for religious organizations in ways they rarely would for anything else. For all the whining about religion-bashing that Christians do in this country, I rarely see anything in public that bashes religion any more than it bashes anything or anyone else.
Here's an example I personally had:
A religious friend asked me to loan my really good camping gear to his minister (whom I did not know at all) for a trip. His comment to me was, "Don't worry, he's a minister, so you'll get it all back." I assume this means that if his friend were a plumber or a bus driver, he couldn't offer this guarantee?
A lot of religious organizations seem to attract pedophiles and others who take advantage of their positions in church youth groups to seek out victims. In many instances the church tries to cover up the abuses, yet it seems even these cases can't tarnish the "religious = good, proper, and wonderful" mentality of the masses.
There are tons of other examples, but I'm sure you get the idea.
Personally, I always love the "Black Collar Crimes" section of Freethought Today. It's about the only newspaper that prints clergy crimes.
Is this the case in other countries, or is this an American trait?