I was going to say Judaism, Christianity, and Islam, but since I know little of Judaism and Islam, I decided to stick primarily with Christianity. I have found myself thinking that the values preached in the Bible are not compatible with Western (i.e., American) values. However, it all seems to depend on who you talk to and what he or she believes.
For example, the Jehovah's Witnesses preach a lot about how women are supposed to be submissive to men, yet at the same time they preach that this does not mean women should be seen as inferior to men. Churches do this with many Bible teachings: they either gloss over the troubling ones or avoid them altogether. There's the God who orders genocide and the killing of men, women, and children, and the God who despises murder. Depending on what church you go to, you either protest gays and lesbians and tell them that God hates them, or you say that God loves everyone and everyone is imperfect.
Really, the Bible can be made to say pretty much anything you want it to say.
So what do you all think: are the values being taught by religion compatible with Western values? If so, please explain why; if not, please explain why not.