The Bible is held up by many Christians as the word of God, and as such many Christians base their entire moral framework on the Bible, which they perceive as figuratively penned by God.
So do the posters on this site, whether Christian, atheist, or agnostic, agree that the books that make up the Bible should be held up to the light, and that if they are myths and fiction, this should be exposed?