Whoever said that the Bible was the guide to Christianity?
For example, Jesus never told his followers to rely on the scriptures, nor did he tell the apostles, 'Write these things down for all to know,' or anything of the sort.
Whatever happened to the Christianity that existed before the Bible was even compiled?
Interesting that the Catholic Church compiled the Bible for its own liturgical purposes, and then other people thought, 'Hey, let's base a faith on this Catholic book'... weird.