All religions claim to be the right religion and that their adherents are being led by God; in other words, they are taught that they are God's people, or that they are the "church" so often mentioned in the New Testament. Who teaches them that they are God's people? The leaders, whether that be a group of 12 men, a pastor, a minister, a Pope, or someone else at the head of a church who claims to have received a "calling" to minister.
Some are taught simply to say a prayer to Jesus admitting they are sinners and asking to be forgiven of all their sins, after which they become a "child of God" and claim the "indwelling of the holy spirit." Others are taught to confess their sins to a priest in a confessional, who tells them their sins are forgiven.
Still others are taught that they must become part of an organization headed by Jesus Christ, who reveals all of God's purposes through a body of men, who then dispense them in literature form to the rest of the organization.
All of these groups use the Bible as proof of their claims. And perhaps all these religions have done more harm to the human race than good.
That's my 2 cents.