What is the Proper Function of Religion?
What do you look to religion for? What do you think religion should NOT do or say? Does any religion have a right to make "truth claims"? By that I mean claims that there is only one way of salvation, one particular view of God, and one account of how God deals with humanity.
These are a few of the key questions, and to the last one most people today answer "No." But many others feel strongly that Christianity not only has the right to make such claims but that its claims are true.
What do you believe on this issue? Is it a matter of strong feelings, one way or the other? Does it concern you where friends or family stand on it?