Why do many Christians deny that they practice any sort of "religion," even though the Bible clearly speaks of it as such?
And is this largely a characteristic of American Evangelical and Fundamentalist Christianity, or is it seen in Protestant Christianities throughout the world?