When you hear the word "religion"...
What do you think of? Christian concepts of God, or religion in general? There seems to be a bad habit of associating the words "religion" and "faith" with JUST Christianity. Somebody will say "I'm religious," and the natural instinct seems to be for people to assume the person is Christian. There are many, many religions and faiths and Gods; Christianity isn't the sole owner of those terms. I'm just wondering why people's gut reaction is to assume someone is speaking from a place of Christianity. Is it just statistics taking over?