Where did the idea come from that what is physical does not matter? Where did our culture get the notion that God really wants to rid us of our physical nature, our world, and our bodies in order to truly liberate us? Wherever it comes from, it leads to abuse of our planet and our bodies, and we do it in the name of Christ. Does the Bible warrant us to discard creation care and physical wellness?
Perhaps such ideas are more dangerous than we at first assumed. Perhaps holding them is actually fraternizing with heresy.
Watch this video and see what you think: