Do you feel the impact of Scripture has been minimalized, and do you think it will grow less and less important over time?
There are a lot of posts on here that tend to poke at Christians about the things in the Bible that modern-day Christians tend to ignore: slavery, incest, genocide, rape, murder, child sacrifice, etc.
It's kinda silly to sit here and ask Christians if slavery is ok (although, much to my shock, some actually said yes...that was kinda f***ed up) when there are virtually no Christian slave owners left to parade around the Bible as justification for owning slaves.
I was curious if Christians are willing to say that the impact of scripture has been minimized. By that I mean very few follow the 613 commandments, there are almost no fundamentalists left, and there are extremely few literalists, and they tend to differ on what they believe. The lines between Christian denominations are blurring, and new-age Christianity (megachurches and the like) is rising in popularity.
There is much less importance placed on scripture these days, IMHO, and much more placed on common sense and empathy. There is less emphasis on people needing to look to the Bible for guidance, which could partly explain why so few Christians regularly read the Bible and why atheists have proven to be just as educated, if not more so, about Christian scripture.
So, do you agree that Christianity is changing and the need to rely on scripture for instruction is fading? If yes, is it a bad thing? If no, why?
And if yes, do you think this trend will continue, or do you think it will reverse at some point, with people coming back to making the Bible an important part of their lives?