Yes, the Bible should be taught in our schools because it is necessary to understand the Bible if we are to truly understand our own culture and how it came to be. The Bible has influenced every part of western culture from our art, music, and history, to our sense of fairness, charity, and business.

— media personality Joel Osteen