Our society has rejected a meaningful concept of truth and is growing more hostile to believers. How did this happen? We can trace the West's cultural decline through various influences, including the Enlightenment, theological liberalism, and New Age mysticism. But how should Christians respond when Western culture seems to be collapsing? How can we be salt and light in our world?