A Christian World?

Christians today are not what they were earlier in history. Up to a hundred or so years ago, they represented a dominant, world-changing force that saw the world as Christ’s kingdom, a kingdom not yet fully realized but one that would surely come into its own in the not-too-distant future. They saw themselves as Christ’s emissaries, responsible to Him for making this world a Christian world. They felt the need to do whatever they could, wherever they had influence, to eliminate godless institutions, godless activities, and godless thinking, and to replace them with their God-ordained alternatives. They realized that their individual contributions might be quite minuscule in the grand scheme of things, but they persevered anyway in the hope (and knowledge) that they were not alone, that God had His 7,000 who had never bowed the knee to Baal (Rom. 11:4).

Indeed, they were not alone; they became the dominant cultural driving force in the West and were making great progress throughout the world. Robert Dabney wrote in the late 19th century that Christianity now “walks in her silver slippers.” She was the queen that no one, however disposed in their hearts, dared to speak out against. Those that did were held in contempt by the public and lost any respect they might have had previously. The Christian faith so dominated the culture that, in 1892, the U.S. Supreme Court unanimously stated, “This is a religious people.” The Court went on to conclude, through an extensive study of three hundred years of legal documents, that this is a Christian nation.

Today, the situation in America is almost reversed. Christianity has been excluded from every level of the school system. It was first ignored and later mocked by the great bulk of the entertainment media. Our laws are steadily moving away from their earlier biblical character toward an ever more vulgar humanistic degeneracy. There seems to be a greater stigma against being a Christian than against being an atheist. Why? Why did our culture shift from a strongly Christian to an almost anti-Christian perspective in so short a time? Christians are still in the majority in America, but where they previously were the leaders of the culture, they now follow it, looking for ever-dwindling scraps of recognition from their new masters.

The answer to this question lies in the hearts of Christians themselves. They differ from their predecessors of a century or two ago, but not in their basic faith or even in their desire that people everywhere should believe and worship as they do. The difference lies in their understanding of the responsibilities God has given them. Throughout the Christian era and well into the 19th century, the great majority of Christians believed that the Christian faith would eventually be universal, that it would cover the earth as the waters cover the sea (Isa. 11:9; Hab. 2:14). They saw themselves, in whatever capacity they lived, as God-appointed emissaries who would do their part to bring about this change. They didn’t leave this work to the pastors and missionaries but felt that, in everything they said or did, they represented their Lord, and they were careful to affirm His authority over all things. He was seen as the present Lord of lords and King of kings, not just for one’s personal behavior or just for Christians, but over all of life for all mankind.

This broad application of God’s word and God’s law included civil government. Christians of that day held their representatives, and those in authority at every level of government, responsible for following God’s word. They felt that taking the oath of office with one’s hand on the Bible meant something, and they held officeholders accountable for any deviation, not just from the law, but primarily from what the Bible taught. Laws or court judgments that were seen to contradict God’s law evoked a barrage of letters to Congress, the Supreme Court, and other government agencies, as well as to the newspapers. Those Christians were activists!

All of this, though, is pretty much a thing of the past. Today’s Christians do not feel this sense of responsibility for Christ’s kingdom. They have been told over and over again, in many different ways, that this world is Satan’s world, that he is the god of this world, and that it will continue to deteriorate until Jesus returns. The great majority of evangelical Christian churches have been propagating this falsehood for almost a century, and it has thoroughly penetrated the thinking of Christians throughout the world. Christians go to church each Sunday expecting to hear God’s word, and some of it they do hear; but one thing they rarely hear is any reminder that they are responsible for this world. They may hear of the need to live in obedience to God in all they say and do; they may hear of the need to pray that God would correct, or at least slow, the slide into depravity we see in the schools, the media, and government. But they never hear that it is Christian neglect of the world that has caused it to become so depraved. Unbelief, ever ready to fill the vacuum left behind, was quick to do so and is now well established.

So, Christian, are you upset and disgusted with the ever-increasing godlessness in the world around you? You don’t have very far to look to discover the source of the problem. It is you, yourself! You are to blame for not having obeyed your Lord. Yes, you can say you didn’t know, that you were taught otherwise, and that claim has some validity; but we are all responsible to study God’s word for ourselves. When we fail to do so, we fall prey to two sorts of deceivers: those with evil intent who see the church solely as lambs to be exploited, and the well-meaning but ignorant, those who have themselves been taken in by false interpretations of Scripture.

The seminaries of just about every denomination are graduating thousands of the latter, mostly well-meaning individuals, to be the pastors of America’s churches each year. Pessimism has so thoroughly permeated the seminaries and Bible schools, and most pastors have become so hardened in this wrong view, that there is little hope they will change without an outcry from the Christians in the pews. So, just as you have been the source of America’s cultural decline, so also are you its only hope for recovery. If the vitality of America’s Christians is to be restored in this generation, you must make yourself heard in protest, not only against cultural decline but also against the weakness in the pulpits that has silenced the voices of America’s many Christians. May God bless you as you join together with others like you to make this world a Christian world.