Should Christians seek to influence the world for the better?
For centuries the answer was clear... and it led to the preservation of life... greater liberty and justice for all... improvement in labor and economic conditions... and more.
But more recently the question has become a matter for debate...
Now the emphasis is on nurturing souls... with little hope of improving the condition of the culture at large.
Yet what does the Bible tell us?
Well...
... God created us to have dominion (Gen. 1:28).
... Jesus was given authority over all things in heaven and on earth (Mt. 28:18).
... And He commanded us to disciple the nations (Mt. 28:19).
... We're to pray for those in authority (1 Tim. 2:1-2).
... We're to love our neighbors in tangible ways (Lk. 10:25-37).
... Jesus came to establish His kingdom (Mt. 3:2).
... And this age won't come to an end until all things are subjected to Him (1 Cor. 15:27-28).
This short survey alone makes it pretty clear that salvation is not limited to souls... but includes applying the good news of the gospel to all areas of life.
So... could it be the culture is in the condition it's in because we have so little faith in the power of the gospel to transform it?