Transformation of society by the gospel

I feel uneasy with those who make grand claims about the transforming power of the gospel in society, or grand claims about the corrosive effect on society of rejecting the gospel. When I look at church history, it is not clear that society was vastly better when evangelical Christianity dominated. The sexual revolution caused problems for society, certainly, but the 1940s and 50s had their fair share of social and sexual problems too.

I've been reading a history of evangelicalism. One characteristic of 18th-century evangelicalism was the conviction that if individuals are transformed by the gospel, they will go on to transform society. In practice, my history book argues, it's not clear that this happens.

Wilberforce did help abolish the slave trade, but he remained an aristocrat and a man of his time. Meanwhile, across the Atlantic, evangelical Christianity grew in the southern colonies of America without producing abolition. British evangelicals tended to be monarchists (basing their views on Scripture); Americans tended to be republicans (again, basing their views on Scripture). And so on. The historian argues that 18th-century evangelicalism transformed people within their social context, rather than beyond it.

I think that's probably fair enough. God's word focuses on promoting godliness in the individual, the family and the church. The NT doesn't focus on the transformation of society or the nation. Perhaps we are too confident in our ability to apply the word of God if we assume that we can extrapolate the correct principles to govern our societies, and that we won't be blinded by cultural prejudices. Perhaps we are also too presumptuous about God's promises if we assume that he promises to bless us unambiguously in these endeavours.