The Big Question 59: Has Christianity Oppressed Women?

May 18, 2021

Radio Version:

Has Christianity oppressed women?

Well, the short answer to this question, if you look back through history and even at some things today, is yes. But the longer answer, which is the more thoughtful and accurate one, is that it's not so simple, and that Christianity has actually done far more good for women than harm.

Let’s look at how Christianity started. Of course, Christianity started with Jesus. Jesus was unlike most men in the world of his time. He showed extraordinary kindness and care towards women, and this even included women who were right on the margins of society.

Jesus welcomed women into his community of disciples, right alongside the men. In fact, one of the important reasons why Christianity grew so rapidly is because it was very attractive to women. Christians didn’t make a big issue of this, but even the opponents of the Christian faith saw this very clearly.

In the second century, the philosopher Celsus disparaged Christianity for attracting foolish, lowly individuals with little understanding, such as women. And that shows you something of the attitudes of the time. The reason Christianity attracted so many women is that it did just what Jesus had done.

It elevated women above the stifling ancient patriarchy, in which women were regarded as little more than household possessions. In the ancient world, Christians stood out in contrast with the attitudes of their societies because they emphasised the care of orphans, widows, children, the sick, the poor and those who were imprisoned.

The Christian movement put compassion and charity towards others at its heart. And all of these were things that women, who were often victims themselves, found attractive and liberating. It’s no surprise, then, that in the earliest Christian churches women exercised every function and held every position of authority.

However, in the second and third centuries, things started to change. The church had grown larger and needed more organisation, and the more structure came into the church, the more men took over and the more women were pushed out of the freedom and the roles they had enjoyed before.

But despite this, the essential message of Christianity is still the same today, and its power to elevate and liberate women still remains. And it’s the Christian church that has kept this message alive throughout the centuries. That’s why, despite its mistakes, the message of Christianity has actually been responsible for the greatest changes in our world that have continued progressively to elevate women: things like universal education for both men and women, the development of hospitals and health-care systems, equality for women at the ballot box, and the very idea of human rights. These were all essentially Christian initiatives based on the message of Jesus.

So overall, I reckon that Christianity has been instrumental in freeing women in the world rather than oppressing them.
