Women's Rights

I am often told (and I think our friend Dan Brown tries to make this point) that after the rise of Christianity, women really got the raw end of the deal, and that Christianity was (and is) a patriarchal religion that prided itself on suppressing the role of women in religious and political life.

Just thought of this, though: what was the official and heavily practiced religion at the time we see the first woman emperor in the Roman Empire (she actually referred to herself with the masculine term, as opposed to empress)? You got it, Christianity.

In fact, the Emperor Irene convened the Seventh Ecumenical Council.

And in our "enlightened" times, we Americans - some 1,200 years later - have never elected a woman as head of state. Something to think about.

Comments

Pintradex said…
I remember reading a book about the early Church. It described how a gang of men raped some women in a Roman town. Nothing could be done under Roman law, but the local bishop withheld communion from the entire town until the criminals were punished. For the life of me I cannot remember the title of the book.

Political correctness does not try to eliminate negative stereotypes. It keeps one conveniently on hand for when we need someone to blame for patriarchy, fascism, and intolerance. I believe that an unbiased study of history (the above anecdote probably being one of many examples) will show that the status of women improved dramatically with the spread of Christianity. However, we would not want to offend modern sensibilities by complicating the issue with historical fact.
