As a proud feminist, I have to say it’s been great to see so many TV shows making big steps in the right direction for female representation.
Gone are the damsels in distress and the over-dramatic “crying girls” who sit on the sidelines.
Women on TV these days get stuff done. Plain and simple.
They're leaders and doctors and assassins. They fall in love and have families, they go a little crazy sometimes, and they're not afraid to kick some ass. Basically, they spend their time being intense, complex human beings, which, sadly, has not always been the case for female characterization on television.
These are some of the shows that are letting their feminist flag fly, and I couldn’t be more excited about it.
Did we miss any other fabulous shows that rock the feminist agenda? Tell us about them in the comments below!