Categories

Women: Not Feminists

Modern feminism debunked

The dictionary definition of feminism is simply “believing in and working towards the economic, social, and political equality of the genders,” yet it has gone much further in practice. Modern leftist feminism has become yet another monster in the room.

Feminism has become more about women dominating and controlling men, and it has also become another cause hijacked by people with strong political views. Under the umbrella of feminism, feminists march in the streets demanding that presidents must go, insisting their view of the world is correct, and if you don’t like it, you’d better shut the fuck up because they’re right. But there are all sorts of feminists; some push for equal pay

Continue reading Women: Not Feminists