What is Feminism?

What is feminism? Feminism is a philosophy holding that women should be treated equally to men. It is a push to improve the rights of women, and its role is to ensure equality for everyone regardless of gender. Men and women shouldn't have different rights. Some feminists actually believe that women should have more rights than men. The picture above shows a woman flexing and showing off her muscles. It represents the idea that women are strong just like men are and that "we can do it" too.