According to Merriam-Webster, feminism is defined as: 1) the theory of the political, economic, and social equality of the sexes; 2) organized activity on behalf of women's rights and interests. Cambridge Dictionary defines it as "the belief that women should be allowed the same rights, power, and opportunities as men and be treated in the same way, or the set of activities intended to achieve this state."
After watching and documenting the antics of today's so-called "feminists" over the past few years, I have come to the conclusion that what used to be a fight for equal rights has become a disease in which "victimhood" and "male bashing" define those who call themselves feminists.