I haz questions.
Why is this necessary?
Has #MeToo become so powerful that somehow all men are in danger and now in need of protecting?
Have the playing fields been leveled, and have women been equal to men for generations, with equal treatment and opportunities, and somehow I missed it?
Have we gone for hundreds of years with the trend of sexual harassment being taken seriously by families of victims, friends, confidants, mandatory reporters, doctors, and law enforcement, and I Rip-Van-Winkled it?
Are females in the workplace suddenly safe and respected everywhere? Encouraged in their aspirations with no sexual demands made of them for advancement? Have all the rape kits been tested?
Did I miss the part where women earn the same salaries as men, even the women who are brown and black? And this has been going on for decades so that it is firmly established as a matter of course, and no one questions it?
Have the tables turned, and now women have subjugated men for millennia, controlling their bodies, dreams, and roles in society? Have women systematically limited men's ability to speak for themselves, participate in government, manage their own finances, and walk down the street without a smile on their pretty little faces? And now the men are fighting back against their own oppression because they have suffered for so long?
Am I on the Starship Enterprise?