Saturday, March 17, 2018

The American Man in Crisis

We live in a time when feminists bemoan that America has devolved into a misogynist dystopia like “The Handmaid’s Tale,” and when Hollywood and the political scene are overrun with #MeToo revelations of sexual abuse. Men, guilty of “toxic masculinity,” are cast as villains preying upon women and stripping them of their rights. Yet women are more empowered and “equal” than ever. So the question is: what has become of America’s men, and what will become of them? Nothing good, if current trends continue.

Michael Ian Black, a comedian and actor, recently wrote in a New York Times op-ed:

The past 50 years have redefined what it means to be female in America. Girls today are told that they can do anything, be anyone. They’ve absorbed the message: They’re outperforming boys in school at every level. But it isn’t just about performance. To be a girl today is to be the beneficiary of decades of conversation about the complexities of womanhood, its many forms and expressions.

Boys, though, have been left behind. No commensurate movement has emerged to help them navigate toward a full expression of their gender. It’s no longer enough to “be a man” — we no longer even know what that means.

2 comments:

Anonymous said...

The City of Salisbury is loaded with them.

Anonymous said...

Just one of many ways Democrats are trying to destroy America.