I saw that Hanna Rosin has a new book out entitled The End of Men: And the Rise of Women. I have to say that I really dislike the derogatory titles that the authors of these new books on men seem to find acceptable. Do authors these days ever choose a title that makes men sound good, decent, or even likable? Are there any that don’t include women in the title or refer to how men relate to women? Just asking.
Seriously, titles like Manning Up: How the Rise of Women Has Turned Men into Boys, Save the Males: Why Men Matter, Why Women Should Care, or even Why Boys Fail: Saving Our Sons from an Educational System That’s Leaving Them Behind give readers negative images of men and lead them to believe that men have no agency: that they are not autonomous, independent beings who deserve better, but immature characters who can’t hack it in the current system.
I am sick of these titles and wonder why anyone would buy a book premised on men being failures. Certainly, few men are reading these books; most publishers only want books about men that are aimed at women, and therefore stock the ones that make women feel good and make men look like losers, marketing them to their female customers only.
If you’re male, would you buy a book entitled The End of Men?