I recently watched the documentary “The Mask You Live In” and it’s really got me wanting to explore the idea of masculinity more. Trying to find recommendations only brings up books about how to perform masculinity in a traditional sense, like how to be a protector, alpha-type stuff. I’m looking for books that are more sociology-focused: books that examine Western society’s portrayal of masculinity, its negative effects, and how to address them.
by Username1821