Looking for fiction or nonfiction books that criticize society (particularly modern-ish society), whether blatantly or more subtly.
I’m a sociology major, so I’m looking for all different perspectives. I do have a bias toward critiques of capitalistic/individualistic societies that love money and power more than humanity, so if there are books based around that, I’d love to hear them!
by intothevoid444