I'm looking for books about a dystopian society where its citizens think they're living in a Utopia (except for the MC and a small minority).
And I'm not talking about a False Utopia, where everything seems fine on the surface but something sinister is going on behind the scenes.
Though the citizens in the story perceive themselves as living in a Utopia (perhaps through brainwashing, propaganda, etc.), it's clear to us readers that it's a dystopia.
by zorroelk