Hi everyone!
I’m looking for a book that will help me reconcile my relationship with my job.
I’m currently inundated with office politics and struggling REALLY hard to navigate that landscape while staying true to my integrity and values.
I just read The Gifts of Imperfection by Brené Brown and I agree with a lot of her sentiments around authenticity, but I don’t feel like I can apply them to my daily work life.
Some brief context: I often feel like I’m asking fair and valid questions, intended as feedback and in some cases pushback, but I worry they’re interpreted as negative simply because they challenge the status quo. I’m not looking to leave my job, so I’m not soliciting advice on that, but it’s definitely challenging.
A lot of books in this genre, like Radical Candor, seem to have already bought into the ideals of corporate America. I’m wondering if there’s a book out there that acknowledges the cognitive dissonance required to work in that kind of environment and offers advice on “faking it until you make it.” Does such a thing exist?
by Far-Slice641