I’ve recently come to the realization that I kind of genuinely hate my body (which is wild, to say the least); like most things, I’d like to work on that with a reading list! What are some books that have taught you to live more genuinely, rather than attempting to sell you some product?
Thanks in advance all!
by Tpan101