I’ve been reflecting today on the fact that for most of my life I’ve read books predominantly by Western (American and European) authors. I want to learn more about non-Western cultures through novels because I love stories. I’m not really looking for non-fiction.
Please recommend some good fiction by non-Western authors that portrays their cultures realistically. I would love to read books that touch on day-to-day life, customs, beliefs, mental health, food, medicine, healing methods, absolutely everything I can learn through a novel.
Thank you so much! 💕
by AcademicPreference54
2 Comments
i’m sure there are a lot of western-style tv series based on african culture. i’ve heard that many western viewers approach these books in a variety of ways, including how they portray African-American life. so far, they’ve all been very successful.
I think that [The Good Earth](https://www.goodreads.com/book/show/1078.The_Good_Earth) by Pearl S. Buck fits the description. She was an American, but she grew up in China in the early 20th century.