I started reading a history book about the 20th century, but I could not get past a couple of pages. The description of what the Germans did in present-day Namibia around the late 19th century was so traumatic that I could not stomach the brutality.
This kind of got me thinking: most history is actually about war, plague, natural disasters, rivalries, etc. All traumatic stuff.
Can someone here recommend actual history books that are sweet, interesting, and a pleasure to read, instead of all these dark events? A book that makes the reader hopeful and more optimistic? It can be about any historical period, but I have a particular interest in the Renaissance.
NB: I don't want to read fiction.
by flytohappiness
1 Comment
Here we go, the Germans again… The dominant tribe in Namibia were invaders themselves. In East Africa, the Germans got rid of the slavers, out of principle. But this was after Bismarck.