I have been in the US for about two years now as a grad student, and have not really gotten a chance to develop the kind of perspective on the country that a native-born American would have, I presume. I am quite familiar with American culture (food, music, politics, history, etc.), but I feel like I do not yet have an insider's perspective on the US as a place. What are some cool books (excluding fiction) that you would recommend, books that might inform me about some interesting aspects of the country, geographically and culturally, and, most of all, spur me to go and explore it? Thanks in advance!
by nostalgiadoper