I read Dominion by Tom Holland and I'm working my way through the New Testament alongside Bart Ehrman's New Testament textbook. I've learned a considerable amount about Christian history and the Bible itself, but not as much about what Christians actually believe. Can anyone suggest some books that provide an overview of Christian beliefs? Especially ones that focus on denominations and how they differ.
Reading the Bible has provided some insight, but what's in (or not in) the Bible isn't necessarily what Christians believe, e.g. the Trinity is arguably not in the gospels but is an essential tenet of Christian theology.
by BookooBreadCo
1 Comment
The Complete Guide to Christian Denominations by Ron Rhodes is a popular option. I haven’t read it personally, though.