I saw yesterday this was the question for today, and I am struggling to think of one.
I can't think of any books I have read as an adult that have changed my views. I did read Rebecca Rosen's book on talking to the dead, and for a few weeks I was convinced I had that gift - then I realized I do not. Nor do I really want it.
As a kid, I read all the time - Baby-Sitters Club books were my main jam, but I also read Beverly Cleary, the Sweet Valley Twins books and Judy Blume. I learned everything I know about periods, breasts, kissing, and bras from Judy Blume. So I would say her books have had the most effect on me, and made me realize that being a girl was going to be a pain in the ass for years to come.