I just finished reading Delirium by Lauren Oliver, a popular YA book. It's an interesting look at the future--another sort of dystopian society ravaged by war, reminiscent of The Hunger Games and The Giver. The premise of the book is that when the young adults of the society reach 18, they are "cured" of the disease of love through surgery, before the "deliria" sets in. So you have a society of basically zombies who don't show emotions or feelings, even to their children.
So of course, the main character, Lena, is scheduled to have her surgery soon after she graduates from high school. But before then, she falls in love--or as the book puts it--she becomes "infected" with the disease. Her mother had the same "illness," so all her life, Lena has dreaded it. But the "disease" allows her to see the world in a new light--she realizes colors are brighter, sounds are clearer, and her town looks very ragged and run down. Maybe the government control isn't all it seems.
She makes a visit to The Wilds with Alex and finds a whole society living outside the law. And she begins to understand that everything she has been taught and everything she has believed has basically been a lie. The ending of the book wasn't what I expected. I was hoping for a happier outcome after the sadness of the book--although that wouldn't have really been fitting.
I find it interesting how popular these dystopian society themes are right now. These books are flying off the shelves. Are kids--and adults--feeling things are so out of control in our world right now? Does reading about a worse society make us feel better? Or is it a warning of what people fear will come? I'm not sure--maybe it's just a marketing trend like vampires and werewolves--the latest "hot" topic. I do know I enjoyed the book and will be recommending it to my students!