The Walking Dead reminds me of Cormac McCarthy's The Road and José Saramago's Blindness.
They are all 'Lord of the Flies' type tales of dystopian societies that give me the heebie-jeebies. I alternately feel interest, discomfort, and repugnance while reading these books, especially with regard to how quickly the stories turn to bullying, torture, and rape. Which raises the question: why do I keep reading? I liken it to eating something sour; I enjoy the agitation.
But back to The Walking Dead... the one difference I find between those other stories and The Walking Dead (so far) is that The Walking Dead doesn't seem to sugarcoat the narrative or put a positive spin on it. I really thought it couldn't get any worse, but Robert Kirkman keeps going down that spiral. I dread how this series is going to end, but to be honest, if it didn't end badly, I wouldn't think it true to the narrative. At this point, the only possible ending is a violent one, and like a deer in headlights, I don't think I'll be able to look away.
On a side note, no spoilers, I swear... I knew what was coming in season 3, but I was curious to see how a TV show, even one on a subscription channel with R-rated content, would represent such cruelty and degradation on screen. Hint: they took a decidedly gentler approach.