The Walking Dead and Southern Culture: Changed but Alive

[Image: Characters of The Walking Dead]

by Bailey Ford

“It’s human nature to come together.” I can’t speak to all of humanity, but at times it seems that AMC’s The Walking Dead implies this about Southern culture. While the show rarely hits you over the head with the Southernness of it all, taking a moment to think about the accents of many of the characters, remembering how the show began, and looking at where it is filmed should remind us of a simple fact: the characters and setting of TWD are implicitly Southern. The series follows Rick Grimes, a sheriff’s deputy from a small town just outside Atlanta, in the years after he wakes from a coma to find that the world ended while he slept, consumed by the undead, zombie-like creatures the show calls Walkers. While this story has plenty of Rip Van Winkle undertones, its connections to the South and its culture cannot be denied.

This is a show about survivors, and most of the core cast is Southern. To me, their fight to hold on to both their lives and their humanity calls back to an argument that stretches from the Agrarians’ I’ll Take My Stand to the many Southerners who still share the Agrarians’ fears today: by participating in progress and globalization, the South risks losing its unique identity. In a way, this has come to pass in TWD. The entire world, as far as we know, has become a uniform wasteland, and it seems the cause was a disease born of science. Humanity could have left well enough alone, but its meddling led to catastrophe. Progress, in the end, did kill the South, and everything else along with it.

On the other hand, it seems Southern strength will never die. In the midst of the walking dead, Southern identity, at times, still seems able to run. Whenever the characters visit big cities like Atlanta or Washington, D.C., they are in far more danger than usual; the cities are completely overrun death traps. When they stay in their small communities, living off nature and the land, they often face more danger from other people than from the monsters that rule the rest of the world. By homing in on the very Southern ideology of land and nature ruling over all else, the characters are able to live in a world where most have died. At this point, by living off the land, staying together, and trusting in nature, they have managed to build a mostly safe society. To me, TWD answers the Agrarians’ and many other Southerners’ fears: Southern culture may change, but it will never die.