There's no denying that zombies are everywhere these days. From the popularity of The Walking Dead to Brad Pitt's summer blockbuster World War Z, you'd have to be from the land of the undead yourself to not be aware of this trend.
But just where did zombies come from, and when did they first become a part of our cultural awareness?
(thanks Kate)