Here’s the thing. I’m a sucker for zombie/apocalypse/dystopian shows and movies. I started watching The Walking Dead right after my now-8-year-old son was born and I loved it.
The first seasons were amazing, and you actually felt like the zombies (walkers, as they call them) were a threat. Season by season we started to focus more on villains and less on the flesh-eating undead. Even throughout the lesser seasons, I stuck with it.
Almost 5 years after The Walking Dead premiered, AMC introduced Fear the Walking Dead. It was set in the same zombie-infested world but followed a different group of survivors.
I really enjoyed the cast, and it was nice to see the beginning of the apocalypse from the perspective of different people. The threat of the undead was real again. FTWD did end up going in a similar direction with the villains, but I suppose that's inevitable. You can't have season after season of JUST fighting zombies.
There have been ups and downs with Fear the Walking Dead, and the recent crossover of one of the characters from The Walking Dead has been sort of meh.
I'm in the middle of watching the first episode of season 5, and I gotta say, so far it's shaping up to be an interesting season. Looks like I'll be sticking with The Walking Dead and Fear the Walking Dead until they fizzle out disappointingly (like most shows do).