If you haven’t seen it yet, don’t worry — no, scratch that, there are no spoilers in this article. If you have seen The Walking Dead Season 7 premiere, then you already know what happened, so please keep your comments generic for the sake of those who haven’t watched it yet.
We all remember the cliffhanger from last season, and we all knew something big was coming in the Season 7 premiere. It was quite a bit to absorb, and I am sure everyone has an opinion on why and how the events unfolded, and what they will mean for the show moving forward.
The creators and executive producers have said they knew this was coming for almost two years. In retrospect, that must have been a great deal of pressure to carry as they built a story-line around all these characters, knowing how it would impact the show as it progresses.
While I believe we will never face a Zombie Apocalypse, I do believe there are plenty of natural disasters that could push society down the same road this show portrays. It could be a cataclysmic asteroid striking the earth, or a string of hurricanes one after another, coupled with earthquakes around the world. We can never truly know what might happen that could leave us permanently without power or resources.
The Walking Dead has portrayed the zombies as the evil of the world: soulless creatures driven to devour as many of the living as possible.
But it’s not the zombies who are the real evil in that world. It’s the few among the living who choose to be evil. There is always a choice between good and evil, and that choice has always defined who we are. The decisions we make, and will make, are what drive the world we live in.
Some people are saying this episode was too violent. Yes, it had its moments. But notice where viewers are drawing the line: they judge the show by the violence done to living people, not by the violence done to the walking dead. In the end, there is no difference between what Negan did on the show and a robber shooting a store clerk during a hold-up, or a drunk driver causing an accident that kills innocent people.
I think The Walking Dead woke us up. The producers and writers showed us that evil doesn’t go away, and that people will be who they are no matter what. I am devastated, like everyone else, by what occurred. Do I think it went too far? No. If anything, I hope it reminds us to cherish what we have and to enjoy life with others in the best way possible. I think it also reminds us to stay strong and to prepare ourselves for what society has already become.
The world is an evil place. The world is a scary place. We don’t need zombies to show us the worst of what it has become; generations of war, ignorance, and racism have already defined us. Keep becoming better every day. Help your fellow man. Stop hating and judging other people. If we do these things, I think we would all be ready to survive if there were ever a Zombie Apocalypse.
What are your thoughts? Don’t mention specifics, but what do you think it all represents? I would love to hear from you below. Comment and discuss. ~Tom
Disclosure of Material Connection: I have not received any compensation for writing this post. I have no material connection to the brands, products, or services that I have mentioned. I am disclosing this in accordance with the Federal Trade Commission's 16 CFR, Part 255: "Guides Concerning the Use of Endorsements and Testimonials in Advertising."