Hey, I just finished watching season 6 of The Walking Dead. I'm looking to see if there are any other TV shows out there worth watching. I'm open to anything, so fire away.
Breaking Bad is the most consistently good TV series I've seen. Game of Thrones, Dexter, Sherlock, Vikings, and Hannibal are all critically acclaimed shows. I've seen all of them and would definitely recommend watching them. EDIT: Also House of Cards.
Community, Brooklyn 99, Arrow, Flash, Agent Carter, Game of Thrones, Gossip Girl, New Girl, Impractical Jokers, White Collar
Agents of Shield, The Flash, Arrow, American Horror Story, White Collar, Daredevil, Bates Motel, Jessica Jones, House of Cards, The 100, Quantico, Fear the Walking Dead, The Walking Dead, Gotham, Homeland, True Blood, Dexter, Orphan Black, Supernatural, The Expanse, Game of Thrones, Orange is the New Black. I'll just edit more in because I'm probably forgetting some.
1) Game of Thrones 2) The 100 3) Gotham "Thank you guys so much for all the TV shows, I think I might check out Flash" Nooo... you're making a mistake.