I have been out of the US for over 6 consecutive years and have never heard of any of those shows you guys mentioned except The Walking Dead, and that's only because it airs on Dubai One.
What happened to US television?
Seems like it's just more garbage reality shows to distract people from what is going on around them.