The original post: /r/television by /u/Educational_Word6447 on 2026-01-18 22:03:47+00:00.
Can anyone explain to me what happened to good television? I mean, look, I am not the guy who just sits around and watches television incessantly. But I am old enough to remember when there were good shows on television with story lines and lessons to be learned. Families could actually see themselves in the characters and in those characters' tribulations.
The past 20 years have been filled with mind-numbing shows that push sex, blood, guts and the like on our children. That kind of content was always reserved for the movies. Why did we allow this to happen? Did we really decide we didn't want to watch shows we could see ourselves in? Did we decide that the lessons learned from shows like Home Improvement, Fresh Prince, Family Matters, etc. were no longer worth learning? I get that many may no longer hold those same values, but good lord, is television even worth it anymore?