What do you guys think about TV shows like these?
So I was binge-watching the TV show “Degrassi,” and if you’ve watched it, you know it shows what real life is like: it revolves around teenagers and their school and personal lives. It really talks about real topics that happen every day, like abuse, drugs and alcohol, LGBTQ+ issues, eating disorders, mental health, real teenage relationships (friendships and love interests), school shootings, sex, STDs and STIs, teen pregnancy, adoption, abortion, rape, death, etc. I feel like this is one of the top TV shows that actually talked about real life and the issues we faced as teenagers; the whole franchise has aired for decades and new seasons air on Netflix now.

Personally, I haven’t seen another show like it. Most of the ones that air are about teen pregnancy and revolve around that and really nothing else, and if they do mention something like mental health, rape, or abuse, it’s only one scene and that’s it, while Degrassi showed all types of issues throughout the whole series that we may have experienced or known to happen to others. I really wish more shows about these kinds of things were made, because I think they have helped people in their own situations.

So my questions are: do you think TV shows like Degrassi should be made and aired more on TV? Do you think it’s an appropriate show for a pre-teen/teenager (ages 12 and up) to watch, knowing the kinds of issues that come up and are talked about? Do you think shows like these help, or do you think they expose topics you’d rather your child not see or hear about until they’re an adult? Overall, if you have watched Degrassi or shows like it, how do you feel about it? Good, bad, or something you just don’t care about?