Sex Education

Recently, I've seen so many posts from young women asking questions about how to do this and that and/or how to give oral, etc. I'm just curious: have you learned anything about safe sex, or do you get most of your information from porn? (Which is nothing like actual sex!)
I'm a junior social work major who plans to teach more young people about sex and how to BE SAFE!
Do parents teach their children about sex anymore? Or did that stop in the 90s?