The Business of Being Born

Why does everyone suddenly believe they're an expert who's above all medical advice because they've watched this documentary? You see women posting on here every day saying "oh, but you should watch The Business of Being Born, it'll change your opinion" and "I saw The Business of Being Born, I don't fully trust doctors or hospitals".

It's good to be informed and to try to see every side of the spectrum, but I've seen women saying they won't do certain tests (glucose, usually) or won't get antibiotics for GBS, etc., all because The Business of Being Born is all about doing it naturally with no interventions at all.

I've seen this documentary, and while it's insightful, you can't assume you know everything and can give out advice to other moms simply because you watched a film. That's like me watching Grey's Anatomy and thinking I can diagnose and treat myself for serious conditions.

Why are women so torn up about not trusting OBs and modern medicine? It drives me insane. By all means, always question your doctor when they say they may need to do something; it's good to be informed and understand things. But why are people so scared to trust someone who went to school for years and literally knows way more than they do about their pregnancy and health?