Did my doctor lie to me?
I went to the doctor to get some bumps near my vagina checked out, and he said it was a skin infection and nothing bad. But when I picked up the medication he prescribed, the instructions say it's for genital warts... Did the doctor lie to me? Did he sugarcoat it so he wouldn't have to give me the bad news that I have genital warts? I'm so confused. I left the appointment happy because he said it wasn't anything serious, but now that I've seen this I'm about to get depressed again.