Do women have a “ROLE”?

I was talking to my boyfriend and we got into a heated argument. He seems to think that women have a role and men have a role in relationships. And by role he means women doing all the cooking, cleaning, etc.

I don’t like that 1950s style of dating where men think all we’re supposed to be is pregnant and in the kitchen 🙄. Don’t get me wrong, I don’t mind cooking and cleaning, but not because my man expects me to.

What are your thoughts?

Vote below to see results!