Giving but not receiving oral sex

Suppose your boyfriend told you that giving oral sex is not something he is ever going to do, not because he doesn't like it, but because he believes it's not something a guy should do. Would you consider that patriarchal or even anti-feminist? What are your thoughts?