In college, should a person major in something they love or something they know will make money?
...or, in other words, something that is more likely to lead to a high-paying career.
I ask this because, as of late, I have been feeling a lot of regret about what I majored in. I majored in professional writing (with a minor in creative writing), and don't get me wrong, I love writing and am happy that I have a degree in it, but for the past few years (I graduated in 2016) I haven't found much steady work related to my major. It gets very discouraging and frustrating at times and makes me feel like my degree is worthless, even though I worked hard for it, I love it, and I know it's not...
I find myself thinking that I should have made myself major in law, business, a science, or something in the medical field, even though I have ZERO interest in anything related to those fields, just because I keep hearing that they can lead to some pretty good careers. Working as a doctor, nurse, or lawyer was something I hated the idea of, even as a child. I also feel that if I did get a career in one of those fields, I would probably never truly love it, find myself doing the work with no real care or passion, and STILL regret my major. I know I'm no doctor, lawyer, or anything of the sort, and working in one of those fields would just be a disaster for everyone involved...
So, what do you think? Should a person's major be based on passion or on money?
Vote below to see results!