Getting a college education
I went to college for one year, then decided to drop out. It just wasn't for me. I really hated going to classes I had no interest in, all to earn a degree that may or may not help me accomplish what I want in life.
What frustrates me so much is that people think you HAVE to go to college in order to be successful in life. People think that if you don't go to college right after high school, you're doomed.
It's so interesting to me how incredibly pressured high schoolers are to go to college right away. Kids who have no idea what they want to do for the rest of their lives are going into insane amounts of debt to obtain a degree they settled for.
I guess my point is, it's okay not to go to college. And it's especially okay to take a few months, semesters, or years off after high school to figure out what you want to do. To travel, learn who you are, and discover what you desperately want to know more about.
Now, a few years later, I'm going to school to become a midwife. If I had stayed in college, I would never have known that helping women at the most vulnerable time in their lives was a huge passion of mine.
Take time off and learn who you are. Do what you were made to do, and settle for nothing less.
What do you think?