I was never a school person. Tests and papers gave me anxiety. I hated sitting in class all day learning about things I would never need in real life.
I was programmed growing up to believe that in order to get a good job and be successful, you needed a college degree. I went to college, stuck it out for four years, and got my degree. I didn’t do it for myself; I did it for my parents, who were so proud to see me walk up to the podium and receive a piece of paper they could frame and hang in the house.
After I graduated, I applied for jobs and went on a lot of interviews. Out of all those interviews, do you know how many of those companies asked to see my degree? None of them cared that I had a college degree, let alone wanted to see it.
I was only qualified for entry-level jobs because I had no experience, and those jobs paid minimum wage whether you had a degree or not. That’s when I realized that degree or no degree, if you’re going to work in Corporate America and climb the ladder, you have to start at the bottom.
I struggled for a few years out of school and was forced to live the cubicle life to have some kind of income to pay off my student loans. I could have easily gotten the same jobs right out of high school, without the student loan debt.
Even though I sound anti-college, I’m not. I made a lot of friends in college and learned how to be an adult: doing laundry, cooking for myself, cleaning my dorm room, and forcing myself to go to class even when I was hungover from the night before. I think college is a good way to transition from living with your parents to the real world. I just don’t think students should put themselves in debt for a piece of paper that means nothing.
“I believe experience is the best education, and if you want to succeed in your industry you need to get your foot in the door and start working.”
College is important for people who want to work in certain industries. If you want to be a lawyer, you need to go to law school and learn the law. If you want to be a doctor, you need to go to med school and study anatomy and medicine.
For people like me, who had no idea what they wanted to do for the rest of their lives, college isn’t necessary.
Just because you have a college degree does not mean you are going to graduate and land a job making $100,000 a year.
Do you think having a college degree guarantees success? Let me know in the comments.