Ok so I hate this job, and honestly I've hated most of my jobs. A couple I've liked, but mostly hated. I went to college, but my ****in dad picked my major and I ****in hate that too. It's a specialized business major, is all I'll say.
I am wondering, could opening a business be the answer?
Most businesses fail, but you never hear those stories because people don't want to admit it failed, even when it was the economy or the market or something else beyond their control. To pick one example, the gym market seems saturated to the max. Is that even a feasible market to enter anymore? I know people always say you have to bust your ass if you own a business. That doesn't scare me. What I want is to get away from the office politics and the inferiority-complex bosses, the people with no sex lives and no lives period who come to work miserable and try to bring you down because their own life sucks. I hate all that shit. Just hoping to get some honest opinions from you guys.