There are certainly professions where a college degree is necessary. But is that the case for the majority of Americans? We can't all be brain surgeons or architects. So is a pricey degree that buries you in debt, and forces you to navigate constant political indoctrination along the way, really worth it?
A college degree used to be the ticket to a better life. Once you put in the work and paid your dues, you were all but guaranteed a comfortable future. Those days are seemingly gone. College campuses have become camps for ideological indoctrination, places where agendas are pushed and critical thinking skills are destroyed.
All of this raises the question: is college really necessary? You can go to a trade school, where job demand is incredibly high. You won't be swallowed in debt over a useless degree in underwater basket weaving. You can learn a functional skill that starts making you money right away. That is what a lot of people are beginning to realize.
The college scam is being exposed. You are forced to pay a premium for classes you have no interest in, all to fulfill requirements set by people who have no interest in your future beyond your tuition being paid on time. Many companies now openly state that they no longer require a college degree. That speaks volumes about how little value these high-priced degrees actually offer.