
Higher education costs in the United States have soared over the last two decades – yet some employers say universities are doing less than ever to prepare students for the global job market, as the traditional liberal arts education grows less and less relevant to a fast-paced world of technology.

Yet universities themselves are far from obsolete – they still serve a vital role in providing young adults with an intensive, well-rounded education and preparing them to enter the workforce, as well as guiding them toward developing greater independence, responsibility and interpersonal skills.

So how can institutions strike a balance between what the job market needs and what universities and colleges can realistically deliver?

One suggestion comes from Monica Herk, vice president of education research at the Committee for Economic Development, who recommends that institutions of higher education begin to shift their focus more toward providing training and certification in certain “competencies” that are in especially high demand among employers: critical thinking, for instance, or communication abilities, or more specialized skills like knowledge of a particular programming language.

While current degrees serve mostly as “signals” to employers that graduates likely possess these skills, a competency-based education would do better on both counts – equipping students with the abilities employers need and demonstrating to employers that graduates actually have them, argues Herk.

Under the current structure, she says, “much of what students spend their four or two years studying may be largely unrelated to the skills and knowledge that employers are seeking.”

While this doesn’t mean that the liberal arts education itself is flawed or inadequate, Herk continues, it does suggest that “it may be unfair and inefficient to require middle-income and poorer families to go into debt to purchase a 2- or 4-year degree consisting of many classes that are largely irrelevant to their goal of obtaining a job, especially when such degrees cost upwards of $16,000 per year in tuition, not to mention forgone earnings.”

Instead, she suggests, higher education should consider an “unbundling” strategy, similar in some ways to how cable companies have begun to detach individual options from their traditional channel packages in order to give customers what they want. Just as someone can now get ESPN, for example, without paying for two dozen other sports channels, universities and colleges could make some courses and specializations available independent of the traditional major track.

“Individuals will no longer be required to ‘purchase’ a 2- or 4-year degree from a single vendor,” explains Herk. “They will be able to piece together education in a variety of skills from a variety of sources – including their own labor force experience – as long as they can demonstrate their mastery of those competencies through valid assessments that employers trust.”

Of course, such a change wouldn’t happen overnight, nor would it fully replace traditional four-year university tracks. Rather, it could complement existing structures and better serve career-minded students and employers.

This new structure would also require a new system of assessment beyond current standardized tests, points out Robert Litan for the Wall Street Journal’s Washington Wire. The development and adoption of such tools, he writes, should be a major research priority in academic and business circles alike.

“Once some are validated by employers as reliable indicators of future work performance – and, with hope, better than the SAT is of college performance — the education market finally should be disrupted,” he writes. “A highly competitive market among educational institutions of all types will lead to programs that help students of all ages gain those certifications more cost-effectively than is the case now.”
