I’ll admit up-front that I have a PhD and a four-year engineering degree. I feel like I benefited from both, and would choose to follow the same path again if I had the opportunity. I had a great time at university, both times, and I learned a huge number of extremely valuable skills. Despite this, I think there are a few issues with the “everybody should get a four-year degree” approach.
The first issue has to do with teaching approaches. Many universities follow a “theory first” approach: the first few years focus on basic theory, mathematics, and various foundational skills. Practice comes later, and is often left to ill-defined and unmentored “project work”. The right students get a lot out of this structure, but many don’t. Theory first is not the only way to produce the same graduate in the same amount of time, but alternative structures in four-year programs are still very much the exception. “Experience first” and “practice first” approaches can be just as valuable.
The second issue has to do with the kinds of work done by software developers, software engineers, programmers, coders, and similar occupations. I’m a theory-heavy guy, and like my work to be both practically and theoretically challenging. I’ve been lucky enough to find that environment. On the other hand, much of the work in the industry isn’t theoretically challenging (though it frequently has challenges in other areas). For many positions in the industry, a trade school approach is likely to give graduates a better ability to do the job than a theory-heavy computer science degree. Computer science isn’t the right program for every person paid to do software.
The third issue has to do with cost and opportunity. University is expensive, both in time and money. It is a potentially very valuable investment, to be sure, but not everybody has time or money to invest. Giving people alternative paths into the industry is good for both them and the industry.
> As the demand for new software developers continues to grow, the growth of the ranks of beginners will outpace progression to proficiency, causing an industry-wide drop in productivity. We cannot tolerate such a step backward, as software continues to grow in demand and importance.
Experience is extremely valuable, and many experienced developers are very productive, but the claim that too many newbies will water down the pool and “can’t be tolerated” is absurd. For a start, the industry is not a homogeneous pool. There is a very wide diversity of work to do, companies to do it at, problems to solve, and reasons to solve those problems. Bringing new talent into the industry will quickly pay off by expanding the capabilities of the industry.
I’m strongly in favor of more people in the industry learning and applying more theory. Insisting that learning must be “theory first”, though, isn’t the right way to do that.
For me, the benefit of my undergraduate education was in the “practical” aspects. I took a lot of project-based classes like operating systems, mechatronics, embedded systems, and compilers, in which I got my hands dirty working on challenging projects. It is unlikely that a junior dev would be hired to work on such projects (an OS kernel, an electromechanical control system, embedded FPGA code and firmware, and a compiler, respectively) in industry, since doing those sorts of projects well requires a great deal of experience, and failure in production can be quite damaging. The main benefit of an undergraduate education, for me, was being allowed to take on ambitious projects in a setting where failure was not so expensive.
See also the response “The Solution of Hacker Schools”.
Some previous discussion on these bootcamps and the problems with them: