The computing-for-all movement has been getting some great press, and rightfully so: it’s full of great videos, great stats, and great resources. I also think it has a great mission: there are hundreds of thousands of businesses that need talented software developers in order to grow and provide value, but these businesses can’t find the engineers they need. Moreover, people need jobs, and software development jobs are abundant and high quality. Hence the need for more students, more teachers, and more classes in computing. Win, win, right?

I don’t think so. I do believe in this mission. I do research on this mission. I feel strongly that if we don’t massively increase the number of teachers in computing, we’ll get nowhere. But I don’t think that simply increasing the number of people who can code will address this gap. The problem, as it is typically framed, is one of quantity, whereas the problem is actually one of quality.

To put it simply, companies don’t need more developers, they need better developers. The Googles, Facebooks, Apples, and Microsofts of the world get plenty of applicants for jobs; they just don’t get applicants who are good enough. And the rest of the companies in the world, while they can hire, are forced to hire developers who often lack the talent to create great software, leading to a world of poor-quality, broken software. Sure, just training more developers might increase the tiny fraction who are great, but that seems like a terribly inefficient way of producing more great developers.

This brings us back to teaching. We absolutely need more teachers, but more importantly we need more excellent teachers and excellent learning opportunities. We need the kind of learning infrastructure that empowers every 15-year-old who’s never seen a line of code to become as good as your typical CMU, Stanford, Berkeley, or UW CS grad, without necessarily having to go to those specific schools. (They don’t have the capacity for that kind of growth, nor should they.) We need to understand what excellent software development is, so we can discover ways to help developers achieve it.

This infrastructure is going to be difficult to create. For one, only a tiny fraction of excellent developers will choose to take a 50% pay cut to teach in a high school or university, and yet we need those engineers to impart their expertise somehow. We need to understand how to create excellent computing teachers and how to empower them to create excellent developers. We need to learn how to make computing education efficient, so that graduates in computing and information sciences have 4 years of actual practice, rather than 4 years of ineffective lectures. We need an academic climate that recognizes current modes of computing education as largely broken and ineffective for all but the best and brightest of the self-taught.

Unfortunately, all of this is going to take significant investment. The public and the most profitable of our technology companies must reach deep into their pockets to fund this research, this training, and this growth that they and our world so desperately need. And so kudos to this movement and every other bottom-up effort to democratize computing, but it’s not enough: we need real resources from the top to create real change.

2 thoughts on “The economics of computing for all”

  1. I agree with the content of this post, but I think increasing the pool of excellent developers will take more than improving the university system of education. As a fan of the Dreyfus model of skill acquisition, I think that the education required to take someone from “novice” through “advanced beginner” to “competent” is very different from the education required to get someone from “competent” to “proficient” or “expert”.

    In particular, I think getting someone above competent requires apprenticeship: working closely with other excellent developers. And I don’t think this type of apprenticeship model really works within a traditional undergraduate education. This type of education requires having junior developers learn from practitioners in an industry setting. I therefore have little sympathy for companies who bemoan the lack of talent and aren’t willing to hire someone junior and invest in the mentoring process.

    So, yes, by all means, let’s improve undergraduate education, and maybe even develop software engineering(!) curricula. But I don’t think putting the existing excellent developers in the classroom is the right way to have them impart their expertise.

    • Yes, spot on. Four years of college, no matter how good, will not produce experts, but it will produce some level of competency. Expertise takes more extensive practice and apprenticeship over many years after school, just as it happens now on software teams around the world with new hires fresh out of school.

      But all of this practice and apprenticeship, no matter how good, needs a solid foundation, which most students in computing and information sciences do not get. When I talked about bringing excellent developers into the classroom, I meant as teachers. Whether those are teachers who become excellent developers, or developers who become excellent teachers, the people guiding instruction need to know their stuff.

      I don’t see any reason to stick with traditional undergraduate education. And as a professor at UW who teaches a lot of undergraduates about design and software engineering, I try hard at both the class and curriculum level to define experiences that give students real practice.
