1/23/2020
The software industry is synonymous with rapid innovation. New discoveries push us to better ways of doing things. To stay relevant, programmers need to constantly keep up with the latest technologies, or their skills will go out of date. At least, that’s what I’ve been told, but the more I study computer science from 50 years ago, the more I find that “new” ideas are actually old. Perhaps the changes we see are only permutations of underlying ideas.
In many obvious ways software does change. Languages, libraries, and tools get replaced over time. Nobody is writing desktop applications in assembly anymore. Many developers who wrote Windows applications moved on to mobile or web. But rarely are these changes due to a technological advancement or a dramatic change in how things are done; they are mostly new forms of popular products, or improved hardware that gives us some wiggle room.
A good engineer can adapt to these changes, in the same way they can switch to a company that uses different tools and processes. Learning Python after C# should be easy, because you understand programs, not just syntax. Moving to a language with significant design differences like Haskell requires a broader understanding of functions and computation. Now imagine there were similar knowledge that helped you approach every computer problem, and perhaps all of nature. The knowledge I am describing is math and science! For software it consists of computer science subjects like OS theory, algorithms, software design, logic, and calculus. Because many programmers lack this, they struggle with change and feel pressure to keep up with trends. (1)
Don’t believe me? Take a look at the hottest “new” areas of tech: AI/machine learning, blockchain, and big data/data science. What barriers make it difficult for programmers to get into them and be successful? For machine learning the answer is a bit of linear algebra and multivariable calculus, evidenced by the many blog posts promising to get readers up to speed. This math is (or should be) covered by every computer science degree and has remained the same for at least 50 years, down to the presentations and illustrations used to teach it. Neural networks themselves have been around for a long time, although less mainstream.
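To make that concrete, here is a minimal sketch of my own (using NumPy, with made-up sizes and numbers) of the linear algebra at the heart of a neural network layer: a matrix multiply, a bias, and a nonlinearity. The multivariable calculus shows up when you train it, taking gradients of a loss with respect to the weights via the chain rule.

    import numpy as np

    # A single dense layer is just linear algebra: y = activation(W @ x + b).
    # The sizes here are arbitrary, chosen only for illustration.
    rng = np.random.default_rng(0)
    W = rng.normal(size=(4, 3))      # weight matrix: 4 outputs, 3 inputs
    b = np.zeros(4)                  # bias vector
    x = np.array([0.5, -1.0, 2.0])   # one input vector

    y = np.maximum(0, W @ x + b)     # ReLU(W x + b)
    print(y)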
For blockchain the essentials are an understanding of cryptography and peer-to-peer networking. That doesn’t even require a deep understanding of elliptic curves or number theory, just a solid grasp of hashes, signatures, and asymmetric key encryption. Distributed systems is also a mature field of computer science.
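To illustrate just how ordinary the pieces are, here is a toy hash chain of my own devising (not any real protocol), built with nothing but the standard library’s SHA-256:

    import hashlib
    import json

    def block_hash(block):
        # Hash the block's contents; because each block includes the previous
        # block's hash, tampering with any block invalidates every later one.
        encoded = json.dumps(block, sort_keys=True).encode()
        return hashlib.sha256(encoded).hexdigest()

    genesis = {"prev": None, "data": "genesis"}
    block1 = {"prev": block_hash(genesis), "data": "alice pays bob 5"}
    block2 = {"prev": block_hash(block1), "data": "bob pays carol 2"}

    print(block2["prev"])   # changes if anything in block1 or genesis changes

A real blockchain adds signatures, consensus, and networking on top, but the core data structure is exactly this kind of hash linking.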
Data science might be the most accessible. A solid understanding of introductory probability and statistics, databases, and a few tools such as linear regression and polynomial interpolation might be enough for 90% of applications. But it’s going to be really hard if you can’t understand a Wikipedia page about polynomials.
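Those two tools are a few lines each with standard libraries. This sketch, with made-up data points purely for illustration, fits a least-squares line and an interpolating polynomial using NumPy:

    import numpy as np

    # Made-up data points, purely for illustration.
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

    # Linear regression: least-squares fit of y = m*x + c.
    m, c = np.polyfit(x, y, deg=1)

    # Polynomial interpolation: a degree-4 polynomial passes exactly
    # through all five points.
    coeffs = np.polyfit(x, y, deg=4)

    print(m, c)                       # fitted slope and intercept
    print(np.polyval(coeffs, 2.5))    # evaluate the interpolant at a new point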
I don’t mean to suggest that these skills make an expert. General theory is a long shot from research or novel contributions to the field. Nor is it sufficient to be a good programmer; learning linear algebra does not immediately make you good at writing machine learning programs (and scientists write some terrible code!). Rather, this is the “hard stuff” that prevents programmers from getting into these fields. Once you know it, the other details are approachable. (2)
If you analyze other software advances from the past, from database theory to graphics, you will find similar applications of rather unextraordinary math and science. Many programmers ask: how can I predict what skills will be important in the future? What do I need to learn to have a successful career? Few can predict which specific trends will take off, but I bet whatever is important in the future is going to require understanding those broad areas of computer science. Keep practicing and specializing in the area you work in, but if you regularly refresh and broaden your base of fundamentals you will be prepared to learn anything new that starts to look interesting. (3)
Understanding fundamentals gives programmers another significant advantage: they know what problems have been solved before. It’s unlikely you will remember every detail, but you can say “I’ve heard of this before” and know where to learn more. Consider how many new tools are bad solutions to problems already solved by simple bash scripts the author was ignorant of. This is just a small sample of how much duplication and complexity programmers are adding because they don’t know what’s already there.
It may sound as though I am advocating a kind of tech hipsterism: everything interesting has already been done, so we might as well stop looking for new things. Rather, I am arguing that we will be able to make more advances if we better understand the big ideas behind what has come before.
Properly understood, this flips the progress narrative of technology on its head. We don’t have to chase headlines and blog posts about the latest frameworks and build tools. Nor do we have to guess which technologies might suddenly become useful, like day trading stocks. That’s for evangelists and IT consultants. The foundations for our next ideas have already been built by generations of smart people. It’s all written down, waiting to be dusted off and rediscovered. Rather than building an encyclopedic knowledge of novelties and press releases, we continuously revisit the same core subjects in pursuit of mastery.
This knowledge doesn’t have to come from a university, and many of those who study STEM seem to lose it after going through the motions.
A great example of this is the popular book SICP. In just a few hundred pages it covers several major areas of CS, more than most programmers come to understand in their whole careers. It’s able to do that because it’s not written for a general audience. It relies on the reader having a solid undergraduate understanding of another area of math or science, as an MIT student would have.
Experts in a very particular system (for example OpenCL or V8 internals) can be extremely valuable. But most of them only reach that level of depth with a solid understanding of fundamental computer science. Expertise is also a subject for another day.