The many-core future
Jul. 2nd, 2008 12:56 pm
If you're not already following it, I commend today's post at Ars Technica about the upcoming changes to hardware. It's not precisely new, but it does underline what I've been saying for the past couple of years: the time of Massively Multicore is upon us.
Everybody's getting used to having two or maybe even eight cores in a computer, and you can almost kind of ignore that in most cases -- after all, you're just writing one process among several on the machine, so if you're limited to one core it's not a big deal. You might even tweak the program to use a few threads. But Intel is now talking seriously about architectures that range from dozens to *thousands* of little cores working together. You can't ignore that if you're going to be doing any kind of serious programming.
There's a message here, and it's an important one if you're in the field: if you're not already good at multi-threaded programming, you need to *get* good at it. There are probably deep changes to programming coming soon -- they're hard to predict at this point, but the root message is that you're going to need to understand threading pretty well. Or you're going to need to learn the languages that are inherently threadsafe, like Erlang. Or both. If not, you risk confining yourself to the limited subset of the field where threading can be ignored. (E.g., the simpler forms of web-server programming, but not the really interesting bits.)
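To make that concrete, here's a minimal sketch of the kind of decomposition I mean, in plain Java with the standard java.util.concurrent thread pool. (The ParallelSum class and the sum-the-integers workload are just my illustration, not anything from the Ars article.) It carves one big loop into per-core chunks and combines the partial results:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelSum {
    public static void main(String[] args) throws Exception {
        // One worker thread per core the JVM can see.
        final int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        final long n = 100000000L;    // sum 1..n as a stand-in workload
        final long chunk = n / cores;
        List<Future<Long>> partials = new ArrayList<Future<Long>>();

        // Carve the range into one independent chunk per core.
        for (int i = 0; i < cores; i++) {
            final long lo = i * chunk + 1;
            final long hi = (i == cores - 1) ? n : (i + 1) * chunk;
            partials.add(pool.submit(new Callable<Long>() {
                public Long call() {
                    long sum = 0;
                    for (long k = lo; k <= hi; k++) sum += k;
                    return sum;
                }
            }));
        }

        // Future.get() blocks until each chunk finishes; combine results.
        long total = 0;
        for (Future<Long> f : partials) total += f.get();
        pool.shutdown();

        System.out.println("sum = " + total); // should equal n*(n+1)/2
    }
}
```

The arithmetic itself is beside the point; what matters is that the decomposition, the synchronization, and the combining step are all part of the program's design now, and they only get more central as the core counts climb from eight toward eight hundred.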
It's exciting stuff -- we may be looking at the most dramatic changes to the *art* of programming (rather than just its uses) in decades. But if you're serious about this field, you need to be paying attention to it, and making sure your skills are up-to-date, or you risk dead-ending your career...
(no subject)
Date: 2008-07-03 02:58 pm (UTC)
So on the one hand, I do think that linear (that is, single-threaded) programming is going to become ever more problematic even in that space: people will keep demanding more of those systems, and current indications are that linear programs are never likely to run much faster than they do now. (Indeed, on the coming chips they may well run slower.) But linear programming may not be the best way to tackle decision-support problems *anyway*, and newer rule-based languages, which *will* scale well, are likely to become a more natural fit to the problem space as they mature.
(And yes, it'll probably take many years for the transition to happen. But we shouldn't forget the lesson of Y2K: when the changes come, they sometimes come with overwhelming speed, and sweep a lot of old code away rather suddenly. So rather than the old COBOL programmers losing their jobs gradually over the course of decades, many of them were put out of work almost overnight as their codebases went away...)