The many-core future
If you're not already following this topic, I recommend today's post in Ars Technica about the upcoming changes to hardware. It's not precisely new, but it does underline what I've been saying for the past couple of years: the time of Massively Multicore is upon us.
Everybody's getting used to having two or maybe even eight cores in a computer, and in most cases you can more or less ignore that -- after all, yours is just one process among several on the machine, so being limited to one core is no big deal. You might even tweak the program to use a few threads. But Intel is now talking seriously about architectures that range from dozens to *thousands* of little cores working together. You can't ignore that if you're going to be doing any kind of serious programming.
There's a message here, and it's an important one if you're in the field: if you're not already good at multi-threaded programming, you need to *get* good at it. There are probably deep changes to programming coming soon -- they're hard to predict at this point, but the root message is that you're going to need to understand threading pretty well. Or you're going to need to learn the languages that are inherently threadsafe, like Erlang. Or both. If not, you risk confining yourself to the limited subset of the field where threading can be ignored. (E.g., the simpler forms of web-server programming, but not the really interesting bits.)
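To give a flavor of what that shift looks like in code, here's a minimal sketch in Go (chosen purely as an illustration -- the names are mine, not any vendor's many-core API): a computation written to fan out across however many cores the machine happens to have, merging partial results over a channel in the message-passing style that Erlang popularized.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

// sumSquares fans the work out across however many cores the runtime
// reports, one goroutine per chunk, and merges the partial sums over
// a channel -- message passing rather than shared mutable state.
func sumSquares(nums []int) int {
	workers := runtime.NumCPU() // 2 or 8 today; maybe thousands tomorrow
	results := make(chan int, workers)
	var wg sync.WaitGroup

	chunk := (len(nums) + workers - 1) / workers // ceiling division
	for start := 0; start < len(nums); start += chunk {
		end := start + chunk
		if end > len(nums) {
			end = len(nums)
		}
		wg.Add(1)
		go func(part []int) {
			defer wg.Done()
			sum := 0
			for _, n := range part {
				sum += n * n
			}
			results <- sum
		}(nums[start:end])
	}

	wg.Wait()
	close(results)

	total := 0
	for partial := range results {
		total += partial
	}
	return total
}

func main() {
	nums := make([]int, 1000)
	for i := range nums {
		nums[i] = i
	}
	fmt.Println(sumSquares(nums)) // same answer on 1 core or 1000
}
```

The point isn't this particular function; it's that the program never says how many cores there are. The decomposition is the programmer's job, the scheduling is the runtime's, and that division of labor is what the coming hardware is going to demand.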
It's exciting stuff -- we may be looking at the most dramatic changes to the *art* of programming (rather than just its uses) in decades. But if you're serious about this field, you need to be paying attention, and making sure your skills are up-to-date, or you risk dead-ending your career...
no subject
Edit: in the graphics, scientific, and engineering fields, I get it, but what about the business applications side?
no subject
Consider: most business apps nowadays are *still* pushing pretty hard at the CPU. (As is often pointed out, Word runs no faster on today's hardware than WordStar did on my Z80 25 years ago.) Not every moment, and not for every function, but all sorts of functionality is winding up needing heavy resources.
Will there be *aspects* of business apps that can be done single-threaded? Probably. But frankly, I think even that will be going away -- most of those applications will be recast in terms of higher-level views of the problem, rather than written in today's simple sequential languages. For instance, while I judge Windows Workflow Foundation a fairly mediocre first pass, I suspect it is a sign of where business programming is going in the medium term: to high-level event-driven systems that don't quite look like modern programs, so that they can scale (a rough sketch below)...
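To pin down what "event-driven" means here, a hypothetical sketch in Go -- this is not Windows Workflow Foundation's actual API, and every name below is invented: the programmer declares handlers for business events, and the dispatcher is free to run them on as many cores as it has, because no handler ever says how or where it executes.

```go
package main

import (
	"fmt"
	"sync"
)

// Event is a hypothetical business event -- OrderPlaced, InvoicePaid,
// and so on would each be a Kind with some attached payload.
type Event struct {
	Kind    string
	Payload string
}

// Dispatcher routes events to declared handlers. Handlers never say
// how or where they run; that is the dispatcher's business, which is
// what leaves it free to use every core it has.
type Dispatcher struct {
	handlers map[string][]func(Event)
}

func (d *Dispatcher) On(kind string, h func(Event)) {
	if d.handlers == nil {
		d.handlers = make(map[string][]func(Event))
	}
	d.handlers[kind] = append(d.handlers[kind], h)
}

func (d *Dispatcher) Emit(events []Event) {
	var wg sync.WaitGroup
	for _, ev := range events {
		for _, h := range d.handlers[ev.Kind] {
			wg.Add(1)
			go func(h func(Event), ev Event) { // one goroutine per handler
				defer wg.Done()
				h(ev)
			}(h, ev)
		}
	}
	wg.Wait()
}

func main() {
	var d Dispatcher
	d.On("OrderPlaced", func(e Event) { fmt.Println("billing:", e.Payload) })
	d.On("OrderPlaced", func(e Event) { fmt.Println("shipping:", e.Payload) })
	d.Emit([]Event{{Kind: "OrderPlaced", Payload: "order #42"}})
}
```

The business logic lives in the handlers; the parallelism lives entirely in `Emit`. That separation -- logic declared, scheduling delegated -- is exactly what lets such systems scale without the application programmer ever thinking about threads.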
no subject
The only reason office automation tools (such as word processors) run no faster than they did 20+ years ago is code bloat. 8^) It's like "stuff": it expands to fill all available space....
However, I was actually thinking more along the lines of decision support systems and other back-office applications. I can see a lot of potential for supporting e-Commerce systems and for speeding up queries into data warehousing systems, but how to apply this when specifying requirements for application design is escaping me at the moment. Of course, we'd have to make a good case for the business to upgrade to the new hardware in the first place. There are still plenty of linear programs running the world.
no subject
So on the one hand, I do think that linear (that is, single-threaded) programming is going to become ever more problematic even in that space: people will continue to be more demanding in what those systems do, and current indications are that linear programs are never likely to run much faster than they do now. (Indeed, on the coming chips they may well run slower.) But linear programming may not be the best way to tackle decision-support problems *anyway*, and newer rule-based languages, which *will* scale well, are likely to become a more natural fit to the problem space as they mature. (There's a toy sketch of why at the end of this comment.)
(And yes, it'll probably take many years for the transition to happen. But we shouldn't forget the lesson of Y2K: when the changes come, they sometimes come with overwhelming speed, and sweep a lot of old code away rather suddenly. So rather than the old COBOL programmers losing their jobs gradually over the course of decades, many of them were put out of work almost overnight as their codebases went away...)
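As for why the rule-based style parallelizes so naturally, here's a toy sketch in Go, with invented names throughout -- real rule engines are far more sophisticated. The key property: each rule examines the facts independently and doesn't depend on any other rule, so nothing stops the runtime from evaluating all of them at once.

```go
package main

import (
	"fmt"
	"sync"
)

// Order is the read-only "fact" the rules examine. All names here are
// invented for illustration.
type Order struct {
	Total   float64
	Country string
}

// A Rule inspects the fact and reports whether it applies. Rules don't
// depend on one another, so they can be checked on separate cores.
type Rule struct {
	Name string
	When func(Order) bool
}

// fired evaluates every rule concurrently and collects the names of
// the ones that matched.
func fired(rules []Rule, o Order) []string {
	var (
		mu  sync.Mutex
		wg  sync.WaitGroup
		out []string
	)
	for _, r := range rules {
		wg.Add(1)
		go func(r Rule) { // each rule on its own goroutine
			defer wg.Done()
			if r.When(o) {
				mu.Lock()
				out = append(out, r.Name)
				mu.Unlock()
			}
		}(r)
	}
	wg.Wait()
	return out
}

func main() {
	rules := []Rule{
		{"big-order-discount", func(o Order) bool { return o.Total > 1000 }},
		{"export-paperwork", func(o Order) bool { return o.Country != "US" }},
	}
	fmt.Println(fired(rules, Order{Total: 1500, Country: "CA"}))
}
```

The same program works unchanged whether there are two rules or two thousand, and whether the machine has two cores or two thousand -- which is precisely the property the coming transition will reward.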