jducoeur ([personal profile] jducoeur) wrote2008-07-02 12:56 pm

The many-core future

If you're not already following it, I commend today's post in Ars Technica about the upcoming changes to hardware. It's not precisely new, but it does underline what I've been saying for the past couple of years: the time of Massively Multicore is upon us.

Everybody's getting used to having two or maybe even eight cores in a computer, and you can more or less ignore that in most cases -- after all, you're just writing one process among several on the machine, so if you're limited to one core it's not a big deal. You might even tweak the program to use a few threads. But Intel is now talking seriously about architectures that range from dozens to *thousands* of little cores working together. You can't ignore that if you're going to be doing any kind of serious programming.

There's a message here, and it's an important one if you're in the field: if you're not already good at multi-threaded programming, you need to *get* good at it. There are probably deep changes to programming coming soon -- they're hard to predict at this point, but the root message is that you're going to need to understand threading pretty well. Or you're going to need to learn a language that is inherently threadsafe, like Erlang. Or both. If not, you risk confining yourself to the limited subset of the field where threading can be ignored. (E.g., the simpler forms of web-server programming, but not the really interesting bits.)
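[To illustrate the kind of discipline I mean: here's a minimal sketch, in Python, of the classic hazard that makes multi-threaded programming hard -- an unsynchronized read-modify-write on shared state -- and the lock that fixes it. The `Counter` class and the worker function are my own invented names, not anything from the post or the Ars Technica article.]

```python
import threading

class Counter:
    """A shared counter guarded by a lock.

    Without the lock, `value += 1` compiles to a read, an add, and a
    write; two threads can interleave those steps and lose updates.
    """
    def __init__(self):
        self.value = 0
        self._lock = threading.Lock()

    def increment(self):
        with self._lock:  # serialize the read-modify-write
            self.value += 1

def worker(counter, n):
    for _ in range(n):
        counter.increment()

counter = Counter()
threads = [threading.Thread(target=worker, args=(counter, 10_000))
           for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# With the lock, the result is deterministic: 8 threads x 10,000
# increments each.
print(counter.value)  # 80000
```

[Note that in CPython specifically, the interpreter lock keeps pure-Python threads from actually running on multiple cores at once -- which is exactly the sort of limitation a massively-multicore world puts pressure on; languages like Erlang sidestep the whole issue by sharing nothing between processes and communicating by messages instead.]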

It's exciting stuff -- we may be looking at the most dramatic changes to the *art* of programming (rather than just its uses) in decades. But if you are seriously in this field, you need to be paying attention to it, and making sure your skills are up to date, or you risk dead-ending your career...

[identity profile] dragonazure.livejournal.com 2008-07-02 05:27 pm (UTC)
I've moved out of the programming field, but I am curious as to what the ramifications of this are for application-level programming and design.

Edit: in the graphics, scientific, and engineering fields, I get it, but what about the business applications side?
Edited 2008-07-02 17:29 (UTC)

[identity profile] dragonazure.livejournal.com 2008-07-03 12:55 pm (UTC)
As I said, I've moved out of the s/w development area and away from the scientific computing area....8^( A long time ago, I did some work on parallel processing, but wasn't able to continue that line of research beyond graduate school--so the Ars Technica article was of interest. Today, my job entails the design and support of business systems.

The only reason office automation tools (such as word processors) run no faster than they did 20+ years ago is code bloat. 8^) It is like "stuff". It expands to fill all available space....

However, I was actually thinking more along the lines of decision support systems and other back office applications. I can see a lot of potential for supporting e-Commerce systems and for speeding up queries into data warehousing systems, but trying to understand how to apply this to specifying requirements for application design is escaping me at the moment. Of course, we'd have to make a good case for the business to upgrade to the new hardware in the first place. There are still plenty of linear programs running the world.