Tuesday, September 29, 2009

Hardware Nerds Are Hot by Sam Blackman

Whether we knew it or not, we’ve all been relying on something called “Moore’s Law.” Back in the 1960s, Intel (INTC) co-founder Gordon Moore noticed that the number of transistors that could cheaply be placed on an integrated circuit had been doubling every two years.

That meant that central processing units, or CPUs — the chips that drive computer performance — were getting twice as fast in that same time period. That amazing rate of technological change has held up for more than 40 years.

Moore’s Law is why we take it for granted that the cell phone we carry around today is more powerful (and cost us less) than the top-of-the-line desktop computer we bought ten years ago. It is also why we’re not surprised that in less than a decade the Web has changed from a place to look at ugly text pages to a place to watch high-definition TV shows.

But after 40 years, Moore’s Law is slowing down. We’ve finally reached the point where faster processors consume too much power, and manufacturing them to reach ever-higher clock frequencies has become too expensive. This technological pressure will radically reshape the way we build computers and write software in the years to come.

Going forward, computers will get faster by adding more processors that work together to solve problems. That’s why we now hear more about how many cores a CPU has than about how fast any single processor runs. Giants like Intel and Nvidia (NVDA) are racing to create new “massively parallel solutions,” composed of as many as 240 individual processors designed to work in concert.

Unfortunately, writing software that runs well on massively parallel systems is incredibly difficult. Engineers need to figure out how to break big problems down into smaller pieces that individual processors can work on at the same time, how to keep all of the individual processors coordinated with each other, and how to assemble all of the work into a useful output.
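To make those three steps concrete, here is a minimal, hypothetical sketch written in CUDA, the programming environment Nvidia provides for its massively parallel chips. It brightens every pixel in a single video frame: the frame is broken into pieces so that each of thousands of GPU threads handles one pixel, the threads are kept coordinated with a synchronization call, and the finished frame is copied back as the useful output. The frame size, the kernel, and the stand-in data are illustrative assumptions for this column, not code from Elemental or any other company.

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each thread brightens exactly one pixel; the GPU runs thousands of
// these threads at once across its many processors.
__global__ void brighten(unsigned char *pixels, int n, int amount)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's piece of the problem
    if (i < n) {
        int v = pixels[i] + amount;
        pixels[i] = v > 255 ? 255 : v;              // clamp to the valid pixel range
    }
}

int main()
{
    const int n = 1920 * 1080;                      // pixels in one HD frame (illustrative)
    unsigned char *host = (unsigned char *)malloc(n);
    for (int i = 0; i < n; ++i) host[i] = i % 200;  // stand-in frame data

    // Step 1: break the big problem into small pieces. Copy the frame to the
    // GPU and give each of thousands of lightweight threads one pixel to handle.
    unsigned char *dev = NULL;
    cudaMalloc((void **)&dev, n);
    cudaMemcpy(dev, host, n, cudaMemcpyHostToDevice);

    int threadsPerBlock = 256;
    int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    brighten<<<blocks, threadsPerBlock>>>(dev, n, 40);

    // Step 2: keep the processors coordinated. Wait until every thread has finished.
    cudaDeviceSynchronize();

    // Step 3: assemble the work into useful output. Copy the brightened frame back.
    cudaMemcpy(host, dev, n, cudaMemcpyDeviceToHost);
    printf("first brightened pixel value: %d\n", host[0]);

    cudaFree(dev);
    free(host);
    return 0;
}

Even in this toy example, the hard part is not the arithmetic. It is deciding how to carve the work into independent pieces and when to make them wait for each other, which is exactly the skill set in short supply.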

At the recent Hot Chips microprocessor design conference in Palo Alto, Calif., John Hennessy, the president of Stanford University, called parallel computing “the hardest problem in computer science.”

To date, engineers have solved only a small set of problems using parallel systems, and it’s not for lack of trying. Microsoft (MSFT) and Intel think that figuring out parallel computing is so important that they’ve invested $20 million to fund parallel computing research centers at the University of California, Berkeley and the University of Illinois at Urbana-Champaign.

Difficult or not, the future of computing is going to be on massively parallel systems. Some savvy companies are already taking advantage of massively parallel systems to trade stocks, search for oil, and offer online video games. At Elemental Technologies, we’re building software to help professionals process video files faster and more economically than ever before.

To build the kind of team that can take advantage of these massively parallel systems, software companies are going to have to rethink the mix of engineers they hire. They will need people with experience in hardware design and low-level, “close to the metal” programming: engineers who understand how these new massively parallel architectures work and know how to parallelize problems. Today, programmers with these skills are in seriously short supply.

There’s a pool of great engineers who don’t yet realize that their future is working for software companies, though. They’re the digital hardware engineers who have spent their careers at chip companies and startups, building things like embedded systems and integrated circuits, where the parallel processing paradigm has been the norm for years because that is simply how physical devices work. The smartest software companies will snap up as many of these engineers as they can over the next few years and put them to work building software that can take advantage of the computers of the future.

Companies that don’t harness this resource will find themselves disrupted by faster, cheaper, and smarter software from competitors that do.
