[nSLUG] 1, 2 or many? -- how many cores will you want on your desktop?

Rich budman85 at eastlink.ca
Sat Mar 24 13:02:06 ADT 2007


On Fri, 2007-03-23 at 19:43 -0300, George N. White III wrote:
> A number of CS profs have been commenting on the difficulties of
> writing software for highly parallel hardware.   Here is an interesting
> discussion (by a Canadian author):
> 
> <http://www.hpcwire.com/hpc/1332461.html>
> 
> Others have noted that existing PC architectures don't properly support
> multiprocessors, so the answer may depend on what you can get with
> your processor (chipsets, memory architectures).
> 

Great article, thanks for sharing the site. :)

Until systems stop being designed for cheapness and maximum profit, we
will never see hardware at its best.  Unfortunately, money is money. :)

I wouldn't bother investing in a multi-processor board, simply because
the software does not always handle it properly.  Just research some
articles on SMP and that should help you understand why we see
multi-core today.

I've seen it at work, where single-core multi-processor machines are
rapidly being replaced by dual-core boards.  Is there a massive speed
increase?  Hmm... since the machines are being replaced with their
equivalents, in my opinion the multi-processors were faster.  The
application that pegs these machines at 100% CPU for 8 hours does
extreme data forecasting.  It uses parallel distributed processing: a
master machine assigns and distributes load across many slave
machines.  In the initial runs, this application helped track down a
few kernel scheduling issues that Red Hat helped address, and firmware
issues that IBM helped fix in its microcode.
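Just to illustrate the master/slave pattern I mean: a minimal sketch
using Python's multiprocessing.  The forecast_chunk function and the
input data are hypothetical stand-ins for the real workload, not the
actual application.

```python
# Minimal master/worker sketch: the "master" splits the data set and
# hands each slice to a pool of "slave" worker processes.
from multiprocessing import Pool

def forecast_chunk(chunk):
    # Hypothetical placeholder for the CPU-heavy forecasting work
    # done on one slice of the data.
    return sum(x * x for x in chunk)

def run_master(data, n_workers=4, chunk_size=3):
    # Split the data into fixed-size slices for distribution.
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    # Distribute the slices across the worker pool.
    with Pool(processes=n_workers) as pool:
        results = pool.map(forecast_chunk, chunks)
    # Combine the partial results from the workers.
    return sum(results)

if __name__ == "__main__":
    print(run_master(list(range(12))))
```

In the real system the workers are separate machines rather than local
processes, but the shape is the same: partition, distribute, combine.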

Now the machines are running solid.  However, is dual-core faster?
Maybe: if it allows us to pack more processors per machine, then I can
see where it will gain some speed.

Where I used to work, I had a friend who was a physicist in the R&D
department.  He did a lot of work on optics with CDs and lasers, and
proved negative light (hope I got that right), where it was once
thought that light stopped or zeroed out at a certain point (the
optimal point for reading CDs).  We were discussing light one day when
he told me about the limits of copper.  He said we won't see any major
advances in speed until copper is eliminated and only light is used.
The reason, he said, is that electricity is like water: the farther it
has to go, the more power it takes to push it that distance.  Copper is
the channelling system for the water.  He said the current trend of
reducing form factor will help, but again... copper is slowing us
down.  So when that day comes, when everything is optical, we will
finally begin to see unheard-of processing speeds.


I agree with his analysis.  I hope I have recalled it correctly; it was
7 or 8 years ago. :)



Regards,
Rich




