
February 15, 2005


Anonymous Coward

Is there a biological barrier to 64-bit? I mean, will I, a normal computer user, need 64-bit? Or is 64-bit the domain of server applications? Yeah, I know it'll make Photoshop run faster, but do I need that extra speed? Who would want or need 64-bit?


AC, that's a good question, one that was asked by users running 16-bit systems shortly after the advent of 32-bit. Years from now, you'll ask yourself why anybody would still run 32-bit.


I don't think the move to 64-bit will be as much of an advantage for the common user as the jump from 16-bit to 32-bit was. The main advantage of 64-bit is being able to address more than 4 gigabytes of memory without serious swapping or workarounds. The leap in address space to 64-bit (2^64 versus 2^32 bytes) is orders of magnitude larger than the one from 16-bit to 32-bit.

Most of the applications that need to address that much memory (databases, scientific computing, video processing, high-resolution image manipulation, application servers) can already take advantage of it on high-end servers and workstations. Alpha has been around for a long time, Sun is still around, and now you can pick up a PowerMac G5 or an Athlon 64 and set up a cheap little 64-bit box with Linux or Solaris. When 64-bit chips start becoming common in the average Joe's PC, he's not going to notice much difference at all, and possibly even a decrease in performance due to the doubled size of the pointers.

By the time I can order a computer with 4+ GB RAM for under $1000, I'm sure solitaire will be able to fully take advantage of that extra memory. As the old saying goes, "any program will expand to fill available memory." Or as JWZ said, "Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can." Developers will always find a new way to use/waste memory.

My guess is this will lead to higher-level languages replacing more and more old low-level code. Applications will be developed faster and at the same time be more maintainable. I think this may have something to do with your post "What happened to OS research?". As you stated, operating systems have become mostly stable, providing a solid base for the next step. The majority of innovation is being done at a higher level, in the application layer. The research that was being done on operating systems has moved to language design and automatic memory management. Managed code allows developers to work on the actual function of an application rather than needing to control every tiny detail. This process of building on the stable foundation will only lead us to higher and higher levels of abstraction. Base components will continue to get minor tweaks or the occasional reimplementation, but the majority of new design will be on the upper layers, tossing around these components.


Am I correct in thinking memory is miles behind processor technology, and that this will affect 64-bit processors as well? I'm thinking of L1 and L2 caches, although I could be getting it completely wrong!


David, well-stated analysis. I would only argue that 64-bit will enable a whole new class of user-mode applications that are as yet unknown. You are entirely correct that the most obvious short-term beneficiaries of 64-bit will be server applications like SQL Server and Oracle, which, as you mention, are already running on 64-bit today on high-end machines. There will be powerful low-end machines too once consumer-grade 64-bit computers become widely available.

Let's also not forget that there will be marked performance and stability improvements in certain classes of current 32-bit client applications as well when they are ported to 64-bit.

Software is generally way ahead of hardware, and those of us in the software world often find ourselves limited by what hardware enables for us. With the advent of 64-bit, software has a wide-open field to play in, much of it unexplored...

It will be really interesting to see what new ideas and subsequent breeds of desktop applications arise from the possibilities 64-bit opens up. It's quite compelling.


Chris, if you had mentioned hard drives as a limiting factor, I would definitely agree with you. You could argue that the size of the on-chip caches is lagging, but I don't think the speed is a problem. The biggest single bottleneck in most computers is disk I/O. Disk access takes orders of magnitude longer than a simple memory access. Larger caches, or even more layers in the cache hierarchy such as L3, can lessen the impact of memory accesses and cache misses, but the caches have to get loaded from disk at some point, usually on the first cache or RAM miss. Most hard drives now include caches of their own to provide another buffer in the memory hierarchy. I/O will become even more of a bottleneck as multicore CPUs and multiprocessor systems become more commonplace and each processor fights over the available resources.

The main problem with hard disks is that they are approaching the physical limits of density, and of how fast they can reliably operate before physics kicks in and a head crash leaves you with scattered bits.

The move to 64-bit processors will make room for more solid-state RAM drives. I/O-bound applications would benefit from a solid-state RAM filesystem: non-transactional databases like data warehouses (transactional databases need to persist their data and logs in case of failure), application compilation (reading and writing many generated files), and high-resolution video editing.

Eric boutin

I can't believe it... really.

You guys are really talking about "what will be the advantage of 64-bit computing"?
You're just 10 years late. Sun, SGI, and Nintendo have seen the 64-bit world as an advantage for years and continue to do so.
While research on 128-bit processors is beginning, you are wondering what advantage a 10-year-old technology brings us.

And drive I/O isn't a factor at all; an FC-AL array in RAID 5 can produce a hell of a lot of I/O.

James Dickens

Well, your question is only 10 years too late. The 64-bit CPU was introduced more than 10 years ago; the MIPS R4000 family as well as the UltraSPARC CPUs are both mainstream CPUs that have been available for more than 10 years now. Full-fledged virtualization is not yet available in 64-bit mode, but it has been available for a few years now in 32-bit mode, and for most people it merely gives them the ability to run multiple operating systems on one machine.

For the majority of people, the jump to 64-bit will mean little, since the 32-bit processor architecture already provided 4 GB of RAM. While some OSes provide this per task, other lesser OSes only provided a shared 4 GB for all tasks. Perhaps the real advancement would be a mainstream OS that better utilizes the processing power already available to it, instead of moving to a larger 64-bit architecture that carries the extra bloat of doubling the size of every pointer.
