> Explain why consumers would even care. The largest consumer
> cameras are 8MP, which is about 2,800 pixels per side.
Do you understand the concept of stitching together images?
A guy at work had an 8 MP camera a few years ago. He has since
upgraded.
> Install 1GB of memory, and disk swapping for consumer apps
> almost disappears, even when you are running multiple apps.
> This seems to be a concept you keep on trying to steer away
> from, as you bring up more far fetched examples using your
> engineering office friends.
I have two gigabytes of memory on my system and no page file, but I certainly use the memory. Disk swapping for consumer apps may disappear, but paging doesn't. Windows XP has the concept of page stealing, inherited from VMS, where memory that isn't frequently used gets paged out to disk.
What does Windows do with the reclaimed memory? It uses it as a disk cache, but you have to touch the files to get them into the cache.
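To make the "touch the files" point concrete, here is a minimal sketch (not XP internals, just the general OS file-cache effect): read the same file twice and compare timings. `timed_read` is a throwaway helper of mine, and note the write itself usually primes the cache, so the contrast is clearest after a reboot or a cache flush.

```python
import os
import tempfile
import time

# Rough sketch: a second read of the same file is usually served from
# the OS file cache in memory rather than from disk.
def timed_read(path):
    start = time.perf_counter()
    with open(path, "rb") as f:
        data = f.read()
    return len(data), time.perf_counter() - start

# Write a 16 MB scratch file to read back.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(16 * 1024 * 1024))
    path = tmp.name

size_cold, t_cold = timed_read(path)  # may already be cached by the write
size_warm, t_warm = timed_read(path)  # almost certainly from the cache
os.remove(path)

print(f"cold: {t_cold * 1000:.1f} ms, warm: {t_warm * 1000:.1f} ms")
```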
Now, I've seen plenty of discussions on notebook forums dealing specifically with hardware upgrades for the antivirus scan issue, and the general consensus is that you're better off with a better disk, not more CPU horsepower.
> If you read it and acknowledge it, then why would you ask if
> consumers need more horse power in Word or Excel?
Because there are more users of Word and Excel than of the other applications.
> The above are all real world, popular, CPU intensive
> applications that need a performance boost more than anything
> else. Yet you are glossing over these while insisting on the
> performance of stitching together 50k x 50k pixel images.
> There are probably only a hand full of people in the world
> who need to do that, and they probably work for NASA and
> Google.
Seems to be pretty common in the Mac world from the articles
that I've read. But which of those consumer applications that
you mentioned do you run? I do work with a lot of large image
files, but of the software on your list I have used none of it
save for WinZip.
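A back-of-the-envelope sketch of why a stitched image at the size quoted above gets serious: assuming a hypothetical 3 bytes per pixel (24-bit RGB, no alpha, no compression), a 50k x 50k stitch is several gigabytes before compression even enters the picture.

```python
# Uncompressed size of an image, assuming 3 bytes per pixel (24-bit RGB).
def raw_size_gb(width, height, bytes_per_pixel=3):
    return width * height * bytes_per_pixel / 1024**3

# A 50k x 50k stitch is around 7 GB uncompressed.
print(f"{raw_size_gb(50_000, 50_000):.1f} GB")  # → 7.0 GB
```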
> I've worked with foil sets of several hundred slides chock
> full of images, sound, animation, and video, and I have
> little problem running it on my 1.6GHz Banias based Centrino
> with 1GB of memory. At its slowest, the presentation lags
> between slides for a split second, but other than that, most
> operations are instantaneous.
There's instantaneous and then there's instantaneous. My image-rendering code can decode about 60 MB in roughly three seconds, and I can usually tell my code from someone else's.
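For what it's worth, the figure above works out to about 20 MB/s; a trivial sketch of the arithmetic (the helper name is mine, and MB here means 2**20 bytes):

```python
# Convert a decode measurement into throughput, with MB = 2**20 bytes.
def throughput_mb_per_s(nbytes, seconds):
    return nbytes / (1024 * 1024) / seconds

# The figure quoted above: ~60 MB decoded in ~3 seconds.
print(throughput_mb_per_s(60 * 1024 * 1024, 3.0))  # → 20.0
```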
But if single-core performance is fine, why bother with dual-core, unless there won't be any single-core machines? Part of my problem with the push to dual-core is that the vast majority of users don't need it. Why not just wait until 64-bit is out the door?
> The benchmarks were done in 32-bit, because 64-bit binaries
> don't exist for any of them, which is one of my big points.
> Multithreaded software is here, while 64-bit software isn't!
Software writers write to the biggest market, and that's 32-bit at the moment. But it will be 64-bit, and by sticking with an obsolete platform you lock yourself out of the performance and future feature improvements of that software.
> How much is an 80GB Ramdisk? I am guessing prohibitively
> expensive.
Irrelevant. You asked me to find a scenario and I did. Don't
ask engineers to come up with something that you think can't
be done.