Hans du Plooy wrote:
> You are assuming that image quality has not improved at all in the last seven years, but it has.
What I'm assuming is that each of us has unique eyes, and that the OP may be unable to detect any improvement newer hardware offers, or may find that whatever difference is detected does not justify the investment.
> Entry level graphics cards these days are cheap, and even the cheapest that nVidia and ATi offer have image quality far superior to any but the most expensive cards of seven years ago. Also, features like antialiasing that cards can do in hardware these days have to be passed off to the CPU on those older cards.
Not everyone has teenage eyes. Some of us can't detect any difference between 16 bit and 24 or 32 bit color. Antialiasing is pointless with the larger fonts used at high resolutions. New hardware may or may not provide a perceptible difference. Most of today's CPUs not used for gaming have more than ample reserve cycles for video, spending most of their time waiting on I/O.
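For perspective, here's a quick back-of-the-envelope sketch (plain Python, purely illustrative, assuming straight RGB packing) of the raw color counts behind those depths:

    # Illustrative only: distinct colors representable at each depth,
    # assuming plain RGB packing (16-bit as 5-6-5, 24-bit as 8-8-8).
    for name, bits in (("16-bit (5-6-5)", 16), ("24-bit (8-8-8)", 24)):
        print(f"{name}: {2**bits:,} colors")
    # 16-bit (5-6-5): 65,536 colors
    # 24-bit (8-8-8): 16,777,216 colors

Whether those extra millions of shades register on a given pair of eyes is another matter entirely.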
> The OP said they use graphics software and are buying a new LCD display. That, I would think, is good justification to get a graphics card that is a bit more up to date.
I believe testing your existing equipment with the new display before spending money on an improvement you might never perceive is a prudent plan.
-- 
"I can do all things through Him who gives me strength." Philippians 4:13 NIV

Team OS/2 ** Reg. Linux User #211409

Felix Miata *** http://members.ij.net/mrmazda/