Bryan J. Smith wrote:
On Wed, 2006-05-24 at 13:13 -0300, Alvaro Kuolas wrote:
GPUs are important today, and they will be even more so in the future. It's irrational how the market is turning; we need the same control over the GPU as we have over the CPU.
But the problem is that GPUs are _not_ like CPUs!
It's not like you just have a base CPU ISA and add a few extensions every few years. Each and _every_ GPU generation is tuned with its own libraries and software for performance.
It's as if CPUs came out with a radically _new_ design every 9-12 months that was _slow_ if you just used the same ISA from 9-12 months earlier.
That's GPUs in a nutshell.
Yes, it's true, it's like a new CPU architecture on your AGP slot. But they don't change that fast. NVIDIA kept the TNT architecture from, well, the TNT through the GeForce2. The GeForce3 was new, because of the vertex programs and pixel shaders; the GeForce4 was a "powered up" GeForce3. The next new architecture came with the GeForce FX, which was said to be a VLIW design... a short-lived one. Now it's the NV4x cards, and the G70 IS an NV4x. For ATI it's the same: the Radeon was R100, the Radeon 8500 was R200, the Radeon 9700 was R300, and then came R4xx. How many years did the R300/R4xx architecture last?

Now here we need to define what the act of buying hardware means. What is buying hardware? For me the idea is: I bought hardware, not a service. Even more so if it's programmable: I want to use it to its full extent. If not, it's a service, and I buy it to do its job in a locked-down way.

In reality, computers are not like cars. Too many patents really hurt the economy. And if they only want a 12-month head start, why does a patent grant 20 years of ownership over a hardware/software-based model?