Darryl Gregorash wrote:
> Sorry, I'm a physicist. A gigabyte is a billion (milliard in Europe) bytes, and that is 1 GB. When the international scientific community makes an announcement to the contrary, I'll buy into this, but I can assure you, "G" will always mean 10^9.
I thought it meant 9.8 m/s^2 :) No, that's "g"; "G" is 6.67*10^-11 N m^2/kg^2. You just flunked Physics 101 :)
If programmers want to try to undo the mess they've made....
On 11/20/2005 11:11 AM, Anders Johansson wrote:
> It doesn't really cause a problem in most cases, since just about everything in the world of computers is measured in powers of 2

Does that mean, if I win a one-megabuck lottery, after I've deposited the cheque into the bank's computers, I suddenly have an extra 48 thousand dollars? :)
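For anyone who wants to check the joke's arithmetic, here's a quick sketch (variable names are mine, just for illustration): if "mega" meant 2^20 instead of the SI 10^6, a one-megabuck deposit would quietly gain about 48 thousand dollars.

```python
# SI "mega" vs. the binary "mega" computers traditionally use
decimal_mega = 10**6   # 1,000,000 (what the lottery pays)
binary_mega = 2**20    # 1,048,576 (what the bank's computer allegedly stores)

windfall = binary_mega - decimal_mega
print(windfall)  # 48576 -- the "extra 48 thousand dollars"
```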
> , the problem only arises because hard drive makers decided to be different. And I don't think they did it out of deference to ISO/SI; they did it for marketing reasons (hey, we get to call a 150GB drive 160GB!)

I am certain they didn't, but they still got it right.
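The marketing point is easy to verify with a little arithmetic (a minimal sketch, assuming the drive maker counts in SI units): a drive sold as "160 GB" holds 160 * 10^9 bytes, which is only about 149 binary gigabytes (GiB).

```python
# A "160 GB" drive, as the manufacturer counts it (SI: 10**9 bytes per GB)
bytes_on_drive = 160 * 10**9

# The same capacity expressed in binary gigabytes (GiB: 2**30 bytes each)
gib = bytes_on_drive / 2**30
print(round(gib, 1))  # 149.0 -- roughly the "150GB" the OS would report
```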