Darryl Gregorash wrote:
Sorry, I'm a physicist. A gigabyte is a billion (milliard in Europe) bytes, and that is 1 GB. When the international scientific community makes an announcement to the contrary, I'll buy into this, but I can assure you, "G" will always mean 10^9.
I thought it meant 9.8 m s^-2 :)
If programmers want to try to undo the mess they've made, let *them* create a new, different notation for their concoction. Otherwise, they make no more sense than the mother who insisted her son, the brand-new soldier, was not out of step; it was the rest of the army that was out of step.
It doesn't really cause a problem in most cases, since just about everything in the world of computers is measured in powers of 2; the problem only arises because hard drive makers decided to be different. And I don't think they did it out of deference to ISO/SI; they did it for marketing reasons (hey, we get to call a roughly 150GB drive a 160GB one!)
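For anyone who wants to check the marketing arithmetic, here's a quick back-of-the-envelope sketch (Python, purely illustrative): a drive sold as "160 GB" in decimal units comes out to roughly 149 GB once you divide by 2^30 the way the OS does.

    # What the box says: 160 GB in decimal (SI) units
    decimal_bytes = 160 * 10**9
    # What the OS reports: the same byte count divided by 2^30 (binary "GB", i.e. GiB)
    binary_gb = decimal_bytes / 2**30
    print(f"{binary_gb:.1f}")  # ~149.0

So the same physical drive looks about 7% bigger on the label than in the file manager, which is the whole point of the complaint above.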