On Sunday 20 November 2005 09:11, Anders Johansson wrote:
Darryl Gregorash wrote:
Sorry, I'm a physicist. A gigabyte is a billion (milliard in Europe) bytes, and that is 1 GB. When the international scientific community makes an announcement to the contrary, I'll buy into this, but I can assure you, "G" will always mean 10^9.
I thought it meant 9.8 ms^-2 :)
If programmers want to try to undo the mess they've made, let *them* create a new, different notation for their concoction. Otherwise, they make no more sense than the mother who complained that her son, the brand-new soldier, was not out of step; it was the rest of the army that was out of step.
It doesn't really cause a problem in most cases, since just about everything in the world of computers is measured in powers of 2; the problem only arises because hard drive makers decided to be different. And I don't think they did it out of deference to ISO/SI; they did it for marketing reasons (hey, we get to call a 150 GB drive 160 GB!).
Actually, all line speeds, like bandwidth, are measured in base 10. The fact is that computer technology is a hodge-podge of base 10 and base 2. The base 2 stuff is a legacy of memory, where all the data was on a binary tree. This does not have any bearing on hard drives, CPU speed, bus speed, etc.

This confusing situation is why we now have KiB, MiB, GiB, TiB, etc. I already had a patch accepted for KDE 4.0 to follow this standard. So as of KDE 4.0, if it says GiB, you know it means base 2, and if it says GB, you know it means base 10. This is not a panacea, due to really weird things like 1.44 MB floppies, but it is a step in the right direction.

Mark
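For concreteness, here is a minimal sketch (Python, not from the thread) of the arithmetic being discussed: the same byte count expressed in decimal (SI) gigabytes and binary (IEC) gibibytes, showing why a drive marketed as "160 GB" shows up as roughly 149 GiB.

```python
# SI decimal prefix: 1 GB = 10^9 bytes (what drive makers advertise)
def to_gb(nbytes: int) -> float:
    return nbytes / 10**9

# IEC binary prefix: 1 GiB = 2^30 bytes (what base-2 tools report)
def to_gib(nbytes: int) -> float:
    return nbytes / 2**30

# A drive sold as "160 GB" actually holds 160 * 10^9 bytes.
drive = 160 * 10**9
print(f"{to_gb(drive):.1f} GB == {to_gib(drive):.2f} GiB")
# -> 160.0 GB == 149.01 GiB
```

The roughly 7% gap between the two figures is exactly the discrepancy the posters are arguing about, and it grows with each prefix step (KiB vs kB is 2.4%, TiB vs TB is about 10%).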