This is a very basic question, I know, but still.
All my calculations rest on the same conversion: whenever I convert a number of bytes into kilobytes, and then into megabytes, I use a conversion ratio of 1024. I use this conversion to calculate various delays, rates, etc.
Now the same calculations are being done elsewhere using 1000 instead of 1024. This has produced minor discrepancies in my results, since they don't match the figures of others who use 1000 instead of 1024. Now I am having heated discussions with several people about the validity of my calculations!
So which is correct: 1024 bytes = 1 kilobyte, or 1000 bytes = 1 kilobyte?
I guess I'd go with 1024 any day of the week! For one thing, it's the exact way of calculating delay, because that is what networking hardware is designed to understand and work with. And secondly, using 1000 to approximate such calculations when you're working with milliseconds of routing delay and microseconds of switching delay isn't, I think, the best way to get the right numbers; whatever you do, the figure is never an exact estimate of what the hardware is expected to do, as per its design.
However, if you want an easy way of doing the maths in your head to get a rough count of whatever data there is to read, 1000 is far easier, but only if you don't care how your calculations exactly impact everything downstream, and that may vary from case to case. I guess both you and your friends are right in your own ways; how much you care about the exact figures, and how they influence your end result, is what determines which one you choose.
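For what it's worth, the gap between the two conventions is easy to see in a few lines. This is just my own illustration (the function name and sample size are made up for the example):

```python
def to_megabytes(num_bytes, base):
    """Convert a raw byte count to 'megabytes' using the given base (1000 or 1024)."""
    return num_bytes / (base ** 2)

size = 5_000_000  # bytes

print(to_megabytes(size, 1024))  # binary convention: ~4.768 (i.e. 5,000,000 / 1,048,576)
print(to_megabytes(size, 1000))  # decimal convention: exactly 5.0
```

Same byte count, two different "megabyte" figures, roughly 4.9% apart at the mega level.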
Thanks DaLight, that's exactly the definition I showed the people who favour using 1000 instead of 1024 as the conversion ratio. I had a hard time convincing them that for delays, bit rates, speeds, etc. in networking, we can only arrive at exact results using 1024; 1000 will only give us approximations, and as the values get multiplied or divided against one another, the inaccuracy from using 1000 instead of 1024 compounds.
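To put a number on that compounding: each additional prefix level converted with 1000 instead of 1024 multiplies the discrepancy by another factor of 1.024. A quick sketch, assuming you convert one prefix level at a time:

```python
# Each level (KB, MB, GB, ...) divides by 1000 where the binary convention
# divides by 1024, so the decimal result overstates the binary one by
# a factor of (1024/1000)**n after n levels.
for n, unit in enumerate(["KB", "MB", "GB", "TB"], start=1):
    ratio = (1024 / 1000) ** n
    print(f"{unit}: 1000-based figure is {(ratio - 1) * 100:.1f}% higher than the 1024-based one")
```

So a 2.4% gap at the kilo level has grown to roughly 7.4% by the giga level.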
That's the definition I was going to propose too. You can only use 1000 when speaking, or when doing a very quick calculation. Using 1000 instead of 1024 in crucial calculations is unacceptable.
It's the old scientific chestnut of using the correct units, isn't it? We're dealing with kilobytes and megabytes, as opposed to thousands and millions of some other quantity. And a byte is historically eight bits, so you're automatically constrained to think in a number base other than ten if you want to be accurate. Calculations using 1000 can only be relied on as approximations; if you want the real answer, it has to be 1024. Or am I just a pedant?