Mozilla making the Web a gaming platform with Unreal 3 engine in a browser

March 29, 2013

One fairly natural example: network speeds are usually measured on the 1000 scale (10 Mbps, 100 Mbps, 1 Gbps, 10 Gbps for Ethernet, and similar for other networking standards), whereas in-system quantities like file size and RAM are on the 1024 scale. Granted, those network figures are in bits, whereas things measured in bytes tend to use the 1024 scale. This causes confusion when, for example, downloading a file: I know the file size in KB/MB and the line speed in Mbps, so I have to account not only for bits vs. bytes but also for the difference in unit "size".

This has a number of consequences:

- Less knowledgeable users become confused and ultimately don't know what to expect. This can lead to distrust of the gadget/system and/or the people involved.
- Companies have a much easier time getting away with shenanigans by playing off the misunderstanding. Not just hard drive manufacturers, but also ISPs, RAM makers, vendors quoting disk space requirements, etc. Everyone can play fast and loose and be "safe" blaming it on the conversion.
- The margin of vagueness increases: it's yet another layer of uncertainty on top of any estimate when you don't know with absolute certainty which scale your raw data used.

Since the two worlds interact with each other frequently, this is an inconvenient conversion to have to make. It's not hard, of course; it's just annoying. But really, that's kinda the whole point: *any* consistent unit system will work; it might just be annoying in usage or conversion. This is in contrast to the whole metric vs. imperial debate, because (IMO) those two rarely meet. It happens, but generally most people and countries pick one and stick with it. In the 1000 vs. 1024 situation, you pretty much have to deal with both all the time.

In the end, wouldn't it be a heck of a lot easier all around if we all just standardized on bits (on the 1000 scale) for everything? This settles the whole SI-prefix debate naturally (kilo/mega/giga being used properly) and eliminates the factor-of-8 conversion between bytes on disk and bits on the wire. "This file is 100 megabits, my network is 1 gigabit/sec, so it should take 0.1 seconds plus overhead to transfer." How awesome would it be if the only conversion you ever had to do was moving the decimal point around?
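To make the annoyance concrete, here's a minimal sketch in Python, with made-up but typical numbers, of the double conversion described above: a file size reported in 1024-scale bytes against a link speed sold in 1000-scale bits, plus what the same estimate would look like in the proposed all-bits, 1000-scale world.

```python
# Hypothetical numbers: estimating how long a 700 MB file takes over a 100 Mbps link.

FILE_MB = 700      # file size as reported by the OS: 700 * 1024 * 1024 bytes
LINK_MBPS = 100    # line speed as sold by the ISP: 100 * 1000 * 1000 bits/s

file_bits = FILE_MB * 1024 * 1024 * 8   # bytes use the 1024 scale, times 8 bits/byte
link_bps = LINK_MBPS * 1000 * 1000      # network rates use the 1000 scale

print(f"correct estimate:  {file_bits / link_bps:.1f} s")       # ~58.7 s

# The tempting-but-wrong shortcut: treat the two prefixes as the same
# and just divide, forgetting the 8x and the 1000-vs-1024 mismatch.
print(f"naive estimate:    {FILE_MB / LINK_MBPS:.1f} s")        # 7.0 s, off by ~8.4x

# In an all-bits, 1000-scale world, the same file would simply be labeled
# ~5872 megabits, and the estimate is just a division of like units.
file_megabits = file_bits / 1_000_000
print(f"all-bits estimate: {file_megabits / LINK_MBPS:.1f} s")  # same ~58.7 s, no factor of 8
```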
