In one of my blogs a person asked me, “Can you teach a person to be a programmer within 6 months?” I answered, “I can make a programmer out of any person within two weeks, but there is a chance that he’ll be asking questions like this: http://m.hotpot.hk/story.php?id=15689”.
I shared the above link with our software developers in the Skype chat. Some people laughed. One person responded with a popular link to a presentation that makes fun of JavaScript:
Another guy responded with this question:
var a = 0.1;
a = a + a + a;
(a - 0.3 == 0) // false or true?
After years on Wall Street, this was an easy one: "Of course, false!" Floating-point precision makes the results unpredictable. We use BigDecimal. I've created a little fiddle for you. Just follow this link and press Run to see for yourself: http://jsfiddle.net/4nwdv/
For those who, after running this fiddle, say "WTF!", here are some details. I ran the same code in the JavaScript console in Chrome Developer Tools:
So Google Chrome's JavaScript engine truly believes that
0.1 + 0.1 + 0.1 = 0.30000000000000004
Maybe if you run it in Firefox the result will be different? Nope. I ran it in Firebug's JavaScript console, which confirmed that 0.1 + 0.1 + 0.1 = 0.30000000000000004.
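If you don't have a BigDecimal at hand, the usual JavaScript fix is to compare with a small tolerance instead of testing for exact equality. A minimal sketch (the 1e-9 epsilon here is my arbitrary choice, not something from the article; newer engines also expose Number.EPSILON):

```javascript
var a = 0.1;
a = a + a + a;

console.log(a - 0.3 === 0);  // false: a is actually 0.30000000000000004

// compare with a small tolerance instead of exact equality
var EPSILON = 1e-9;
console.log(Math.abs(a - 0.3) < EPSILON);  // true
```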
By now, only the person who forgot to take his morning pill wouldn't agree that this is a language problem and JavaScript is bad. What's good then? That's another easy question: "Java and only Java!" Most Wall Street applications are written in Java, and they crunch numbers really well! Let's see the result of the same arithmetic in Java. I wrote a little program, ran it in the debugger, and put a breakpoint right after the variable got its new value. Man, the result is the same as in JavaScript!
Just to complete the program, I pressed the green Resume button to see the result of a - 0.3 on the console. Well, it's not exactly what I expected to see, but pretty damn close, isn't it?
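JavaScript has no built-in BigDecimal, but a common workaround for money math is to scale everything to integer cents, where addition stays exact (up to 2^53). A quick sketch of that idea, not anything from the article's fiddle:

```javascript
// 0.1 dollars expressed as 10 integer cents; integer addition is exact
var cents = 10 + 10 + 10;
console.log(cents - 30 === 0);  // true
console.log(cents / 100);       // 0.3, converted back only for display
```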
This little experiment shows that the demand for software developers will only increase: while the regular Joe believes that (0.1 + 0.1 + 0.1) - 0.3 = 0, a savvy software developer wouldn't be so sure, because it depends…
I'd appreciate it if you'd run the same tests in other programming languages and share your findings. Together we can make the world a better (or at least more definitive) place!
Why do JS and Java give the same result?
Easy: the IEEE floating-point standard (754, IIRC).
Any language using IEEE doubles must give the same result.
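The root cause is that 0.1 has no finite binary representation, so the stored double is slightly off; the error usually hides behind rounding on output. Any IEEE-754 language can reveal the stored value — here's one way in JavaScript:

```javascript
// the double closest to 0.1 is slightly larger than 0.1
console.log((0.1).toPrecision(21));  // 0.100000000000000005551
// the double closest to 0.3 is slightly smaller, so the sum and the literal differ
console.log((0.3).toPrecision(21));  // 0.299999999999999988898
```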
Floating-point math has always been a fun thing; the Pentium FDIV bug is a notable example.
Let’s try something more mundane and trustworthy, like integer addition?
Consider the following code:
int i = 5;
i = i++ + ++i;
print(i);
C++: http://ideone.com/N55I4i
JavaScript: http://jsfiddle.net/EN8un/
Testing on different compilers yielded results ranging from 11 to 20.
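That spread is expected: in C++, modifying i twice in one expression without a sequence point is undefined behavior, so each compiler is free to do something different. JavaScript, by contrast, fully specifies left-to-right evaluation, so every engine must agree. A sketch of the JS reasoning:

```javascript
var i = 5;
// i++ yields 5 and bumps i to 6; ++i then bumps i to 7 and yields 7
i = i++ + ++i;   // 5 + 7
console.log(i);  // 12
```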
because I have a colleague whose name is Yakov 🙂
CYGWIN_NT-6.1-WOW64 1.7.17(0.262/5/3) 2012-10-19 14:39 i686 Cygwin
#include <iostream>
using namespace std;

int main()
{
    cout << "TestArithmetic ----------" << endl;
    double a = 0.1;
    a = a + a + a;
    cout << "a - 0.3 = " << a - 0.3 << endl;
    return 0;
}

Python:

>>> a = 0.1
>>> a = a + a + a
>>> print 'a - 0.3 = {0}'.format(a - 0.3)
a - 0.3 = 5.55111512313e-17
AS3:
http://www.evernote.com/shard/s44/sh/3431e0d2-ef95-480b-b526-f4311eedfc61/d9b07ae47c3a69ac9bc1e9e155a50b25
This is another example in JS:
> x = [4,8,15,16,32,42]
[4, 8, 15, 16, 32, 42]
> x.sort()
[15, 16, 32, 4, 42, 8]
Alex, please read this doc: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/sort
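For reference, Array.prototype.sort without arguments converts elements to strings and compares them lexicographically ("15" sorts before "4"), which is why the numbers came back scrambled. Passing a numeric comparator gives the expected order:

```javascript
var x = [4, 8, 15, 16, 32, 42];

// default sort compares string representations
console.log(x.slice().sort());  // [15, 16, 32, 4, 42, 8]

// a numeric comparator sorts by value
console.log(x.slice().sort(function (a, b) { return a - b; }));  // [4, 8, 15, 16, 32, 42]
```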