I was testing some of my JavaScript code: adding .1 + .2 gives me .30000000000000004 instead of .3, which I don't understand. But when I add .1 + .3 it gives me .4. I googled it and found it's something about double-precision addition, but I don't know what that is.
Answer 1:
Here's the obligatory link: What Every Computer Scientist Should Know About Floating-Point Arithmetic
Basically, there are many base 10 numbers that cannot be exactly represented in the floating point format used by most computers, so you'll get issues like the ones you highlight.
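You can see this directly in a browser console or Node.js. The sketch below uses only standard `Number` methods; `toPrecision` reveals the approximation that is actually stored for 0.1:

```javascript
// 0.1 and 0.2 cannot be represented exactly as binary doubles,
// so their sum is slightly off from the double nearest to 0.3:
console.log(0.1 + 0.2);         // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3); // false

// 0.1 + 0.3 happens to round to exactly the double nearest to 0.4:
console.log(0.1 + 0.3 === 0.4); // true

// The value actually stored for 0.1, shown to 20 significant digits:
console.log((0.1).toPrecision(20)); // "0.10000000000000000555"
```

So the "error" isn't in the addition itself; the inputs were already approximations before the `+` ever ran.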
Answer 2:
If you can't stay awake for What Every Computer Scientist Should Know About Floating-Point Arithmetic, try instead the javascript-specific Rounding in JavaScript.
Answer 3:
Floating point numbers have a finite amount of precision, as the number is stored in a finite number of bits.
The number you are trying to store can't be stored accurately, so an approximation is used.
See What Every Computer Scientist Should Know About Floating-Point Arithmetic.