C# double addition - strange behaviour

Posted 2019-01-27 06:38

Question:

using System;
using System.Collections.Generic;
using System.Linq;

public class Program
{
    public static void Main()
    {
        Dictionary<string, double> values = new Dictionary<string, double>();
        values.Add("a", 0.002);
        values.Add("b", 0.003);
        values.Add("c", 0.012);

        // Summing iteratively.
        double v1 = 615.0;
        foreach (KeyValuePair<string, double> kp in values)
        {
            v1 += kp.Value;
        }

        Console.WriteLine(v1);

        // Summing using the Sum method.
        double v2 = 615.0;
        v2 += values.Values.Sum();

        Console.WriteLine(v2);

        Console.ReadLine();
    }
}

When I look at the value of v1 in the debugger it gives a value of 615.01699999999994 but for v2 it gives a value of 615.017. For some reason the Sum method yields an accurate result whereas summing them iteratively does not. (When I print the two values they are the same, but I presume this is due to some rounding that the WriteLine method does.)
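A minimal sketch to check that presumption, appended to the end of Main: the "G17" format prints enough digits to round-trip a double, whereas the default double formatting on .NET Framework rounds to 15 significant digits.

// Assumption: running on .NET Framework, where double.ToString() rounds to 15 significant digits.
// "G17" prints the full round-trippable value, so the two sums should show up differently here.
Console.WriteLine(v1.ToString("G17")); // 615.01699999999994 (matches the debugger)
Console.WriteLine(v2.ToString("G17")); // 615.017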

Anyone know what is going on here?

Answer 1:

Floating-point math is not exact; every operation can introduce rounding error, and the order in which you add the numbers together affects how much error accumulates. If it's important for these calculations to be exact, you should use decimal, not double.
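For example, a minimal sketch (not part of the original answer) of the same sum done with decimal, which stores these base-10 values exactly:

using System;
using System.Collections.Generic;
using System.Linq;

class DecimalSum
{
    static void Main()
    {
        // decimal stores base-10 digits, so 0.002, 0.003 and 0.012 are represented exactly.
        var values = new List<decimal> { 0.002m, 0.003m, 0.012m };

        decimal total = 615.0m;
        foreach (decimal v in values)
        {
            total += v;
        }

        Console.WriteLine(total);                 // 615.017
        Console.WriteLine(615.0m + values.Sum()); // 615.017, same result either way
    }
}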

This doesn't have anything to do with using Sum vs. manually summing the data. In the first case you add each number to 615 as you go; in the second you add all of the small numbers together and then add that total to 615. It's a different ordering of the same additions, and depending on the numbers involved, either method could end up with more or less error.
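A small sketch of that ordering difference with the same numbers; the commented outputs are the values reported in the question, and exact results can vary by runtime:

using System;

class OrderingSketch
{
    static void Main()
    {
        // Same three addends, grouped two ways.
        double iterative = ((615.0 + 0.002) + 0.003) + 0.012;   // add each value to 615 as you go
        double grouped   = 615.0 + (0.002 + 0.003 + 0.012);     // sum the small values first, like Sum()

        // "G17" prints enough digits to round-trip a double.
        Console.WriteLine(iterative.ToString("G17")); // 615.01699999999994 in the question's debugger
        Console.WriteLine(grouped.ToString("G17"));   // 615.017
    }
}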



Answer 2:

The problem with double/float is that they are stored as binary numbers internally (e.g. 1000110.10101001), so most decimal fractions can only be stored as an approximation of the value you wrote.
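A small illustrative sketch (not from the original answer) that shows the binary bit pattern actually stored for 0.002:

using System;

class BitsSketch
{
    static void Main()
    {
        // A double is a 64-bit base-2 value (sign, exponent, mantissa), so a decimal
        // fraction like 0.002 is rounded to the nearest representable binary number.
        double d = 0.002;
        long bits = BitConverter.DoubleToInt64Bits(d);
        Console.WriteLine(Convert.ToString(bits, 2).PadLeft(64, '0'));

        // decimal stores base-10 digits, so 0.002m really is 0.002.
        Console.WriteLine(0.002m);
    }
}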

Read Jon Skeet's explanation: Difference between decimal, float and double in .NET?