Can someone please answer for me why this assertion fails
XCTAssertEqualWithAccuracy (1.56, 1.57, .01, @"");
while this one works
XCTAssertEqualWithAccuracy (1.56, 1.57, .02, @"");
I would think that 1.56 is within ±0.01 of 1.57, so it shouldn't fail.
Your thinking is wrong: you reason in decimal, but the machine uses binary floating-point. In floating-point arithmetic, 1.56 is more than 0.01 away from 1.57, because none of these numbers can be represented exactly in binary. Try this:
#define printf(...) CFShow([NSString stringWithFormat:@""__VA_ARGS__]) // makes it look like C
printf("1.56 = %.20f", 1.56);
printf("1.57 = %.20f", 1.57);
printf("0.01 = %.20f", 0.01);
printf("1.57 - 1.56 = %.20f", 1.57 - 1.56);
printf("(1.57 - 1.56) - 0.01 = %.20f", (1.57 - 1.56) - 0.01);
then marvel at your console, which shows:
1.56 = 1.56000000000000005329
1.57 = 1.57000000000000006217
0.01 = 0.01000000000000000021
1.57 - 1.56 = 0.01000000000000000888
(1.57 - 1.56) - 0.01 = 0.00000000000000000867
If you wonder why, just google "floating point"; you'll find many excellent explanations of one of the basics of computing.