NSDecimalNumber accuracy and compare
In the following test code, the assert fails:
NSDecimalNumber *num1 = [[NSDecimalNumber alloc] initWithInteger:1];
NSDecimalNumber *num2 = [[NSDecimalNumber alloc] initWithInteger:6];
NSDecimalNumber *num3 = [[NSDecimalNumber alloc] initWithInteger:3];
NSDecimalNumber *result1 = [num1 decimalNumberByDividingBy:num2]; // 1 ÷ 6
NSDecimalNumber *result2 = [result1 decimalNumberByMultiplyingBy:num3]; // 1 ÷ 6 × 3 = 0.5
NSDecimalNumber *num4 = [[NSDecimalNumber alloc] initWithFloat:0.5];
NSAssert([result2 compare:num4] == NSOrderedSame, @"Math error");
I understand this is a precision issue: 1 ÷ 6 has no finite decimal representation, so the division has to round, and result2 comes out as 0.49999999999999999999999999999999999998 instead of 0.5.
I am trying to validate that two math expressions give the same result. Should I avoid compare: for this purpose and instead allow some small fractional difference between the two NSDecimalNumbers? Or should I use the decimalNumberByDividingBy:withBehavior: method and do some rounding? I am not sure how best to proceed. Ideally result2 would be exactly 0.5 and I wouldn't need to hack the comparison.
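One possible approach, sketched below under the assumption that a fixed number of decimal places is acceptable for the comparison: do the arithmetic at full precision, then round only the final result with decimalNumberByRoundingAccordingToBehavior: and an NSDecimalNumberHandler before comparing. The scale of 10 here is an arbitrary illustrative choice, not anything mandated by the API.

```objectivec
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // Build operands from strings to avoid any binary-float detour.
        NSDecimalNumber *num1 = [NSDecimalNumber decimalNumberWithString:@"1"];
        NSDecimalNumber *num2 = [NSDecimalNumber decimalNumberWithString:@"6"];
        NSDecimalNumber *num3 = [NSDecimalNumber decimalNumberWithString:@"3"];
        NSDecimalNumber *half = [NSDecimalNumber decimalNumberWithString:@"0.5"];

        // Full-precision arithmetic: (1 ÷ 6) × 3 ends up as 0.49999…98.
        NSDecimalNumber *raw = [[num1 decimalNumberByDividingBy:num2]
                                decimalNumberByMultiplyingBy:num3];

        // Round only the final result to a chosen scale (10 places here)
        // so the accumulated rounding error disappears before compare:.
        NSDecimalNumberHandler *handler =
            [NSDecimalNumberHandler decimalNumberHandlerWithRoundingMode:NSRoundPlain
                                                                   scale:10
                                                        raiseOnExactness:NO
                                                         raiseOnOverflow:NO
                                                        raiseOnUnderflow:NO
                                                     raiseOnDivideByZero:NO];
        NSDecimalNumber *rounded =
            [raw decimalNumberByRoundingAccordingToBehavior:handler];

        NSLog(@"equal: %d", [rounded compare:half] == NSOrderedSame);
    }
    return 0;
}
```

The alternative mentioned in the question (allowing a small fractional difference) also works: subtract one value from the other and check that the magnitude of the difference is below a chosen epsilon. Rounding the final result is usually simpler when you know how many decimal places are meaningful.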