It probably has a simple solution, but I can't figure it out.
I am dividing two integers:
finishedGameFinalScore = [score integerValue];
CGFloat interval = 2/finishedGameFinalScore;
NSLog(@"interval = %f",interval);
The log returns 0.000000
Is there a limit for decimal places? I need to preserve the decimal result.
Thanks, Shani
Just divide the value by 1.0 (a floating-point literal) and assign the result to a float variable; the literal forces the division to happen in floating point.
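For instance, a minimal sketch of that idea (the score value 3 is only an example, not from the question):

NSInteger finishedGameFinalScore = 3; // example value
CGFloat interval = 2 / (finishedGameFinalScore / 1.0); // dividing by the literal 1.0 yields a double, so the outer division is floating point
NSLog(@"interval = %f", interval); // logs 0.666667 instead of 0.000000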
The reason your code doesn't work is that you're dividing an integer by another integer and only converting the result to a float afterwards.
So you have 2 (an integer) and some other number (also an integer). Then you divide 2 by this number, which is probably greater than 2. Let's say it's 3.

Mr. Integer sees 2/3 and he's like, "0.66666667? Pshh, no one ever needs anything after the decimal point anyway." So he truncates it, and you just have 0. Then Mr. Integer gives the number to Mr. Float, and Mr. Float is super happy to get a number! He's all like, "Yay, a 0! I'm going to add ALL OF THE SIGNIFICANT DIGITS." And that's how you end up with 0.000000.

So yeah, just cast to a float first. Or even a double!
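A minimal sketch of that cast applied to the code from the question (score is the same object as in the question):

finishedGameFinalScore = [score integerValue];
CGFloat interval = (CGFloat)2 / (CGFloat)finishedGameFinalScore; // cast before dividing; casting either operand is enough
NSLog(@"interval = %f", interval); // logs 0.666667 for a score of 3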
Just write the 2 as a float literal (2.0f); in this case that will do the trick.
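A minimal sketch, reusing the division line from the question:

CGFloat interval = 2.0f / finishedGameFinalScore; // the float literal 2.0f forces floating-point division
NSLog(@"interval = %f", interval);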
All the other answers here are correct and fully explain why this works.
As @Dustin said, you will need to typecast your divisor to float; otherwise, even though the result goes into a float, it only holds the truncated integer value.
CASE 1: Typecast needed (both operands are integers).
CASE 2: No typecast needed (one operand is already a floating-point literal). See the sketch below.
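A minimal sketch of the two cases, with example values that are not from the original answer:

// CASE 1: both operands are integers, so one of them must be typecast.
NSInteger gameScore = 3;                    // example value
CGFloat noCast = 2 / gameScore;             // 0.000000, the integer division truncates first
CGFloat withCast = (CGFloat)2 / gameScore;  // 0.666667

// CASE 2: one operand is already a floating-point literal, so no typecast is needed.
CGFloat withLiteral = 2.0f / gameScore;     // 0.666667

NSLog(@"noCast = %f, withCast = %f, withLiteral = %f", noCast, withCast, withLiteral);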