It might have a simple solution, but I cannot fix it.
I am dividing two integers:
finishedGameFinalScore = [score integerValue];
CGFloat interval = 2/finishedGameFinalScore;
NSLog(@"interval = %f",interval);
The log prints 0.000000.
Is there a limit on decimal places? I need to preserve the decimal result.
Thanks
Shani
The reason your code doesn't work is that you're dividing an integer by another integer; the division happens in integer arithmetic, and only the already-truncated result is converted to a float.
So you have 2 (an integer) and some other number (also an integer). Then you divide 2 by this number, which is probably greater than 2. Let's say it's 3.
Integer division sees 2/3 and he's like "0.66666667? Pshh, no one ever needs anything after the decimal point anyway." So he truncates it. You just have 0.
Then the integer hands the number to Mr. float, and Mr. float is super happy to get a number! He's all like "yay, a 0! I'm going to add ALL OF THE SIGNIFICANT DIGITS." And that's how you end up with 0.000000.
So yeah, just cast to a float first. Or even a double!
As @Dustin said, you need to typecast your divisor to float; otherwise, even when the result goes into a float, it only holds the truncated integer value.
CASE 1: Typecast the divisor
NSString *score = @"3";
int divisor = [score intValue];
CGFloat interval = 2 / (float)divisor;
NSLog(@"interval = %.2f", interval);
CASE 2: No typecast needed
NSString *score = @"3";
float divisor = [score floatValue];
CGFloat interval = 2 / divisor;
NSLog(@"interval = %.2f", interval);
Just add the f suffix to the literal 2; in this case that will do the trick.
CGFloat interval = 2.0f/finishedGameFinalScore;
All the above/below answers are correct and fully explain why this works.
You can also just divide each long value by 1.0 and assign the result to a float variable.
unsigned long l1 = 65536;
unsigned long l2 = 256;
float f = (l1/1.0)/(l2/1.0);
NSLog(@"%f",f);