I have seen some similar questions, but I can't get this right:
I receive a 16-bit signed integer in an NSData object, e.g. '0xFF3C'. How can I read the correct number out of this?
EDIT: With James Webster's answer I get:
Received hex value : <0800>
Converted value: 803995656
Received hex value : <0a00>
Converted value: 803995658
Received hex value : <1a00>
Converted value: 803995674
Received hex value : <faff>
Converted value: 804061178
Received hex value : <e4ff>
Converted value: 804061156
Received hex value : <f0ff>
Converted value: 804061168
which is clearly not correct (I expect values around +/- 2000)
EDIT: Some code:
-(void)characteristicValueRead:(CBCharacteristic *)characteristic{
unsigned short myInt = (int) [characteristic.value bytes];
NSLog(@"Received value: %@", characteristic.value);
NSLog(@"Converted value: %i",(int)myInt);
}
With this snippet I get:
Received value: <0000>
Converted value: 26176
Received value: <0000>
Converted value: 46688
Received value: <0000>
Converted value: 30576
Received value: <0000>
Converted value: 2656
Received value: <0000>
Converted value: 50896
Received value: <0000>
which looks really interesting/annoying. What could be causing this?
Actually quite a difficult problem!
This was about the 20th thing I tried, and it seems to work OK:
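The idea is to copy the raw bytes into an int16_t instead of casting the pointer. A minimal sketch of that, assuming the peripheral sends the two bytes little-endian and they arrive in characteristic.value:

-(void)characteristicValueRead:(CBCharacteristic *)characteristic {
    // Copy the two raw bytes into a signed 16-bit integer instead of casting the pointer value.
    int16_t value = 0;
    [characteristic.value getBytes:&value length:sizeof(value)];

    // iOS devices are little-endian, so the bytes <fa ff> come out as 0xFFFA == -6.
    // If the peripheral sends big-endian data, swap the bytes first:
    // value = (int16_t)CFSwapInt16BigToHost((uint16_t)value);

    NSLog(@"Converted value: %d", value);
}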
and the output is:
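With the values from the first edit above, that sketch would log 8, 10, 26, -6, -28 and -16 (on a little-endian device, which all iOS devices are).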
I never did that myself. However, this should work, assuming your 2-byte data is in myData:
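A sketch of what that conversion can look like, assuming myData holds the two bytes in host (little-endian) byte order:

int16_t value = 0;
// Copy the first two bytes of myData into the signed 16-bit variable.
[myData getBytes:&value length:sizeof(value)];
NSLog(@"Signed value: %d", value);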
You may want to test whether myData.length really equals 2.
I found an easy way to do that; check if it works for you.
I did this:
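A sketch of that approach, assuming the received bytes are in characteristic.value; the 2 in the range is the length referred to below:

int16_t value = 0;
// Read exactly 2 bytes, starting at offset 0 of the received data.
[characteristic.value getBytes:&value range:NSMakeRange(0, 2)];
NSLog(@"Converted value: %d", value);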
Just change the length to what you need.