I have a variable i of type int whose value is 129. I have played with various representations of this variable in gdb.
# Decimal format of i
(gdb) p/d i
$18 = 129
# Binary format of i
(gdb) p/t i
$19 = 10000001
# Address of variable i
(gdb) p &i
$20 = (int *) 0xbffff320
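For reference, the same p/d and p/t views can be reproduced outside gdb with a small C sketch (assuming a 32-bit int, which the sizeof output below confirms; the binary printer is hand-rolled to mimic gdb's output, most significant bit first with leading zero bits skipped):

#include <stdio.h>

/* Mimic gdb's p/t: binary, most significant bit first, leading zeros skipped. */
static void print_binary(unsigned int v)
{
    int started = 0;
    for (int bit = 31; bit >= 0; bit--) {
        int b = (int)((v >> bit) & 1u);
        if (b)
            started = 1;
        if (started)
            putchar('0' + b);
    }
    if (!started)
        putchar('0');              /* the value 0 still needs one digit */
    putchar('\n');
}

int main(void)
{
    int i = 129;
    printf("%d\n", i);             /* like p/d i -> 129 */
    print_binary((unsigned int)i); /* like p/t i -> 10000001 */
    return 0;
}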
# Binary format displayed as one byte
(gdb) x /tb &i
0xbffff320: 10000001
# Decimal format displayed as four bytes (one word)
(gdb) x /dw &i
0xbffff320: 129
# Decimal format displayed as one byte
(gdb) x /db &i
0xbffff320: -127
The above output is probably because 10000001 in two's complement is equivalent to -127, according to this page (if I am wrong, please correct me).
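A quick cross-check of that claim in C (a minimal sketch; it relies on the byte at the lowest address of i being the 10000001 one, exactly as the x/tb output above shows):

#include <stdio.h>

int main(void)
{
    int i = 129;
    /* Reinterpret the byte at the lowest address of i as a signed
       8-bit value, which is what x/db &i does. Here that byte is
       10000001, i.e. -127 in two's complement. */
    signed char s = *(signed char *)&i;
    printf("%d\n", s); /* prints -127 on this setup */
    return 0;
}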
According to this:
# Size of int in bytes:
(gdb) p sizeof(int)
$22 = 4
I know that an int consumes 4 bytes in computer memory. So if my understanding is correct, the number resides at some address in memory and consumes 4 bytes (or 32 bits, or 1 word, or half a giant word). The representation of this number then looks like this:
AAAAAAAA: XXXXXXXX XXXXXXXX XXXXXXXX XXXXXXXX
where
AAAAAAAA is the location in memory and
XXXXXXXX are the bits of that number (I have divided these bits into four octets for readability).
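In C those two pieces of information are exactly what the & operator and sizeof provide, as in this minimal sketch:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    int i = 129;
    /* AAAAAAAA: the location of the object in memory. */
    printf("address: %p\n", (void *)&i);
    /* XXXXXXXX ...: how many octets (and bits) it occupies. */
    printf("size: %zu bytes (%zu bits)\n", sizeof i, sizeof i * CHAR_BIT);
    return 0;
}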
So when I access an int, I need to know its address and how many bits it consumes. An int consumes 32 bits, and its address can be obtained using the & operator. Here is my actual int representation in memory (BTW, why do the two outputs below differ? Is it related to the endianness setting of the system, or what?):
# Binary format displayed as one word (4 bytes) (I have put a space between each group of 8 bits for readability)
(gdb) x /tw &i
0xbffff320: 00000000 00000000 00000000 10000001
# Binary format displayed as four bytes
(gdb) x/4tb &i
0xbffff320: 10000001 00000000 00000000 00000000
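To see what sits at each of the four addresses without gdb, here is a small sketch that dumps the octets in address order, lowest address first, which is the same order x/4tb reports (it only prints what is at each offset, without assuming which end holds which octet):

#include <stdio.h>

int main(void)
{
    int i = 129;
    unsigned char *p = (unsigned char *)&i;
    /* Walk the int byte by byte, lowest address first,
       the same order as x/4tb &i. */
    for (size_t k = 0; k < sizeof i; k++)
        printf("byte %zu at %p: 0x%02x\n", k, (void *)(p + k), p[k]);
    return 0;
}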
Now a little bit of basic math:
0xbffff320: 00000000 00000000 00000000 10000001
AAAAAAAA: XXXXXXXX XXXXXXXX XXXXXXXX XXXXXXXX
AAAAAAAA: 8bits 8bits 8bits 8bits
8*3=24
(24)DEC == (0x18)HEX
0xbffff320 + 0x18 = 0xBFFFF338
0xBFFFF338 should be the address of my last octet. So why does this give me 11001100 instead of 10000001?
(gdb) x/tb 0xBFFFF338
0xbffff338: 11001100
I am printing one byte. If I were printing the whole int, which consumes 4 bytes (I don't think this is even possible in my case, since I do not have a variable name corresponding to this memory, but you get the point), it might show some strange number, because I am accessing memory past the declared variables and there might be garbage there, and also my 10000001 would reside in the most significant octet. But why do I get this value now?
EDIT: Following the suggestions, I've added 3 bytes rather than 24 bytes as before, but the result is still wrong:
(3)DEC == (3)HEX
0xbffff320 + 0x3 = 0xbffff323
(gdb) x/tb 0xbffff323
0xbffff323: 00000000
This still does not return 10000001. What is wrong here?
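In case it helps, here is the C equivalent of that last check (a sketch using the same 3-byte offset from the address of i, just expressed through a char pointer instead of a hard-coded address):

#include <stdio.h>

int main(void)
{
    int i = 129;
    unsigned char *base = (unsigned char *)&i;
    /* base + 3: three octets past the first one, the same
       arithmetic as 0xbffff320 + 0x3 above. */
    printf("byte at %p: 0x%02x\n", (void *)(base + 3), base[3]);
    return 0;
}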