Could someone explain to me what is going on within Contiki-OS when it transmits a UDP packet?
Here is the current consumption of my device (running with the CC2538 chip) in detail:
My question is: why does it take so long to transmit a UDP broadcast packet (about 250 ms), given that at 250 kbps a packet of 408 bits should theoretically be transmitted in approximately 2 ms? I'd understand if the transmission lasted, say, ten milliseconds, but here the difference is huge.
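For reference, here is the back-of-the-envelope calculation behind the 2 ms figure (an idealized number that ignores preamble, CSMA and any duty-cycling overhead):

/* Ideal airtime at the nominal IEEE 802.15.4 (2.4 GHz) PHY rate */
#include <stdio.h>

int main(void)
{
  const double bits = 408.0;        /* measured packet length, in bits */
  const double rate_bps = 250000.0; /* nominal radio data rate */

  printf("ideal airtime: %.3f ms\n", bits / rate_bps * 1e3); /* ~1.632 ms */
  return 0;
}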
I use the example in contiki/examples/ipv6/simple-udp-rpl/broadcast-example.c
Does anyone have an idea?
By default, Contiki uses the ContikiMAC radio duty cycling (RDC) protocol. The protocol has to reconcile two conflicting requirements: letting receiver nodes sleep almost all of the time when there are no packets to receive, while still delivering data as reliably as possible. The solution adopted in ContikiMAC is to place the burden on the transmitter. Given that the receiver checks the radio channel 8 times per second (the default configuration on the cc2538dk platform), the transmitter has to transmit for at least 125 ms (1/8 s) to be sure that the receiver has woken up and seen the packet. In practice, this means that a packet is retransmitted multiple times in a row. See the ContikiMAC paper and the Contiki documentation for a more detailed description.
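For reference, the wakeup rate is a compile-time option; a minimal project-conf.h sketch (option names as used by the Contiki netstack configuration):

/* project-conf.h -- sketch, not a complete configuration */
#define NETSTACK_CONF_RDC                    contikimac_driver
/* The receiver wakes up this many times per second. With 8 Hz checks the
   sender must keep strobing for up to 1/8 s = 125 ms to guarantee overlap. */
#define NETSTACK_CONF_RDC_CHANNEL_CHECK_RATE 8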
That being said, you won't always see transmissions with the maximum duration. If it's a unicast, the receiver normally sends an ACK after successful reception. The transmitter checks for this ACK and stops transmitting once it is received; this way, the expected average transmission time is halved. And then there's also phase optimization: it allows the sender to synchronize the start of transmission with the expected wakeup time of the receiver. But for broadcasts, no ACKs are generated and phase optimization won't work.
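Phase optimization can be toggled at compile time; a hedged sketch (the macro name below is the one used by ContikiMAC's implementation in contikimac.c):

/* project-conf.h -- sketch. With link-layer ACKs a unicast strobe stops as
   soon as the ACK arrives (on average about half of the 125 ms worst case);
   phase optimization then lets the sender start close to the receiver's
   expected wakeup. Broadcasts benefit from neither mechanism. */
#define CONTIKIMAC_CONF_WITH_PHASE_OPTIMIZATION 1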
Another possible reason for unexpectedly long transmissions is a failing CCA check. Before transmitting a packet, the radio stack first checks whether the medium is free; if it's not, it backs off for some time and retries.
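The pattern looks roughly like this (a self-contained sketch of the idea, not the literal Contiki CSMA code; the radio functions are stand-in stubs):

/* CSMA-style sketch: check the channel, back off and retry when busy.
   Every failed CCA plus backoff lengthens the observed transmission. */
#include <stdio.h>
#include <stdlib.h>

#define MAX_ATTEMPTS 3

static int  radio_channel_clear(void) { return rand() % 2; } /* 0 = busy */
static void radio_send(void)          { puts("packet sent"); }
static void backoff(int attempt)      { printf("busy, backoff #%d\n", attempt); }

int main(void)
{
  int attempt;
  for(attempt = 1; attempt <= MAX_ATTEMPTS; attempt++) {
    if(radio_channel_clear()) { /* CCA: is the medium free? */
      radio_send();
      return 0;
    }
    backoff(attempt); /* in practice a random delay before retrying */
  }
  return 1; /* gave up: the channel kept reporting busy */
}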
I found the problem: the radio is not turned off properly after the transmission of a packet.
At the end of the transmit() function in cpu/cc2538/dev/cc2538-rf.c, the radio is turned off only if it was previously off:
if(rf_flags & WAS_OFF) {
  rf_flags &= ~WAS_OFF;
  off();
}
But in practice the program never enters this branch, so the radio is not turned off immediately after the transmission of a packet.
The problem arises because the channel_clear() function (called at the beginning of transmit()) clears this flag first. Thus transmit() no longer knows that the radio was off before its execution, and the radio is kept on.
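To make the sequence concrete, here is a minimal standalone model of the flag handling (my own reconstruction of the control flow, not the literal driver code):

/* Model of the buggy interaction between transmit() and channel_clear() */
#include <stdio.h>

#define WAS_OFF 0x01
static unsigned char rf_flags = 0;
static int radio_on = 0;

static void on(void)  { radio_on = 1; }
static void off(void) { radio_on = 0; }

static void channel_clear_buggy(void)
{
  if(!radio_on) { rf_flags |= WAS_OFF; on(); }
  /* ...CCA check omitted... */
  if(rf_flags & WAS_OFF) { /* sees the flag set by transmit()... */
    rf_flags &= ~WAS_OFF; /* ...and clobbers it */
    off();
  }
}

int main(void)
{
  /* transmit() begins: radio is off, remember to turn it back off later */
  rf_flags |= WAS_OFF;
  on();
  channel_clear_buggy();
  on(); /* the radio is (re)enabled for the actual transmission */
  /* end of transmit(): this branch is never entered anymore */
  if(rf_flags & WAS_OFF) { rf_flags &= ~WAS_OFF; off(); }
  printf("radio still on after transmit: %s\n", radio_on ? "yes" : "no");
  return 0;
}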
To fix the problem, I added a local variable inside channel_clear() that makes it turn off the radio and clear the flag only if the radio was turned on inside the function itself:
static int
channel_clear(void)
{
  int cca;
  /* Fix: remember whether this function itself turned the radio on */
  uint8_t intern_onoff;

  intern_onoff = 0;

  PRINTF("RF: CCA\n");

  /* If we are off, turn on first */
  if((REG(RFCORE_XREG_FSMSTAT0) & RFCORE_XREG_FSMSTAT0_FSM_FFCTRL_STATE) == 0) {
    rf_flags |= WAS_OFF;
    on();
    intern_onoff = 1;
  }

  /* Wait on RSSI_VALID */
  while((REG(RFCORE_XREG_RSSISTAT) & RFCORE_XREG_RSSISTAT_RSSI_VALID) == 0);

  if(REG(RFCORE_XREG_FSMSTAT1) & RFCORE_XREG_FSMSTAT1_CCA) {
    cca = CC2538_RF_CCA_CLEAR;
  } else {
    cca = CC2538_RF_CCA_BUSY;
  }

  /* Turn back off only if the radio was turned on by this function */
  if((rf_flags & WAS_OFF) == WAS_OFF && intern_onoff) {
    rf_flags &= ~WAS_OFF;
    off();
    intern_onoff = 0;
  }

  return cca;
}
The current consumption during a packet transmission now looks like this:
Note: the strobe time was intentionally reduced to 10 ms with:
#define STROBE_TIME RTIMER_ARCH_SECOND / 100
This explains why there are only three transmission strobes for the broadcast message.
The duration of a single strobe is 3 ms, which corresponds to an effective data rate of 408 bits / 3 ms ≈ 136 kbps (?).
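Part of the gap to the nominal 250 kbps closes once the PHY overhead is counted. A rough estimate (assuming the 408 bits exclude the IEEE 802.15.4 preamble/SFD/length field; the remainder is presumably software overhead in the driver):

/* Rough per-strobe airtime: 6 bytes of PHY overhead (4-byte preamble,
   1 SFD, 1 length field) plus the 192 us RX/TX turnaround time. */
#include <stdio.h>

int main(void)
{
  const double frame_bits   = 408.0;   /* frame size from the question */
  const double phy_bits     = 6 * 8;   /* preamble + SFD + length field */
  const double rate_bps     = 250000.0;
  const double turnaround_s = 192e-6;  /* aTurnaroundTime, 12 symbols */

  printf("estimated airtime: %.2f ms\n",
         ((frame_bits + phy_bits) / rate_bps + turnaround_s) * 1e3); /* ~2.0 ms */
  return 0;
}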