Does the power consumption depend on the WiFi mode and the signal quality, or is it always the same?
I mean, of course, if the signal is bad the imp may need longer or multiple attempts to connect and so consume more power. But will the individual transmit spikes themselves also draw more power when the signal is bad?
Thanks for your help!
Basically this is related to the WiFi modulation scheme. The lower-bitrate modulations (802.11b in particular) transmit at a higher power, and hence take more current. Generally you’ll fall back to 802.11b when you are out of range of 802.11g/n.
So: yes, you will see higher power consumption during transmits when signal strength is low. The exact point at which this happens depends not just on the RX quality that the imp sees, but also on the AP in use, how high its transmit power is, and how good its receiver is.
The lower the bitrate, the longer the transmissions are too, obviously (ie it takes longer to send a given number of bits at 1Mbit/s than 65Mbit/s), so that’s a double hit.
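To make the "double hit" concrete, here is a minimal sketch that estimates airtime and charge per frame at two bitrates. The current figures are hypothetical placeholders (not Electric Imp datasheet values), chosen only to illustrate that a slow, higher-power transmission costs disproportionately more energy:

```python
# Illustrative only: estimate per-frame airtime and charge at two
# 802.11 bitrates. TX currents below are made-up placeholder values,
# NOT measured imp figures; the point is the relative effect.

PAYLOAD_BITS = 1500 * 8  # one 1500-byte frame

# (bitrate in bit/s, assumed TX current in mA) -- currents are hypothetical
rates = {
    "802.11b @ 1 Mbit/s":  (1_000_000, 250),   # low rate, higher TX power
    "802.11n @ 65 Mbit/s": (65_000_000, 190),  # high rate, lower TX power
}

for name, (bps, current_ma) in rates.items():
    airtime_ms = PAYLOAD_BITS / bps * 1000        # time on air per frame
    charge_uc = current_ma * airtime_ms           # mA * ms = microcoulombs
    print(f"{name}: airtime {airtime_ms:.3f} ms, charge {charge_uc:.1f} uC")
```

With these assumed numbers the 1 Mbit/s frame takes roughly 12 ms on air versus about 0.18 ms at 65 Mbit/s, so even before the higher TX current is factored in, the slow frame costs nearly two orders of magnitude more charge.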