I do not understand why whitening would improve your listen mode?
My sense that this might help comes from section 3.4.9 in the datasheet: "The RSSI sampling must occur during the reception of preamble in FSK, and constant “1” reception in OOK."
With the listen scheme I think that condition cannot be satisfied with such short RX times. So the question becomes: why should RSSI detection be done during the preamble? Well, I don't know exactly how RSSI detection works, but I do know that the preamble is spectrally white (an alternating 0101 pattern), and it seems intuitive to me that RSSI sampling would benefit from a signal that isn't nailed to one side of the spectrum. Whitening makes the payload look like the preamble in that respect, hence the change. Admittedly not a very strong basis for a change whose usefulness I can't really prove... anyway.
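For reference, this is roughly the change I mean - a minimal sketch, assuming the LowPowerLab RFM69 library and the names from RFM69registers.h. DcFree is bits 6-5 of RegPacketConfig1 (0x37), and both ends of the link must use the same setting:

```cpp
// Enable data whitening on the RFM69 (SX1231 RegPacketConfig1, 0x37).
// Sketch only; assumes the LowPowerLab RFM69 library and the register
// names defined in RFM69registers.h.
#include <RFM69.h>
#include <RFM69registers.h>

void enableWhitening(RFM69 &radio) {
  uint8_t cfg = radio.readReg(REG_PACKETCONFIG1);
  cfg = (cfg & 0x9F) | RF_PACKET1_DCFREE_WHITENING; // DcFree[1:0] = 10 (whitening)
  radio.writeReg(REG_PACKETCONFIG1, cfg);
}
```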
Regarding the different sync words: could it be that there is other traffic on the frequency which uses the same sync words you had configured before? That could increase your current consumption, because the RFM module may react to foreign transmissions.
Well, that would be OK and expected. But what's unexpected is the reaction to a very specific signal that I generate myself: a burst sequence targeted at another node. In that case I would expect listen mode to satisfy the "RSSI detected" criterion and keep RX on after the sub-millisecond listen slot. It should then look for packets until the 20ms timeout expires or a payload is available. Since the burst is addressed to another node, no packet will match and the radio should go back to idle after 20ms.
But what happens is that it stays in RX for seconds, until the burst is over - way longer than 20ms. That is clearly not consistent with the spec. Hence the workaround of running in promiscuous mode.
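For anyone wanting to reproduce the workaround, this is the idea in sketch form, assuming the LowPowerLab RFM69 library; the node/network IDs are made up for the example and the listen-mode re-entry is left out:

```cpp
// Promiscuous-mode workaround: with hardware address filtering off,
// PayloadReady fires for *any* valid packet - including bursts aimed at
// other nodes - so the MCU can discard it and put the radio back to
// sleep instead of hanging in RX for the whole burst.
#include <RFM69.h>

#define NODEID    1    // hypothetical IDs for the example
#define NETWORKID 100

RFM69 radio;

void setup() {
  radio.initialize(RF69_433MHZ, NODEID, NETWORKID);
  radio.promiscuous(true); // disable hardware address filtering
}

void loop() {
  if (radio.receiveDone()) {
    if (radio.TARGETID == NODEID) {
      // packet really addressed to us: handle it here
    }
    // otherwise: drop it and let the radio leave RX immediately
  }
  radio.sleep(); // back to low power until the next listen slot
}
```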
The minimum RX time in listen mode is of course a function of your configured bitrate and your configured RX listen timeout.
What bitrate and RX timeout do you use?
Bitrate is Felix's standard 55555. The RX listen timeout is ~20ms. Given that I use "RSSI detected" as the exit criterion, the second part of your statement is actually incorrect: the minimum time in RX is exactly what I program it to be. The real questions are whether the radio can detect RSSI during that time, and how long it stays in RX once RSSI has been detected, until timeout or payload available. In my last post I gave the minimum RX period that still works for RSSI detection at 55555 baud.
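To make the timing discussion concrete, here is roughly how those knobs map onto registers, going by the SX1231/RFM69 datasheet as I read it. The addresses should be right, but the concrete values are illustrative, not my exact tested configuration:

```cpp
// Listen-mode timing knobs on the SX1231/RFM69 (sketch, values illustrative).
// Assumes the LowPowerLab RFM69 library for writeReg().
#include <RFM69.h>

void configureListenTiming(RFM69 &radio) {
  // RegListen1 (0x0D):
  //   ListenResolIdle[7:6] = 10 -> idle slot resolution 4.1 ms
  //   ListenResolRx[5:4]   = 01 -> RX slot resolution 64 us
  //   ListenCriteria[3]    = 0  -> wake criterion: RSSI above threshold only
  //   ListenEnd[2:1]       = 10 -> stay in RX until PayloadReady or Timeout,
  //                                then resume listen mode from idle
  radio.writeReg(0x0D, 0b10010100);

  radio.writeReg(0x0E, 244); // RegListen2: ListenCoefIdle, 244 * 4.1 ms ~ 1 s idle
  radio.writeReg(0x0F, 4);   // RegListen3: ListenCoefRx, 4 * 64 us = 256 us RX slot

  // RegRxTimeout2 (0x23): Timeout fires TimeoutRssiThresh * 16 * Tbit after
  // the RSSI interrupt if no PayloadReady occurs. At 55555 bps, Tbit = 18 us,
  // so one unit is 288 us and ~20 ms needs a coefficient of about 69.
  radio.writeReg(0x23, 69);
}
```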
RSSI detection would be quicker at higher bitrates - presumably because the sampling window scales with the bit period - but I also want things to be reliable. I use 200 kbit/s in my bootloader, but that also dynamically degrades to 55555 bit/s and 19200 bit/s. That seems difficult to do here.
I think we could decrease the current consumption more if we could get a PayloadReady flag after receiving messages, because otherwise the RFM will always stay in RX mode as long as the RX timeout is not exceeded!
I don't know about your system, but in mine it's far more often the case that there is no packet for the listener than that there is, so I'm trying to optimize for that case with the very short duty cycle. My RX timeout is long enough that it will always span one full packet, so if the signal comes from a gateway burst I will get PayloadReady and the radio is put into idle before the timeout.
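To put numbers on "optimize for the empty-channel case", here is a back-of-the-envelope average-current calculation, using ballpark figures from the SX1231 datasheet (RX ~16 mA, listen idle ~1.2 µA, both approximate) and the slot timings from the sketch above:

```cpp
// Average current when the channel is usually empty: a 256 us RX slot
// roughly every second. Currents are approximate datasheet ballparks.
#include <cstdio>

int main() {
  const double i_rx_mA   = 16.0;    // RX mode current (approx.)
  const double i_idle_mA = 0.0012;  // listen idle current (approx.)
  const double t_rx_s    = 256e-6;  // RX slot length
  const double t_idle_s  = 1.0;     // idle slot length

  const double duty   = t_rx_s / (t_rx_s + t_idle_s);
  const double avg_mA = duty * i_rx_mA + (1.0 - duty) * i_idle_mA;
  std::printf("duty: %.4f%%, average current: %.1f uA\n",
              duty * 100.0, avg_mA * 1000.0);
  // ~0.026% duty -> ~5.3 uA average; an occasional 20 ms RX window when RSSI
  // trips barely moves this, which is why the empty-channel case dominates.
  return 0;
}
```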
I don't see how we could further reduce power consumption. Actually, I'm happy with the power consumption, but I would love to get a shorter idle period: since the gateway burst has to span the listener's idle period, about 3 seconds of burst time is a bit lame.