I have watched the internet go wild over the introduction of the ESP8266, a 2.4 GHz wifi module. They are dirt cheap (made in China), have lots of output power, and were hailed as the new backbone and de-facto standard of the so-called “Internet Of Things” with their “long range”, high wifi bitrates, no need for a gateway, and simple AT command set; they are promoted as the what-else-could-you-ask-for solution by the big wigs of the online IoT/hacker community. Loads of development has been pouring into these chips. You can upload to them directly from your beloved Arduino IDE, and they have loads of memory compared to an Arduino. You can run a webserver on them and control things with their GPIO, great. Look, if you hack around a little you can even sleep them at 78uA, so low power – amazing. I even bought a few to try out for myself.
Some reality check and practical use reveals they are ideal in some cases and not so much in others. If you want a quick wifi-enabled device that blinks a LED at 54mbps (million blinks per second) or pushes temperature readings directly into the cloud, they will make great projects for a bunch of permanently powered nodes. The biggest problem I see is their not-so-low power consumption. I don’t care how deep you can put it to sleep, the wifi protocol will never be low power enough, not even close (neither will Bluetooth or BLE, which also won’t have the range). The wakeup and pairing of wifi alone can mean several seconds of power-hungry consumption. The ESP8266’s current peaks can be in excess of 200mA at max TX power, and we know that short high-current peaks drain a battery much faster than smaller, longer current draws. All this matters when you are on a limited power budget or a small battery that won’t take power spikes very well.
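To put some rough numbers on that, here is a duty-cycle estimate of average current for a node that wakes, connects, transmits and goes back to sleep. All the figures (3 seconds at ~100mA for a wifi wake/associate/send cycle, a 50ms sub-GHz burst at ~30mA, a 10-minute reporting interval, 78uA deep sleep) are illustrative assumptions for the sake of the arithmetic, not measurements of any particular module:

```python
# Duty-cycle-weighted average current estimate for a sleep/wake node.
# All numbers below are illustrative assumptions, not measured values.

def avg_current_ma(wake_s, wake_ma, sleep_s, sleep_ma):
    """Mean current in mA, weighted by time spent awake vs asleep."""
    total = wake_s + sleep_s
    return (wake_s * wake_ma + sleep_s * sleep_ma) / total

# Hypothetical wifi node: ~3 s to wake + associate + send at ~100 mA mean,
# then deep sleep at 78 uA, reporting every 10 minutes.
wifi = avg_current_ma(wake_s=3.0, wake_ma=100.0, sleep_s=600.0, sleep_ma=0.078)

# Hypothetical sub-GHz node: ~50 ms burst at ~30 mA, ~5 uA sleep,
# same 10-minute reporting interval.
subghz = avg_current_ma(wake_s=0.05, wake_ma=30.0, sleep_s=600.0, sleep_ma=0.005)

battery_mah = 1000.0  # e.g. a small LiPo or a pair of AAs
print(f"wifi:    {wifi:.3f} mA avg -> ~{battery_mah / wifi / 24:.0f} days")
print(f"sub-GHz: {subghz:.4f} mA avg -> ~{battery_mah / subghz / 24:.0f} days")
```

Even before accounting for the extra toll that 200mA spikes take on a small battery’s usable capacity, the long wifi association time alone puts the average consumption orders of magnitude above a short sub-GHz burst.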
If you spend any amount of time reading application notes and technical documentation about wireless RF technologies from companies like TI and others with excellent knowledge bases, you will immediately get a sense of how imperative it is to reduce transmission and reception current. They invest massive amounts of R&D into decreasing every kind of current draw and the time spent receiving and transmitting. Spikes are especially regarded as undesirable culprits. When they reach the limits of squeezing out that last bit of uptime, they invest in reducing the duty cycle of reception and transmission through different modulation schemes, so that reception happens at a lower mean consumption rate and is overall more successful (better selectivity, blocking and immunity mean more accurate reception despite the ever increasing RF noise, which in turn requires fewer retransmissions and saves even more current). In the wifi world we can’t really talk about any of that, because the protocol was simply not created for low power applications.
Let’s talk range and why you’d ever want to go with sub-1GHz radios. We know that the lower the bandwidth and bitrate, the higher the sensitivity and consequently the longer the range (it’s like a very narrow piece of the spectrum where the radios spell out each word – the slower the spelling, the smaller the chance the receiver will miss or misunderstand any part of the message). Also, the lower the frequency band, the better the penetration of the medium; it’s just physics. I think the following movie from TI really summarizes it all and why sub-1GHz technology will always be superior in several ways (warning: sound blast intro):
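Both of those claims fall out of textbook formulas. Free-space path loss grows with frequency, so at the same distance a 915MHz link loses several dB less than a 2.4GHz one; and the receiver’s thermal noise floor scales with bandwidth (-174 dBm/Hz), so a narrowband radio can hear much weaker signals than one listening across a 20MHz wifi channel. A quick sketch (the 6dB noise figure and 8dB required-SNR values are just illustrative assumptions):

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

def sensitivity_dbm(bw_hz, nf_db=6.0, snr_db=8.0):
    """Rough receiver sensitivity floor: thermal noise + noise figure + SNR."""
    return -174 + 10 * math.log10(bw_hz) + nf_db + snr_db

d = 1000.0  # 1 km line of sight
print(f"path loss @2.44 GHz: {fspl_db(d, 2.44e9):.1f} dB")
print(f"path loss @915 MHz:  {fspl_db(d, 915e6):.1f} dB")
print(f"sub-GHz advantage:   {fspl_db(d, 2.44e9) - fspl_db(d, 915e6):.1f} dB")
print(f"sensitivity, 20 MHz wifi channel: {sensitivity_dbm(20e6):.0f} dBm")
print(f"sensitivity, 100 kHz narrowband:  {sensitivity_dbm(100e3):.0f} dBm")
```

The frequency alone buys roughly 8.5 dB at any given distance, and the narrow bandwidth buys another ~23 dB of sensitivity in this sketch – and every 6 dB of link budget roughly doubles the line-of-sight range.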
Anyone remember the CC3000? Whatever happened to wifi radios like the CC3000 series, which were the de-facto IoT choice about 2 years ago? Yup, they were expensive, and the ESP8266 with its ridiculous $2 price tag totally killed them right there. Actually, I just did a quick search for ESP8266 and the prices I see are more like $7-15 depending where you shop; add an Arduino and you’re not saving for that lunch burger any more. To get the $2-3 price you have to buy direct from China (ebay, alibaba etc.), so if you’re willing to wait about a month and buy a matching $2 Arduino while you’re there, you can still save enough to enjoy a yummy burger.
Anyway, as has been the case since their introduction, for some reason I cannot seem to be impressed no matter how many cool ESP-powered projects I read about. I keep wanting to find an application where they would be a better match than sub-GHz, but I have yet to find one. My humble Moteinos (right, which are not made in China, don’t cost $2 and won’t save you a burger, so they are more healthy 🙂 ) still work for any and all wireless/remote control applications. I don’t need 54mbps for turning on a light, checking my sump pump, opening my garage or reporting motion. Neither do the utilities and industrial integrators who always seem to prefer sub-GHz technology for their meters and applications.
For those who feel this was unfair or inaccurate and want to start a debate, please note I did not mean to compare the Moteino to the ESP8266 – they are completely different in every way.