
IoT Needs Better B&Bs

While top-flight bed & breakfasts would no doubt do a world of good for many IoT developers, the “B&B” in the title refers to BANDWIDTH and BATTERIES.  Given all the ink spilled on IoT, these are two topics that do not receive the attention they deserve.  The third important yet underserved topic is IoT security, which will get an article of its own.

IoT bandwidth falls into the growing category of “challenges that need to be solved, and the sooner the better.”  Many IoT devices rely on Bluetooth (BT), which will work until it doesn’t, and that point is rapidly approaching.  BT was invented and has evolved as a reasonable solution for a personal area network (PAN). The prime use model is your mobile phone and earpiece, heart-rate monitor, fitness band, cycling cadence-speed sensor, smartwatch, and the like.

  • Bad news: BT operates in the same unlicensed, very crowded 2.4 GHz band as many Wi-Fi devices (more on the less crowded 5 GHz band shortly) and many cordless phones
  • Good news: BT is an adaptive frequency-hopping spread spectrum technology that helps to cope with congestion in the very crowded 2.4 GHz band
  • Bad news: the BT MAC protocol is strictly master-slave, with no provision for slave-to-slave communication.  Everything on a Bluetooth piconet has to go through the master, which creates a bottleneck (the toy model after this list puts rough numbers on the relay penalty)
  • More bad news: each piconet is limited to eight active devices (one master plus up to seven slaves)
  • Good news: several piconets can operate in close proximity
  • Bad news: communication between piconets is clumsy at best and impractical at worst, requiring a bridge device to jump back and forth, serving as a master on one piconet and a slave on the other (a so-called scatternet)
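
To put rough numbers on that relay bottleneck, here is a toy model; the link rate and traffic pattern below are assumed round figures, not measurements of any real piconet.

```python
# Toy model of the Bluetooth piconet bottleneck (illustrative figures only).
# All slave-to-slave traffic is relayed by the master, so each payload
# occupies the shared channel twice: slave -> master, then master -> slave.

RAW_LINK_RATE = 2.1e6   # bit/s, rough BT EDR application throughput (assumed)
ACTIVE_SLAVES = 7       # piconet maximum: one master plus seven active slaves

# Best case: a single slave-to-slave flow still burns double airtime.
single_flow = RAW_LINK_RATE / 2

# Worst case: all seven slaves exchange traffic concurrently, every hop
# sharing the same master-mediated channel.
per_flow = RAW_LINK_RATE / (2 * ACTIVE_SLAVES)

print(f"one relayed flow:    {single_flow / 1e6:.2f} Mbit/s effective")
print(f"seven relayed flows: {per_flow / 1e3:.0f} kbit/s each")
```

And even this generous model ignores inter-piconet hops, which only make matters worse.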

Bottom line: Bluetooth is suitable for PAN applications by design, but it is handicapped in most IoT applications for the above (and other) reasons.  This is not at all surprising, given that BT development began in the early 1990s, before IoT was anywhere on the landscape. 

That brings us to Wi-Fi, which IS hyphenated according to the Wi-Fi Alliance (and they ought to know).  Wi-Fi is unfortunately NOT an adaptive frequency-hopping spread spectrum technology.  There are but three non-overlapping channels available in the very crowded 2.4 GHz band, which is why Netflix freezes if, like me, [a] your television operates only in the 2.4 GHz band and [b] you don’t live on two isolated acres in Woodside.  And heaven help you if one of your neighbors enables channel bonding—then essentially you’re hosed.

Wi-Fi has 24 non-overlapping channels in the 5 GHz band, which sounds very nice until you realize that the latest incarnation (802.11ac) was BUILT for channel bonding.  And not just pairs of channels, no sir—QUADs of channels. 

  • Good news: 802.11ac supports non-contiguous channel bonding
  • Bad news: the math STILL works out to as few as six 80 MHz channels, and just three at 160 MHz (the quick arithmetic sketch after this list walks through it)
  • Good news: 802.11ac supports leading-edge MU-MIMO technology (same as LTE), using multiple spatial streams to push aggregate throughput beyond what a single stream’s Shannon limit would allow
  • Bad news: bleeding-edge MU-MIMO technology requires bleeding-edge DSP on BOTH the transmitter and receiver.  “That’s the coolest technology. How can you call that bad news?”  Because, in the sense of power consumption, all that DSP is anything BUT cool.
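
A quick arithmetic sketch of the channel-supply math, taking the 24-channel figure above as the starting point (regional regulations vary, so the counts are illustrative):

```python
# How channel bonding consumes the 5 GHz channel supply.  The 24-channel
# starting point is the figure cited above; regional rules vary, so treat
# the output as illustrative.

BASE_CHANNELS = 24  # non-overlapping 20 MHz channels in the 5 GHz band

for width_mhz in (20, 40, 80, 160):
    consumed = width_mhz // 20              # 20 MHz channels per bonded channel
    available = BASE_CHANNELS // consumed   # non-overlapping bonded channels
    print(f"{width_mhz:>3} MHz channels: {available} available")
```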

Bottom line: Wi-Fi is suitable for WLAN applications by design, but even the latest 802.11ac incarnation creates new issues as it solves old challenges.  Given the intended applications, here again it is not at all surprising that Wi-Fi is far from ideal for IoT applications.

Earlier this year at CES, I commented that the major product players followed a “throw pasta against the wall and see what sticks” approach.  (IoT toothbrushes?  Seriously?  From MULTIPLE companies?)  That applies equally well in the semiconductor space, where all of the major IoT device suppliers have decided to address the shortcomings of Bluetooth and Wi-Fi by jamming both Bluetooth and Wi-Fi radios into a single chip.  I can see the product planning meetings:

“Well, Bluetooth has its limitations and Wi-Fi has its own limitations.  So, let’s put both of them into one chip and let the customer figure out how to navigate the Venn diagram of limitations!”

I would be happy to blame the above on marketing, but honestly, I think engineering may have come up with the Bluetooth plus Wi-Fi idea.  In any case, I am disappointed by the lack of creative thinking.

IoT requirements differ from PAN requirements.  IoT requirements differ from WLAN requirements.  And I hope I don’t end up having to re-learn set theory to prove this beyond a reasonable doubt: IoT requirements differ from the union of PAN and WLAN requirements. 

I have developed a thesis with regard to the re-use/re-birth of previously unsuccessful technologies: many of today’s most vexing challenges can be solved by existing solutions innovatively applied.  I am confident that we can meet the unique bandwidth challenges of IoT in this manner.  We need to collectively recognize that Bluetooth and Wi-Fi are adequate for the applications for which they were designed, but they were NOT designed for IoT applications.

The second ‘B’ in “B&Bs” is batteries; not strictly batteries, but rather the entire issue of energy efficiency, from power consumption to power supply.  Astute readers will no doubt recognize that the battery issues are closely intertwined with the bandwidth issues discussed above, as radio power is a significant fraction of overall device power in many IoT applications.

Segueing from bandwidth to batteries, consider:

  • Bluetooth is thought of as a low-power technology, yet when one calculates data-transmission efficiency (nJ/bit), BT surprisingly comes out as relatively inefficient
  • Bluetooth-LE (now officially Bluetooth Smart) has “low energy” built right into the name, yet BT-LE’s data-transmission efficiency is, even more surprisingly, worse than BT’s
  • Conventional Wi-Fi (802.11n for example) has better transmission efficiency than either of the Bluetooths (or is that Blueteeth?); the back-of-envelope sketch after this list puts rough numbers on all three
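
Here is a back-of-envelope version of that nJ/bit comparison; every power draw and throughput figure below is a rough assumption chosen for illustration, not a measurement of any particular chipset:

```python
# Back-of-envelope data-transmission efficiency in nJ/bit.  Power draws
# and achievable application throughputs are rough assumed figures for
# illustration -- not measurements of any particular chipset.

radios = {
    # name                 (active power in W, app throughput in bit/s)
    "Bluetooth (classic)": (0.10,  1.0e6),
    "Bluetooth-LE":        (0.015, 0.1e6),
    "Wi-Fi 802.11n":       (0.80,  50e6),
}

for name, (power_w, rate_bps) in radios.items():
    nj_per_bit = power_w / rate_bps * 1e9
    print(f"{name:22s} {nj_per_bit:6.0f} nJ/bit")
```

With these assumed figures, Wi-Fi lands around 16 nJ/bit while the two Bluetooths land an order of magnitude higher, which is the counterintuitive ordering described above.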

So in addition to the bandwidth issues, we need to recognize that, today, the three prominent radios leave us wanting vis-à-vis energy efficiency:

  • Bluetooth-LE has acceptable absolute power requirements ONLY at very low datarates; think kbit/s speeds from the good-old analog modem days
  • Bluetooth has acceptable absolute power requirements up to tens of kbit/s
  • Wi-Fi is the better choice—from an absolute power perspective—at higher datarates (the crossover sketch after this list reconciles absolute power with the nJ/bit numbers)
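
A sketch of that crossover, modeling average power as a fixed overhead plus data rate times energy per bit; both the overhead and per-bit figures are assumptions, carried over from the rough numbers above:

```python
# Reconciling absolute power with per-bit efficiency (assumed figures).
# Model: average power = fixed overhead + data rate * energy per bit.
# At low data rates the fixed overhead dominates, which is why BT-LE wins
# in absolute terms despite its poor nJ/bit showing.

radios = {
    # name           (overhead in W, energy in J/bit)
    "Bluetooth-LE":  (0.0005, 150e-9),
    "Bluetooth":     (0.005,  100e-9),
    "Wi-Fi 802.11n": (0.050,   16e-9),
}

def avg_power(radio, rate_bps):
    overhead_w, j_per_bit = radios[radio]
    return overhead_w + rate_bps * j_per_bit

# Note: the model deliberately ignores each radio's throughput ceiling
# (BT-LE cannot actually sustain Mbit/s-class application traffic).
for rate in (1e3, 10e3, 100e3, 1e6, 10e6):  # bit/s
    best = min(radios, key=lambda r: avg_power(r, rate))
    print(f"{rate / 1e3:>8.0f} kbit/s -> {best} "
          f"({avg_power(best, rate) * 1e3:.1f} mW)")
```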

This is not an indictment of BT and BT-LE, as their power profiles generally work in the applications they were designed to serve: audio (BT) and low-datarate sensors (BT-LE) in a personal area network.  And the various Wi-Fi power profiles work in most wireless local area network applications.  Once again, the issue is that IoT requirements are different from PAN and WLAN requirements.  Blank sheets of paper and innovative thinking are in order, not re-packaged existing radio solutions.

Before I circle back around to power efficiency, I want to start at the source – the power source, that is:

  • Energy harvesting is a fertile (pun intended) area that may prove to be a boon in IoT.  The piezoelectric effect, for example, can be harnessed to generate modest amounts of power … so long as the device in question accelerates and decelerates frequently enough.  (Which could lead to some interesting conversations: “Bob, the box stopped transmitting; give it a dozen good kicks, will you?”)
  • Battery technology MUST improve (I took just the two semesters of chemistry, so far be it from me to contribute here).  It seems that our battery technology is something like an order of magnitude better than Alexander Graham Bell’s.  Meanwhile, my mobile phone has far more computing power than the entire Apollo program.  We’re not asking the battery folks to get on a Moore’s Law treadmill; just please, someone double or quadruple battery energy density and we will be happy (the budget sketch after this list shows what even a doubling buys).
  • Before anyone brings up battery alternatives, I for one REALLY don’t want to talk about tiny fuel cells.  (Talk about interesting conversations: “Bob, climb that tall pole and refill the liquid hydrogen in the transmitter.”)
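
A power-budget sketch of those first two bullets, with loudly assumed figures: what a coin cell buys at a given average draw (and what a mere doubling of energy density would buy), plus how little transmit duty cycle a modest vibration harvester can sustain:

```python
# Power-budget sketch; every figure below is an assumption for illustration.

CR2032_J  = 0.225 * 3.0 * 3600    # ~225 mAh coin cell at 3 V, in joules
AVG_POWER = 50e-6                 # W, whole-node average draw (assumed)

life_years = CR2032_J / AVG_POWER / (3600 * 24 * 365)
print(f"coin-cell life at {AVG_POWER * 1e6:.0f} uW average: {life_years:.1f} years")
print(f"...and {2 * life_years:.1f} years if energy density merely doubles")

# Piezo harvesting budget: how often can the radio afford to transmit?
HARVEST_W  = 100e-6               # W, optimistic vibration harvester (assumed)
RADIO_TX_W = 30e-3                # W, radio power while transmitting (assumed)

duty = HARVEST_W / RADIO_TX_W
print(f"sustainable transmit duty cycle: {duty:.2%}")
```

With these assumptions the harvester sustains a transmit duty cycle of roughly a third of one percent, which is exactly why Bob may be asked to kick the box.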

There is a lot of work going on in the power-source space, to be sure, and I certainly hope we see meaningful improvements in the not-too-distant future.  In any case, we IoT developers ought not hold our collective breath for a solution.

That brings us back to power efficiency.  The usual toolbox of power-saving tricks is already being deployed: from clock-tree optimization to dynamic voltage and frequency scaling to aggressive function-level power management and low-power memories.
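
As a sketch of why voltage and frequency scaling earns its place in that toolbox: dynamic CMOS power scales as C·V²·f, so energy per operation scales as C·V².  The operating points below are assumed, illustrative values:

```python
# Dynamic CMOS power goes as C * V^2 * f, so energy per operation goes as
# C * V^2: running slower at a lower voltage saves energy per operation.
# Operating points below are assumed, illustrative values.

C_EFF = 1e-9  # F, effective switched capacitance (assumed, arbitrary scale)

def energy_per_op(v_volts):
    return C_EFF * v_volts ** 2   # joules per "operation"

nominal = energy_per_op(1.2)      # full speed at 1.2 V
scaled  = energy_per_op(0.9)      # reduced clock allows 0.9 V

print(f"energy/op at 0.9 V vs 1.2 V: {scaled / nominal:.0%}")
# ~56%: nearly half the energy per operation, provided the workload
# tolerates the longer runtime.
```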

Please do NOT expect much help from bleeding-edge process technology.  Why?

  1. Have you seen the cost of a 14nm FinFET mask set?  It doesn’t amortize well across IoT production volumes.
  2. Have you seen mixed-signal capability on 14nm FinFET?  So much for a single-chip solution for the foreseeable future. 

Is there low-hanging fruit?  Certainly in displays, though there seems to be a bias toward sensors in the broad universe of IoT.  Thankfully we do see products with displays (pun not intended), yet they tend to be fairly inefficient from a power perspective.  I am surprised that E Ink technology has not been adopted in IoT (outside of e-paper watches such as the Pebble), as it is spectacularly power efficient.  It is not as eye-popping as an AMOLED display, but E Ink can deliver dramatically longer battery life. 
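
To give a rough sense of the gap: E Ink is bistable, drawing power only while updating, while an emissive display draws power whenever it is lit.  All figures below are assumptions for illustration:

```python
# E Ink is bistable: it draws power only while updating, not while holding
# an image.  All figures below are assumptions for illustration.

EINK_UPDATE_J   = 0.5e-3   # J per full refresh (assumed)
UPDATES_PER_DAY = 1440     # one refresh per minute
AMOLED_W        = 0.2      # W while lit (assumed)
HOURS_LIT       = 4        # display-on hours per day (assumed)

eink_j_day   = EINK_UPDATE_J * UPDATES_PER_DAY
amoled_j_day = AMOLED_W * HOURS_LIT * 3600

print(f"E Ink:  {eink_j_day:.1f} J/day")
print(f"AMOLED: {amoled_j_day:.0f} J/day ({amoled_j_day / eink_j_day:.0f}x more)")
```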

Energy efficiency is clearly a tough nut, one that will need to be cracked incrementally from many different angles.  As I’ve touched on time and again, IoT is an emerging space, and the applications have new and different characteristics.  The current products appear to be using all of the existing energy efficiency tools.

What we NEED is to look at the characteristics of IoT applications and tailor technologies (novel or existing) to them.  I expect to see best practices emerge, develop, and evolve into IoT-specific solutions that yield dramatic improvements in energy efficiency.  I expect gains significant enough that many devices will achieve useful battery life equal to useful product life, making battery charging and replacement unnecessary.  In summary, I expect to see fresh, new thinking to address the unique bandwidth and battery challenges in IoT.

 

About the Author: 

Bruce Kleinman is a senior technology/business executive and principal at FSVadvisors, and blogs at fromsiliconvalley.com.

