Powering wearables (and giving batteries a better life)
A fresh batch of growth predictions on wearables has once again highlighted the (wrist) burning question of power usage.
Around 22 million wearable devices will be shipped this year, more than double last year's 9.7 million. The market will grow from $5 billion in 2013 to $30 billion by 2018, and companies will sell more than 112 million devices by 2018. And on they go; big numbers, big money.
However, the power/performance profile for wearables remains much lower than that of smartphones because wearables, being small, can only use small batteries. Bigger batteries mean bigger devices, and that's not what the consumer wants. Most of the wearables and smartwatches already on the market have a battery life of, at best, a couple of days, and the prevailing thinking is that this isn't enough for consumers and will limit uptake.
Therefore there is intense interest in reducing battery size, increasing battery efficiency and, more generally, finding ways to power what will soon be billions of portable electronic devices.
Peak battery power
Batteries are lagging. Battery progression tends to be slow: the battery industry is estimated to be improving capacity by just 8% a year, which is painfully slow compared with the rate of innovation seen in semiconductors and the electronics industry as a whole.
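To put that 8% in perspective, here is a back-of-the-envelope sketch. The 8% figure is from the article; the two-year doubling cadence for silicon is the classic Moore's law assumption, used purely for comparison:

```python
import math

# How long does 8% a year take to double battery capacity,
# and how does that compare with silicon's ~2-year doubling cadence?
battery_rate = 0.08  # ~8% capacity improvement per year
battery_doubling_years = math.log(2) / math.log(1 + battery_rate)
print(f"Battery capacity doubles in ~{battery_doubling_years:.0f} years")  # ~9 years

silicon_doubling_years = 2.0  # classic Moore's law cadence (assumption)
print(f"Silicon improves roughly "
      f"{battery_doubling_years / silicon_doubling_years:.1f}x faster")    # ~4.5x
```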
The predominant lithium-ion phone battery has reached its peak of efficiency. Graphite has a theoretical limit to how much lithium it can hold, and today's batteries have pretty much reached it. Or, more accurately, graphite isn't the limiting factor in batteries; the cathode is. [The predominant cathode today is a LiCoO2 derivative (LCO being the original cathode Sony used when lithium-ion batteries first came to market in 1991), which has a capacity of about 140 mAh/g. Graphite has a capacity of approximately 300 mAh/g, and silicon about 4,000 mAh/g. Improvements may be made to the anode, but the cathode will still limit performance.]
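A rough calculation shows why the cathode dominates. When the two electrodes are mass-balanced to store the same charge, their specific capacities combine reciprocally, like resistors in parallel. A minimal sketch using the figures above, ignoring electrolyte, packaging and voltage effects:

```python
def cell_capacity(anode_mah_g: float, cathode_mah_g: float) -> float:
    """Combined specific capacity (mAh per gram of total active material).

    Mass-balanced electrodes combine reciprocally:
        1/C_cell = 1/C_anode + 1/C_cathode
    """
    return (anode_mah_g * cathode_mah_g) / (anode_mah_g + cathode_mah_g)

# Figures from the article: LCO cathode ~140 mAh/g,
# graphite anode ~300 mAh/g, silicon anode ~4,000 mAh/g.
print(f"Graphite + LCO: {cell_capacity(300, 140):.0f} mAh/g")   # ~95 mAh/g
print(f"Silicon  + LCO: {cell_capacity(4000, 140):.0f} mAh/g")  # ~135 mAh/g
# A 13x better anode buys only ~40% at the cell level; the cathode limits.
```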
Li-ion is generally considered a good trade-off between efficiency and safety. More energy can be gleaned by using more exotic metals or, say, hydrogen fuel cells, but these tend to be toxic, corrosive or explosive, and unlikely to be allowed in public, much less on a person's body!
Better batteries are coming; every battery manufacturer on the planet is being harassed by wearable/phone people every day. There have been many promising lab-based developments involving hydrogen, ultra-capacitors, nanotubes, graphene/silicon and exotic metals, but none of these looks likely to be commercially viable inside of three to five years. Battery-less approaches utilising energy-harvesting techniques also look very promising, but the efficiencies are not yet sufficient to power a wearable application for any useful length of time.
What it boils down to is that if you want to make a battery last longer, you need to make it bigger.
So what powers the coming generation of wearables seems unlikely to be a new generation of batteries. Instead, we think it will be a change in the architecture and power usage of ICs to make them more efficient.
To that end, the utilisation of sub-threshold design techniques is, in essence, equivalent to increasing the size of a battery by an order of magnitude or more. Consumers demand better battery life. Depending on the device, that means weeks rather than days; months rather than weeks; years rather than months. That's the promise of sub-threshold design.
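For a sense of where "an order of magnitude" comes from: dynamic switching energy in CMOS scales with the square of the supply voltage, so running logic below the transistor threshold voltage (hundreds of millivolts instead of around a volt) slashes energy per operation. A minimal sketch, with illustrative assumed values rather than figures from the article:

```python
# Dynamic switching energy per operation scales as E = C * Vdd^2,
# so lowering the supply voltage cuts energy quadratically.
def dynamic_energy(c_farads: float, vdd: float) -> float:
    """Switching energy per operation (joules), E = C * Vdd^2 scaling."""
    return c_farads * vdd ** 2

C = 1e-12                            # 1 pF of switched capacitance (example)
nominal = dynamic_energy(C, 1.0)     # ~1.0 V nominal supply (assumed)
subvt   = dynamic_energy(C, 0.3)     # ~0.3 V, below a typical threshold voltage
print(f"Energy ratio: {nominal / subvt:.1f}x")  # ~11x less energy per operation
# For the same battery and workload, ~11x less energy per operation
# reads like an order-of-magnitude bigger battery: the claim above.
```

The trade-off is speed: sub-threshold circuits switch far more slowly, which is acceptable for many always-on, low-duty-cycle wearable workloads.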