Plenty of devices run on power all the time, even when we are not using them. Sometimes all they do is monitor for the user to press a button or interact with the device, but many modern devices also maintain network connectivity. This allows them to wait for a user interaction that may come through a network connection from a computer, phone or other device. Some of these devices do so with low power consumption; some are quite wasteful. Here is a pair of contrasting examples:
Good example: network connected printer
My printer for the last ~9 years has been a network-connected laser printer. A reasonably simple one, which uses a cable connection, prints in black and white, and has a large-capacity toner cartridge. Thanks to many things going paperless, the factory-provided 1500-page toner lasted me for 7 years. Nevertheless, throughout this time the printer has been a silent, efficient and (mostly) quick-to-respond device. It’s just there in a corner, sitting silently, ready to wake up when someone clicks print on any of the computers around. It quickly integrated with the work computers when corona pushed for working from home.
See the LEDs: the manufacturer even made the effort to turn off the network card’s LEDs when the printer goes into low power mode. The connection stays up, but with the LEDs off it saves a bit more electricity.
And here is the cool thing: in the idle state, this whole device, turned on and connected to the network, constantly waiting for a command, consumes just 1.1W of power, about 4 euros of electricity per year at current prices. This power is split between the network card, the processor (which is likely in some low power mode), the power LED which remains on, and the switching-mode power supply. This is good: it is little enough, and there is likely not much more it can be reduced. If this printer were built with more modern components, the power consumption could probably be halved, if the manufacturer does not cheap out. Overall, I think this is a good balance.
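The yearly cost above is easy to check with back-of-the-envelope math. A minimal sketch, assuming an electricity price of 0.40 EUR/kWh (my assumption, chosen to match the “about 4 euro per year” figure; plug in your own tariff):

```python
# Back-of-the-envelope check of the printer's idle cost.
# PRICE_EUR_PER_KWH is an assumed tariff, not a figure from the article.
HOURS_PER_YEAR = 24 * 365        # 8760 h, always on
PRICE_EUR_PER_KWH = 0.40         # assumed electricity price

idle_watts = 1.1
kwh_per_year = idle_watts * HOURS_PER_YEAR / 1000     # ~9.6 kWh
cost_per_year = kwh_per_year * PRICE_EUR_PER_KWH      # ~3.85 EUR
print(f"{kwh_per_year:.1f} kWh/year, {cost_per_year:.2f} EUR/year")
```

At that tariff, 1.1W around the clock comes out to roughly 9.6 kWh and just under 4 euros per year.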
Bad example: robotic vacuum
My robotic vacuum needs about 43 minutes to vacuum and sweep the whole apartment. After doing what I consider a pretty good job, it consumes about 20Wh (20W for 1h) to charge back. While it is idle and waiting for a command, it consumes 2.5W. All the time, 24/7/365.
The “hole” in the graphic is when the vacuum cleaner did the vacuuming, the tall “mountain” is the energy used to charge the battery. The wide, flat “plain” is the standby energy.
So on a day when it vacuums, it consumes 20Wh for vacuuming and 55Wh on standby. On a day when it does not vacuum, that is 60Wh of standby. With 2 vacuum runs per week, the weekly total is 40Wh of vacuuming and 410Wh of standby. So in a given week, about 10% of the energy is used to vacuum, and 90% is used just waiting for the user to send a command. That 2.5W of standby power adds up over 5 years: about 44 euros.
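The weekly split can be reproduced with a few lines of arithmetic. A sketch under two assumptions of mine: roughly 2 hours of active time (vacuuming plus charging) per run, during which standby consumption pauses, and an electricity price of 0.40 EUR/kWh:

```python
# Reproducing the vacuum's weekly energy split and 5-year standby cost.
# ACTIVE_HOURS_PER_RUN and PRICE_EUR_PER_KWH are assumptions, not article data.
STANDBY_W = 2.5
CHARGE_WH_PER_RUN = 20
ACTIVE_HOURS_PER_RUN = 2          # vacuuming + charging; standby pauses then
RUNS_PER_WEEK = 2
PRICE_EUR_PER_KWH = 0.40          # assumed electricity price

standby_wh_run_day = STANDBY_W * (24 - ACTIVE_HOURS_PER_RUN)    # 55 Wh
standby_wh_idle_day = STANDBY_W * 24                            # 60 Wh

weekly_vacuum_wh = RUNS_PER_WEEK * CHARGE_WH_PER_RUN            # 40 Wh
weekly_standby_wh = (RUNS_PER_WEEK * standby_wh_run_day
                     + (7 - RUNS_PER_WEEK) * standby_wh_idle_day)  # 410 Wh
standby_share = weekly_standby_wh / (weekly_vacuum_wh + weekly_standby_wh)

# Standby over 5 years, counting every hour of every day
standby_cost_5y = STANDBY_W * 24 * 365 * 5 / 1000 * PRICE_EUR_PER_KWH
print(f"standby share: {standby_share:.0%}, "
      f"5-year standby cost: {standby_cost_5y:.0f} EUR")
```

The standby share comes out around 91%, and the 5-year standby bill lands right at the ~44 euros mentioned above.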
It may be just a tiny LED indicating there is power, but somewhere under it, the machine wastes a full 2.5W doing mostly nothing.
Here is what is bothering me: this thing wastes 90% of the energy it consumes just waiting for a command, and it should not. Just like the printer, when it is on standby it does nothing but maintain a network connection and wait for a message. But something else in it is burning power, likely because the manufacturer considered this acceptably low. Maybe it’s just the processor not being put in a low power state, some sensors remaining on, or other reasons.
This thing is so wasteful that I could just buy a remote-controlled plug and program it to be on only for the 4 hours per week when the vacuum is in use. But I like comfort and expect manufacturers to take care of efficiency by themselves.
I wish we had a law that says: any method or component that can save more money in energy than it costs to implement should be implemented in a device.
This reminds me
Back in the day of CRT TVs (you know, those big, heavy, fat ones), turning your TV off from the remote only slightly reduced its power consumption. Why? The TV was still keeping most of the circuits and electronics on, including the generation of the high voltage required for the tube. It would just mute the audio amplifier and blank the image on the screen. It was only when the One Watt Initiative was introduced that manufacturers were forced to reduce the standby consumption of many devices, including TVs. As usual, unless forced to save the users’ energy, manufacturers will just pick the way that is cheapest for them, not the one with the lowest cost of ownership.
Oh come on, this is nothing
Apply the same logic over a few devices and the little nothings add up to significant power and money over the long term. I am not asking for crème de la crème expensive technology here; I am asking for enough care from the manufacturer that a useless energy cost is not passed on to the consumer. Remember, this useless standby consumption is multiplied not only by 24 hours, 365 days and a few years, but also by the millions of devices that manufacturer sold.