Sensor sensibility: NSF-supported Alsaleem developing power-saving tech

Scott Schrage, September 11, 2019

If Fadi Alsaleem’s next three years go to plan, the technology he develops should have the power to help preserve life: first in batteries, then in the golden years.

The University of Nebraska–Lincoln engineer understands better than most how artificial intelligence, machine learning and advanced algorithms have put the “smart” in smartphones, smartwatches and smart homes. But he also understands the cost of such processing power, which itself demands ample electric power.

As devices continue getting smaller yet more sophisticated, he says, something has to give: space, functionality or, often, battery life.

“We have many sensors in our devices, and each of those sensors is using tons of data,” said Alsaleem, assistant professor of architectural engineering and construction at Nebraska. “And not just your phone. Everything now is at least kind of smart: your car, the refrigerator. They produce a ton of information.”

With an insect-inspired approach and more than $500,000 from the National Science Foundation, Alsaleem and colleagues in Texas now think they can lower power consumption by as much as 100-fold in wearable technology that detects falls among the elderly.

And in Alsaleem’s mind, that’s just the beginning.

MEMS the word

Alsaleem’s approach represents an extension of the technological field he’s tilled for more than 10 years: micro-electro-mechanical systems, or MEMS. True to their name, MEMS are microscopic systems that marry sensors or other mechanical parts with electronic components, usually microprocessors, designed to make sense of and act on the data collected by the former.

In the mid-2000s, Alsaleem developed a microscopic switch whose movement signals when acceleration or deceleration exceeds a certain threshold — a useful ability in deploying airbags during crashes. That switch was responsible only for alerting a CPU when a given threshold was surpassed. In that way, it behaved as virtually all MEMS-related mechanical components have to this point: sensing changes in state or environment but leaving computational decision-making to processors.
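In software terms, that threshold behavior can be sketched as follows. This is a hypothetical Python model for illustration only: the real device is a mechanical element, not code, and the function name and threshold value are assumptions, not details of Alsaleem's design.

```python
# Illustrative model of a MEMS acceleration switch: it "fires" (closes)
# only when acceleration crosses a set threshold, leaving all further
# decision-making to the CPU. The threshold value is an assumption.

AIRBAG_THRESHOLD_G = 50.0  # illustrative deployment threshold, in g

def switch_fires(acceleration_g: float, threshold_g: float = AIRBAG_THRESHOLD_G) -> bool:
    """Return True when the switch would close and alert the CPU."""
    return abs(acceleration_g) >= threshold_g

# Normal driving stays well below the threshold; a crash exceeds it.
print(switch_fires(1.2))    # False
print(switch_fires(60.0))   # True
```

The key point the sketch captures is how little the switch itself decides: it reports a single yes/no event, and everything else falls to the processor.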

Then, just a couple of years ago, Alsaleem began toying with an idea that made his fellow engineers blink: Could mechanical sensors take on some of the processing burden, too? Offloading even some of that basic responsibility onto mechanical components, he thought, could also reduce overall power consumption.

“When I was talking at conferences, people would think I’m crazy — that this was very far away,” Alsaleem said.

Despite the initial skepticism, Alsaleem has proposed a plan that might just be crazy enough to work.

The new system incorporates a switch design similar to the one he previously developed. This time, though, Alsaleem and his colleagues will be constructing a series of paired switches to detect more complex movement profiles.

Collectively, the system will behave somewhat like the neural network of a brain, tracking both the activation of switches and the sequence of those activations throughout the network. Differential equations and machine learning will help train the system to distinguish falling onto a floor from, say, plopping onto a couch or easing onto a mattress. Measuring the duration of acceleration, along with the motion occurring before and after an abrupt increase, should help in making those distinctions.
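The kind of distinction described above — an abrupt spike in acceleration, interpreted in light of the motion before and after it — can be illustrated with a simplified software sketch. The team's system encodes this logic mechanically, in paired switches, so the code below is not their algorithm; every threshold, window length, and feature here is an assumption made for illustration.

```python
# Illustrative (assumed) logic for separating a fall from, say, plopping
# onto a couch: look for a sharp spike, then check whether the motion
# that follows is near-stillness (suggesting a fall) or continued activity.

def classify_event(samples, rate_hz=50, spike_g=3.0, quiet_after_s=2.0):
    """samples: acceleration magnitudes in g, one reading per sample."""
    # Find the first abrupt spike above the threshold.
    spike_idx = next((i for i, a in enumerate(samples) if a >= spike_g), None)
    if spike_idx is None:
        return "no event"
    # After a fall, the sensor often reads ~1 g (gravity alone) for a
    # while; sitting down hard is usually followed by more movement.
    window = int(quiet_after_s * rate_hz)
    after = samples[spike_idx + 1 : spike_idx + 1 + window]
    if after and all(abs(a - 1.0) < 0.15 for a in after):
        return "possible fall"
    return "normal activity"
```

For example, a spike followed by two seconds of readings near 1 g would classify as a possible fall, while the same spike followed by continued fluctuation would not.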

But whereas a brain processes the signals sent from nerves — a model for the conventional but power-intensive setup of CPUs and sensors — the team is designing its mechanical sensors to effectively act as both nerve and brain. That makes the team’s approach more akin to the nervous system of many insects, Alsaleem said. Though insects have relatively few neurons and comparatively small brains, they evolved to perform surprisingly complex computations by essentially embedding neurons in appendages or other sensory organs.

Alsaleem, alongside colleagues Roozbeh Jafari of Texas A&M University and Siavash Pourkamali from the University of Texas at Dallas, is aiming to do the same.

“For the first time, we’re moving this data processing to the sensor level,” Alsaleem said. “Rather than the sensor giving you (just) raw information like acceleration or temperature, it can also give you higher-level information.

“This work is about enabling many of the emerging technologies, such as wearable devices, to perform very complex algorithms while consuming remarkably low power.”

The inspiration for applying the potential breakthrough to a fall-detecting device struck Alsaleem after a conversation with a colleague in his department. One out of four American adults aged 65 and older will suffer a fall, according to the Centers for Disease Control and Prevention, making it the leading cause of fatal and non-fatal injury among older adults.

“He was talking about his dad, who is fairly old,” Alsaleem said. “And he was saying that the Apple Watch has this nice new feature that can detect falls, but his dad isn’t using it for that. I asked him why. He said, ‘Because the watch needs to be charged each day, and my dad forgets.’

“So even though his dad has this watch, it’s of no use to him. (My colleague) was saying, ‘I wonder if there’s any new technology that can do the job without the need for my dad to keep charging it.’ And that’s where things clicked for me.”

Alsaleem also sees potential for the sensor-as-processor innovation to find a place in all sorts of other technological domains. As an architectural engineer, he’s already considering how to incorporate it in wearable devices that would read, process and transmit physiological indicators of discomfort to a smart home’s HVAC system, which could then heat or cool a specific room accordingly. And he envisions it reducing the number of bulk-adding processors needed in soft robotics, which often feature an array of sensors and nearly unlimited range of movement that together put serious computational strain on existing systems.

“The application for this is very wide,” Alsaleem said. “Anything that relies on data to make some conclusion or judgment or decision, that’s where this fits — regardless of the size, regardless of the location.

“MEMS have been used as sensors everywhere, so everybody’s familiar with that. What we’re doing is just adding a small layer (for) computing. That’s the beauty of our idea.”
