April 13, 2017
For many people with limited mobility, alerting others or getting help after a fall is a serious concern. Syracuse University Professor Senem Velipasalar began looking at the issue in 2011, and realized the camera-based monitoring systems at the time were using static, wall-mounted cameras installed at fixed locations.
“Existing camera-based person monitoring systems were using static cameras installed in different rooms. They were watching you,” said Velipasalar. “The monitoring was confined to where the cameras were installed.”
Velipasalar and her two doctoral students started working on a new way to monitor not only falls but also other types of activities.
“What if we used images from a wearable camera?” asked Velipasalar. The camera would face outward toward the scene and capture images of the wearer’s surroundings. This not only extends monitoring to wherever the wearer travels, but also alleviates subjects’ privacy concerns.
Initially, the team experimented with a battery-powered unit containing a camera and a microprocessor. The prototype was successful, but as smart phones added more computing power and incorporated different types of sensors, such as accelerometers and cameras, Velipasalar and her students shifted their concept.
“Our phones have everything we need,” said Velipasalar. “These things were not possible a decade ago, but now you can run sophisticated algorithms on mobile devices.”
In Velipasalar’s lab, the team developed an algorithm that uses data from a smart phone’s built-in accelerometer and images from its built-in camera, and processes this data on the phone itself. A student researcher demonstrated how it worked with the phone worn in a belt unit.
Velipasalar and her students found that fall detection systems relying only on accelerometer data had high rates of false positives. Riding in a fast-moving vehicle or an elevator, or bumping into a piece of furniture, would often register as a fall. When images from the camera of a phone attached to a belt were processed and fused with the accelerometer results, false alerts were reduced.
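The fusion idea can be sketched in a few lines. This is a minimal illustration, not the Syracuse team’s actual algorithm: the thresholds, the free-fall/impact heuristic, and the frame-difference check are all assumptions standing in for the real accelerometer and image features.

```python
import numpy as np

G = 9.81  # gravity, m/s^2

def accel_suggests_fall(accel_mag, impact_thresh=2.5 * G, free_fall_thresh=0.4 * G):
    """Flag a candidate fall from accelerometer-magnitude samples (m/s^2).
    A near-free-fall dip followed by a hard impact is a common heuristic;
    the thresholds here are illustrative, not values from the patented system."""
    accel_mag = np.asarray(accel_mag, dtype=float)
    dips = np.where(accel_mag < free_fall_thresh)[0]
    if dips.size == 0:
        return False
    # Look for an impact spike shortly after the first dip.
    window = accel_mag[dips[0]:dips[0] + 50]
    return bool(np.any(window > impact_thresh))

def camera_confirms_fall(frame_before, frame_after, motion_thresh=60.0):
    """Confirm with the wearable camera: a real fall changes the view of the
    scene drastically. Mean absolute frame difference is used here as a crude
    stand-in for whatever image features the actual system computes."""
    diff = np.abs(frame_after.astype(float) - frame_before.astype(float))
    return bool(diff.mean() > motion_thresh)

def detect_fall(accel_mag, frame_before, frame_after):
    """Fuse the two cues: alert only when both sensors agree, which suppresses
    false positives from elevators, vehicles, or bumped furniture."""
    return accel_suggests_fall(accel_mag) and camera_confirms_fall(frame_before, frame_after)
```

For example, a jolt that trips the accelerometer check but leaves the camera view unchanged (a bumped table, say) is rejected because `camera_confirms_fall` returns `False`.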
The team also started looking at other potential uses for the new system, such as footstep counting and traveled distance calculation. They tested their wearable system against other step counting applications running on smart phones or smart watches. Comparisons showed that accelerometer-based apps had an average of 21% error in step counting, and 46 to 52% error in traveled distance calculation. Their approach of combining camera and accelerometer data reduced these error rates to 3.75% and 3.48%, respectively.
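A hedged sketch of why the camera helps with distance: accelerometer-only apps must assume a fixed stride length, whereas a camera can estimate how far the wearer actually moved between steps. The peak-crossing step counter and the per-step displacements below are illustrative assumptions, not the team’s published method.

```python
def count_steps(accel_mag, thresh=11.0, min_gap=5):
    """Count steps as upward crossings of an accelerometer-magnitude
    threshold (m/s^2), with a refractory gap so one step is not counted
    twice. Illustrative only."""
    steps = 0
    last = -min_gap
    for i in range(1, len(accel_mag)):
        if accel_mag[i - 1] <= thresh < accel_mag[i] and i - last >= min_gap:
            steps += 1
            last = i
    return steps

def traveled_distance(step_displacements_m):
    """With a wearable camera, per-step displacement can be estimated from
    how the scene shifts between frames (e.g., via optical flow), instead of
    assuming a fixed stride length as accelerometer-only apps must."""
    return float(sum(step_displacements_m))
```

Summing measured per-step displacements rather than multiplying a step count by an assumed stride is the kind of correction that could account for the large gap in distance error the team reported.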
On February 14, 2017, Velipasalar, Akhan Almagambetov (MS 2011, Ph.D. 2013) and Mauricio Casares (Ph.D. 2014) were granted a patent for “Automatic detection by a wearable camera.”
Velipasalar believes the technology could be advanced in the future to where a smart phone could interpret what the camera is seeing.
“It has many application areas. You could take this system and put it on a ground robot or on an unmanned aerial vehicle,” said Velipasalar. “Depending on where you place the system, it can be used for different purposes including activity monitoring, driver assistance, autonomous navigation, surveillance or infrastructure inspection.”