Understanding Articles: A Wearable Device for Indoor Imminent Danger Detection and Avoidance With Region-Based Ground Segmentation
It may have promising applications for my research.
The summary
A group of researchers has developed a wearable solution that helps the visually impaired navigate unfamiliar and complex indoor spaces such as offices or even malls. They make use of a depth camera and a small speaker that provides acoustic feedback, telling the user that there's an obstacle in a particular direction along their path.
The general flow
The flow diagram shows the steps that the whole solution follows; there are five main stages (a rough code skeleton follows the list):
- Image and IMU data acquisition
- Real-time orientation and height estimation
- Region-based ground segmentation
- Object detection
- 3D acoustic feedback
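To keep these stages straight in my head, here's a minimal Python skeleton of the flow. Every function body here is a placeholder of my own, not the authors' implementation; it just shows how the five stages chain together.

```python
import numpy as np

def acquire_frame():
    """Stage 1: image and IMU data acquisition (stand-in values)."""
    depth = np.zeros((480, 640), dtype=np.uint16)  # depth frame, in millimetres
    imu = {"accel": np.array([0.0, -9.81, 0.0]), "gyro": np.zeros(3)}
    return depth, imu

def estimate_orientation_and_height(imu):
    """Stage 2: real-time orientation and height estimation from the IMU."""
    # Placeholder: pitch/roll from the gravity vector, fixed height guess.
    ax, ay, az = imu["accel"]
    pitch = np.arctan2(ax, np.sqrt(ay**2 + az**2))
    roll = np.arctan2(az, -ay)
    height_m = 1.4  # assumed camera height for a chest-mounted wearable
    return pitch, roll, height_m

def segment_ground(depth, pitch, roll, height_m):
    """Stage 3: region-based ground segmentation (crude stand-in).

    The paper's region-based method is more involved; this stub just
    returns a mask of valid depth pixels and ignores the pose inputs.
    """
    return depth > 0

def detect_objects(depth, ground_mask):
    """Stage 4: object detection on the non-ground region."""
    obstacles = ~ground_mask
    return obstacles

def acoustic_feedback(obstacles):
    """Stage 5: map obstacle direction to a 3D audio cue (stub)."""
    if obstacles.any():
        print("beep: obstacle ahead")

def run_once():
    depth, imu = acquire_frame()
    pitch, roll, h = estimate_orientation_and_height(imu)
    ground = segment_ground(depth, pitch, roll, h)
    obstacles = detect_objects(depth, ground)
    acoustic_feedback(obstacles)

run_once()
```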
What can I use for my research?
If you've been following for some time (and have a good memory, since I mentioned this in my very first article), you might remember that I was inspired to build an AI model to detect anomalies such as falls for the elderly and disabled in indoor scenarios. Hopefully, once I get fall detection right, I can share with you my journey with other signals such as heart rate. In any case, I'd like to adapt this solution to not be a wearable, but a stationary camera solely for fall detection, at least for now. So what is my idea?
Instead of using the depth camera as a wearable, I will work with the assumption that the camera is in a fixed place. Why? So I can test my fall detection solution against multiple public datasets as well as my own constructed dataset. A fixed camera also simplifies things, as the sketch below shows.
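One concrete benefit of the fixed camera: the floor plane can be fitted once with RANSAC and then cached, instead of re-estimating orientation and height every frame as the wearable must. A minimal sketch using Open3D, where the intrinsics and the input file are my own assumptions, not values from the paper:

```python
import numpy as np
import open3d as o3d

# Assumed PrimeSense-style intrinsics; swap in your camera's calibration.
intrinsic = o3d.camera.PinholeCameraIntrinsic(
    o3d.camera.PinholeCameraIntrinsicParameters.PrimeSenseDefault)

# Hypothetical 16-bit depth frame (millimetres) saved beforehand.
depth_np = np.load("first_depth_frame.npy")
depth = o3d.geometry.Image(np.ascontiguousarray(depth_np.astype(np.uint16)))

pcd = o3d.geometry.PointCloud.create_from_depth_image(depth, intrinsic)

# Fit the dominant plane (the floor, if it covers enough of the view).
plane_model, inliers = pcd.segment_plane(
    distance_threshold=0.02, ransac_n=3, num_iterations=1000)
a, b, c, d = plane_model
print(f"floor plane: {a:.3f}x + {b:.3f}y + {c:.3f}z + {d:.3f} = 0")

# Because the camera never moves, (a, b, c, d) can be cached and every
# later frame's points classified as ground / non-ground against it.
```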
What datasets can I use? The URFD dataset, which does include depth camera data, and the Fall Detection Dataset; both are datasets with high-impact papers behind them. A sketch for loading URFD depth frames follows.
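Here's how I plan to iterate over URFD depth frames. The directory layout is an assumption on my part (depth frames extracted as 16-bit PNGs under data/urfd/); adjust the path to wherever you unpack the dataset's archives.

```python
import glob
import cv2

frame_paths = sorted(glob.glob("data/urfd/fall-01-cam0-d/*.png"))
for path in frame_paths:
    # IMREAD_ANYDEPTH keeps the 16-bit depth values instead of
    # squashing them down to 8 bits.
    depth = cv2.imread(path, cv2.IMREAD_ANYDEPTH)
    print(path, depth.shape, depth.dtype, depth.max())
```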
In the next couple of articles, I'll go part by part through what was covered in the paper, with my own code and with the help of ChatGPT. Why? Because I'm working and studying at the same time, and I don't have money for an assistant. OK, that's it for now. Thanks for reading :)
References
[1] A Wearable Device for Indoor Imminent Danger Detection and Avoidance With Region-Based Ground Segmentation. https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9211506