Our son Julian has cerebral palsy and will soon be undergoing surgery on his left leg. The surgeons will rotate and lengthen his femur, lower his kneecap, and release both his calf and hamstring muscles. The post-op rehabilitation for this major, invasive surgery will be substantial.
To avoid any surgical intervention, Julian has done physio almost every day of his life for 14 years. I'm proud to say that, until this teenage growth spurt, he had worked hard enough to avoid surgery altogether. But with his recent growth, he has been losing the battle.
In preparation for the rehabilitation period after the surgery, I started work on an AR project that I'm hoping will help him learn to walk properly again.
The project is a VR headset. An external camera on the headset displays his surroundings so he can see where he is walking. Superimposed over that video is an animated representation of the precise movement of his feet. So, as he walks, he will see exactly how his feet are moving through space in real time. The ultimate goal is to let him walk during his post-op rehabilitation without having to imagine where his feet are…because he will see them as they move!
How it works
At the heel of each foot are two I2C sensors: 1) a BNO055 accelerometer/gyroscope, and 2) a Pololu VL53L1X time-of-flight (ToF) distance sensor.
The accel/gyro determines the direction and motion of the foot; the ToF sensor reads the height of the foot off the ground.
I've separated the overhead by having the four foot sensors read by a Teensy 4.1 (Arduino IDE), housed in a case at his belt. Those sensor readings are sent over serial to a Raspberry Pi in the headset, which handles the video and the animation. Both are done with Processing.
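For what it's worth, the serial link between the Teensy and the Pi just needs an agreed-on frame format. I haven't described one above, so everything below (the field names and the CSV layout) is an assumption — a minimal sketch of one way to pack and parse a per-frame reading in plain C++:

```cpp
#include <cstdio>

// One frame of foot-tracking data sent from the Teensy to the Pi.
// The field set here is an assumption -- pick whatever your animation needs.
struct FootFrame {
    float pitchL, rollL, yawL;  // left-foot orientation from the BNO055 (degrees)
    float distL;                // left-foot height off the ground from the ToF (mm)
    float pitchR, rollR, yawR;  // right foot
    float distR;
};

// Serialize a frame as one newline-terminated CSV line, e.g.
// "F,12.50,-3.20,91.00,48.0,11.00,2.10,89.50,12.0\n"
int packFrame(const FootFrame &f, char *buf, size_t len) {
    return snprintf(buf, len, "F,%.2f,%.2f,%.2f,%.1f,%.2f,%.2f,%.2f,%.1f\n",
                    f.pitchL, f.rollL, f.yawL, f.distL,
                    f.pitchR, f.rollR, f.yawR, f.distR);
}

// Parse one received line back into a frame; returns true on success.
bool parseFrame(const char *line, FootFrame &f) {
    return sscanf(line, "F,%f,%f,%f,%f,%f,%f,%f,%f",
                  &f.pitchL, &f.rollL, &f.yawL, &f.distL,
                  &f.pitchR, &f.rollR, &f.yawR, &f.distR) == 8;
}
```

A text format like this is easy to eyeball in a serial monitor while debugging; a fixed-size binary struct could replace it later if bandwidth ever becomes a problem.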
Because of the distance between the two pairs of sensors (one pair per foot, at least 1.5 meters apart), each pair runs on a separate I2C bus to the Teensy.
1) The processing.video.* library isn't working with Processing 4 on my Pi 5, and I don't know what I'm doing wrong, or how to get by without it.
2) Reading both sensors on the main bus is okay, and I'm able to read the first sensor on the second bus…but I don't know how to reference the second sensor on the second bus.
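In case it helps anyone suggest a fix: I believe both of the common drivers let you name the bus explicitly. The Adafruit_BNO055 constructor accepts a TwoWire pointer, and Pololu's VL53L1X library has setBus(). The sketch below is a guess at the wiring I'd try (Teensy 4.1 exposes Wire, Wire1, and Wire2; addresses shown are the defaults — 0x28 for the BNO055, 0x29 for the VL53L1X — so the two sensors don't collide even on the same bus):

```cpp
#include <Wire.h>
#include <Adafruit_BNO055.h>
#include <VL53L1X.h>  // Pololu's library

// Left foot on the Teensy's default bus (Wire), right foot on Wire1.
// The same addresses can be reused because the buses are separate.
Adafruit_BNO055 bnoLeft(55, 0x28, &Wire);
Adafruit_BNO055 bnoRight(56, 0x28, &Wire1);  // <-- "second sensor on the second bus"
VL53L1X tofLeft, tofRight;

void setup() {
    Wire.begin();
    Wire1.begin();

    bnoLeft.begin();
    bnoRight.begin();

    tofLeft.setBus(&Wire);    // the default; shown for symmetry
    tofRight.setBus(&Wire1);  // point the Pololu driver at the second bus
    tofLeft.init();
    tofRight.init();
    tofLeft.startContinuous(50);   // one ranging measurement every 50 ms
    tofRight.startContinuous(50);
}

void loop() {
    sensors_event_t e;
    bnoRight.getEvent(&e);          // orientation of the right foot
    uint16_t mm = tofRight.read();  // right-foot height off the ground (mm)
    // ... pack both feet into a frame and send over Serial to the Pi ...
}
```

This is a hardware-bound sketch I can't verify off the bench, so treat the pin/bus mapping as a starting point rather than a known-good answer.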
3) Calibrating the sensors seems to be a pain!
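On the calibration pain specifically: if the problem is the BNO055's re-calibration dance on every power-up, the Adafruit driver can report per-subsystem calibration status (0–3) and read back the fitted offsets, which you can then restore at boot. A rough sketch of that idea (API names from the Adafruit_BNO055 library; untested on my hardware):

```cpp
#include <Wire.h>
#include <Adafruit_BNO055.h>

Adafruit_BNO055 bno(55, 0x28, &Wire);

void setup() {
    Serial.begin(115200);
    Wire.begin();
    bno.begin();

    // Wave the sensor through its calibration motions until every
    // subsystem (system, gyro, accel, mag) reports 3 out of 3.
    while (!bno.isFullyCalibrated()) {
        uint8_t sys, gyro, accel, mag;
        bno.getCalibration(&sys, &gyro, &accel, &mag);
        Serial.printf("sys=%u gyro=%u accel=%u mag=%u\n", sys, gyro, accel, mag);
        delay(250);
    }

    // Dump the offsets once; hard-code them (or store them in EEPROM)
    // and restore with bno.setSensorOffsets() on later boots.
    adafruit_bno055_offsets_t offsets;
    bno.getSensorOffsets(offsets);
    Serial.printf("accel offsets: %d %d %d\n",
                  offsets.accel_offset_x, offsets.accel_offset_y,
                  offsets.accel_offset_z);
}

void loop() {}
```

Restoring saved offsets won't eliminate calibration entirely (the magnetometer in particular drifts with the environment), but it should make startup much less of a pain.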
Those are my three main problems at this point. But I’m sure there are more lurking around the corner!
Any help would be greatly appreciated.
Have a Merry Christmas!