This is a tool built using PeekingDuck's (v1.2.0) pose estimation to detect changes in sitting posture in a video recording.
This section explains the setup for a default run of the submitted solution, which detects the posture of a person facing right in the video.
- Clone repository.
$ git clone https://github.com/JoyLinWQ/posture_corrector.git
$ cd posture_corrector/pose_estimation
- Create a virtual environment (python 3.6 to 3.9) and install PeekingDuck. For Apple Silicon Mac users, follow Custom Install here.
$ conda create --name my_pd python=3.9
$ conda activate my_pd
$ pip install -U peekingduck
- Run Posture Corrector!
$ peekingduck run
- Watch the output in a new popup window (below), where pose estimation draws shoulder and hip keypoints that are used to calculate whether a particular posture is good or bad. This output will be automatically saved to
"...\posture_corrector\pose_estimation\PeekingDuck\data\output\<timestamp>.mp4"
when the run is completed.
A posture is classified as good when the shoulder keypoint is not ahead of the hip keypoint in the direction the person is facing, and as bad otherwise. A small offset is added to the hip keypoint to absorb slight differences between the estimated pose and the actual posture.
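The rule above can be sketched as a small function (a minimal illustration, not the submitted assess.py; the function name and the exact comparison are assumptions):

```python
def is_good_posture(shoulder_x, hip_x, direction="right", offset=0.05):
    """Classify posture from scaled x-coordinates (0.0-1.0) of the
    shoulder and hip keypoints.

    For a right-facing person, slouching pushes the shoulder forward
    (larger x), so posture counts as good while the shoulder stays at
    or behind the offset-adjusted hip. Mirrored for left-facing.
    """
    if direction == "right":
        return shoulder_x <= hip_x + offset
    return shoulder_x >= hip_x - offset

# Right-facing, shoulder slightly behind hip -> good
print(is_good_posture(0.48, 0.50))  # True
# Right-facing, shoulder well ahead of hip -> bad (slouched)
print(is_good_posture(0.60, 0.50))  # False
```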
This section guides more adventurous users through adding their own video data source with simple changes to the configuration file.
You can source your own video containing either right-facing or left-facing postures, and trim it to ensure that a human is present in all frames.
For videos obtained from YouTube, refer to Step 1 below.
- Convert YouTube video to MP4 using Online Video Converter.
- Trim the video to the desired frames of interest.
- Store trimmed videos in the data folder.
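The trimming step can be done locally with ffmpeg, for example (a sketch, assuming ffmpeg is installed; the filenames and timestamps are placeholders):

```shell
# Create the data folder used by the pipeline (no-op if it already exists)
mkdir -p data

# Keep only seconds 5-25 of the downloaded clip; skipped here if
# ffmpeg or the source file is not available
if command -v ffmpeg >/dev/null 2>&1 && [ -f downloaded.mp4 ]; then
  ffmpeg -i downloaded.mp4 -ss 00:00:05 -to 00:00:25 -c copy data/trimmed.mp4
fi
```

`-c copy` cuts without re-encoding, which is fast but snaps to keyframes; drop it to re-encode for frame-accurate cuts.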
Sample videos:
Two trimmed sample videos, one containing left-facing and one containing right-facing postures, are available in ...\posture_corrector\pose_estimation\data\sample\ for your use.
Before running, make 3 simple configuration changes for your custom data source in pose_estimation/pipeline_config.yml and pose_estimation/src/custom_nodes/dabble/assess.py. The sample containing a right-facing posture is used as the example here.
- Input source in
pipeline_config.yml
: Update the path to your input video source, relative to the project folder posture_corrector/pose_estimation.
Example:
- input.visual:
    source: data/sample/right.mp4
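For context, a complete pipeline_config.yml might look like the sketch below. The exact node list in the submitted solution may differ; the pose-estimation model, draw node, and media-writer settings here are illustrative assumptions:

```yaml
nodes:
  - input.visual:
      source: data/sample/right.mp4   # your custom video path goes here
  - model.posenet                     # pose-estimation model (illustrative)
  - custom_nodes.dabble.assess        # the posture-assessment custom node
  - draw.poses                        # overlay keypoints on the frame
  - output.screen                     # popup window
  - output.media_writer:              # saves the annotated video
      output_dir: PeekingDuck/data/output
```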
- Direction in
assess.py
: Update the direction in which the person is facing in the video, either "left" or "right".
Example:
DIRECTION = "right"
- Offset in
assess.py
: You may wish to adjust the offset to see which value is more suitable for your video of interest. This uses scaled x coordinates of keypoints instead of actual image coordinates.
Example:
OFFSET = 0.05
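To see what "scaled" means here: the keypoint x-coordinates are normalized to the frame width, so an OFFSET of 0.05 corresponds to 5% of the frame width regardless of resolution. A minimal illustration (the helper name is an assumption):

```python
def to_scaled_x(pixel_x, frame_width):
    """Convert a pixel x-coordinate to the 0.0-1.0 scale used by the
    keypoints, so the same OFFSET works at any resolution."""
    return pixel_x / frame_width

# A shoulder at pixel x=320 in a 640-pixel-wide frame sits at scaled x=0.5
print(to_scaled_x(320, 640))  # 0.5
# OFFSET = 0.05 therefore spans 5% of the frame width, e.g. 32 px at 640 px wide
```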
- Run:
$ cd posture_corrector/pose_estimation
$ peekingduck run
- More keypoints, such as ears and knees, could be included for a better overall assessment of the posture.
- Real-time video streaming from a camera strategically placed at the user's left or right side could be used as an input data source, and the detected posture could be sent to the user's mobile application. Desired feedback could be short warning beeps when bad posture is detected, so that the user can instantly correct their posture.
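For the live-camera idea above, PeekingDuck's input.visual node accepts a camera index in place of a file path (a sketch; whether index 0 matches your camera depends on your setup):

```yaml
- input.visual:
    source: 0   # webcam index instead of a video file path
```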
Joy Lin Email | GitHub Repository Link
Submitted: April 2022
- PeekingDuck (v1.2.0) developed by AI Singapore Computer Vision Hub
- Videos used in submission:
- Sample (right-facing): Best posture for sitting
- Sample (left-facing): How to fix & improve your sitting posture
- Solution (right-facing): Sitting posture correction