HCD module regression / Broken, Symmetric Incorrect arm placement behind torso on MNET3 #116
The issue of inverted or broken arms bending the wrong way stems, at its core, from the mathematical ambiguity of regressing a 3D structure from 2D keypoints. MocapNET receives 2D points as input and regresses a 3D BVH skeleton that corresponds to the 2D observations. Mathematically, however, it is almost impossible for the method to discern, e.g., the following four poses:
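The ambiguity described above can be sketched in a few lines (an illustrative toy example, not MocapNET code): under a near-orthographic projection, a joint at depth +z and its mirror at -z project to the same 2D point, so the 2D input alone cannot distinguish an arm bent forward from one bent backward.

```python
import numpy as np

def project_orthographic(points_3d):
    """Drop the depth (z) coordinate: (x, y, z) -> (x, y)."""
    return points_3d[:, :2]

# Hypothetical elbow positions: same x/y, opposite depth.
elbow_forward  = np.array([[0.3, 0.5,  0.2]])  # arm bent forward
elbow_backward = np.array([[0.3, 0.5, -0.2]])  # arm bent backward

# Both poses yield identical 2D observations:
assert np.allclose(project_orthographic(elbow_forward),
                   project_orthographic(elbow_backward))
```

This is why the network needs priors (or the IK/HCD stage) to pick a plausible depth sign, and why errors there manifest as arms flipped behind the torso.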
This particular dataset used to work fine, but it seems some recent optimizations in the HCD module (for MNET4) now adversely affect MNET3, returning solutions with incorrect arm orientations. I know the cause is the HCD module and not the neural network, because when I disable the IK module with the --noik flag, as you will see, taking just the raw output of the neural network, the hands bend forward.

TL;DR: https://github.com/FORTH-ModelBasedTracker/MocapNET/assets/97630/78916d63-fa43-4591-ac57-c14ec3e6b7e2

Sorry about the inconvenience; maintaining the same HCD module across 4 different branches (of which MNET4 is written in Python and has another wrapper) is a nightmare, but I will do my best to resolve this. Looking forward to comments or "better" workarounds in case you can help with this.
What am I doing wrong here? Also, does it support OpenPose's 25-keypoint (BODY_25) model outputs with the face and the whole body?
Sorry about this; you are not doing anything wrong. I think this is caused by programmatically setting the abdomen and chest BVH joints to zero to nail the armature (done here: https://github.com/FORTH-ModelBasedTracker/MocapNET/blob/mnet4/src/python/mnet4/MocapNET.py#L600). The MNET4 code on GitHub was automatically copied from my dev branch, and this got carried over. Unfortunately, due to preparing my PhD thesis and being on back-to-back trips for the last 3 months, I am having trouble maintaining the code, resulting in the sub-par quality and problems you encountered. That being said, I will try to fix this.
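For readers following along, the kind of bug described above can be illustrated with a hypothetical sketch (names and data are illustrative, not the actual MocapNET.py code): overwriting predicted joint angles with zeros discards the network's output for those joints, pinning part of the armature to the rest pose.

```python
def nail_armature(bvh_angles, joints_to_zero=("abdomen", "chest")):
    """Force selected joints to the rest pose by zeroing their rotation channels."""
    fixed = dict(bvh_angles)
    for joint in joints_to_zero:
        for axis in ("x", "y", "z"):
            fixed[f"{joint}_{axis}"] = 0.0
    return fixed

# Hypothetical network predictions (degrees):
predicted = {"abdomen_x": 12.0, "abdomen_y": -3.0, "abdomen_z": 1.5,
             "chest_x": 7.0, "chest_y": 0.4, "chest_z": -2.1,
             "rShldr_x": 35.0}

nailed = nail_armature(predicted)
assert nailed["abdomen_x"] == 0.0 and nailed["chest_y"] == 0.0
assert nailed["rShldr_x"] == 35.0  # other joints are untouched
```

If such a zeroing step is applied unconditionally, any motion the network predicted for the torso is silently lost downstream.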
Also, another issue I remembered, though I am not sure it is related, is this:
Yes. That being said, MediaPipe is used as the pose-estimation source in the Google Colab notebook since it is faster and more portable, and the MediaPipe joints are "cast" to the OpenPose ones.
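The "casting" mentioned above can be sketched as a partial index remap (an illustrative mapping for demonstration only; the actual conversion table used by MocapNET may differ). MediaPipe Pose uses 33 landmarks while OpenPose BODY_25 uses 25, with derived joints like Neck and MidHip computed as midpoints.

```python
# Partial MediaPipe-landmark -> BODY_25-index mapping (illustrative).
MEDIAPIPE_TO_BODY25 = {
    0: 0,    # nose           -> Nose
    12: 2,   # right_shoulder -> RShoulder
    14: 3,   # right_elbow    -> RElbow
    16: 4,   # right_wrist    -> RWrist
    11: 5,   # left_shoulder  -> LShoulder
    13: 6,   # left_elbow     -> LElbow
    15: 7,   # left_wrist     -> LWrist
    24: 9,   # right_hip      -> RHip
    23: 12,  # left_hip       -> LHip
}

def cast_to_body25(mp_landmarks):
    """Return a 25-slot list of (x, y) points; unmapped joints stay None.
    Derived joints (Neck=1, MidHip=8) are midpoints of shoulders/hips."""
    body25 = [None] * 25
    for mp_idx, op_idx in MEDIAPIPE_TO_BODY25.items():
        if mp_idx < len(mp_landmarks):
            body25[op_idx] = mp_landmarks[mp_idx]
    if body25[2] is not None and body25[5] is not None:
        body25[1] = tuple((a + b) / 2 for a, b in zip(body25[2], body25[5]))
    if body25[9] is not None and body25[12] is not None:
        body25[8] = tuple((a + b) / 2 for a, b in zip(body25[9], body25[12]))
    return body25
```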
Just an update: this seems to be an issue with the Blender script and not the main MocapNET BVH file. It is caused by some joints not being flagged for mirroring, so they remain set to 0.
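A minimal sketch of the mirroring fix described above (a hypothetical helper, not the actual Blender script): a common convention for mirroring a joint's Euler rotation across the character's sagittal (left/right) plane negates the yaw and roll components while keeping the pitch. A joint omitted from the mirror pass never gets written, so its channels stay at 0, which matches the symptom reported here.

```python
def mirror_rotation(x_deg, y_deg, z_deg):
    """Mirror an Euler rotation (degrees) across the sagittal plane
    by negating yaw (y) and roll (z); pitch (x) is preserved."""
    return (x_deg, -y_deg, -z_deg)

# Hypothetical per-joint rotations; only joints present in this dict get mirrored.
source_rotations = {"rShldr": (10.0, 20.0, 30.0)}
mirrored = {joint: mirror_rotation(*rot) for joint, rot in source_rotations.items()}
assert mirrored["rShldr"] == (10.0, -20.0, -30.0)
```

The exact sign pattern depends on the skeleton's rotation order and axis conventions, so this is a sketch of the idea rather than a drop-in fix.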
Just to make sure this is the case, I opened the generated BVH file using a third-party BVH utility, BVHacker: test1.mp4. And here's the debug output video generated: livelastRun3DHiRes.mp4
Hey, yeah, I am experiencing the same problem! I thought this was stable!?
So what can I do in Blender? Should I create a script to solve the problem? Can you tell me how to finish the code? I have no idea.
It works now, but the output joints from the BVH file are messed up, as you can see in the image below.
Originally posted by @justinjohn0306 and @ArEnSc in #115 (comment)