2D backbone pretraining details #128

Open

Deng-Y opened this issue Mar 28, 2021 · 0 comments

Comments

Deng-Y commented Mar 28, 2021

Hello,

May I ask for more details regarding the 2D backbone pretraining? Quoting from the paper:

The 2D backbone for Human3.6M was pretrained on the COCO dataset [10] and finetuned jointly on MPII and Human3.6M for 10 epochs using the Adam optimizer with 10^−4 learning rate.

I have the following questions:

  1. What loss function do you use for pretraining?
  2. The setting "for 10 epochs using the Adam optimizer with 10^−4 learning rate" seems to apply only to the fine-tuning on MPII and Human3.6M; what settings did you use for the COCO dataset? (My attempt at reproducing the fine-tuning stage is sketched after this list.)
  3. Can you provide the pretraining code or recommend a GitHub repository for pretraining?
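
For reference, my current attempt at reproducing the fine-tuning stage looks roughly like the sketch below. Only the Adam optimizer, the 10^−4 learning rate, and the 10 epochs come from the paper; the MSE heatmap loss, the 17-joint output, and the stand-in backbone/loader are my own assumptions:

```python
import torch
import torch.nn as nn

# Stand-in 2D backbone: in my run this is the COCO-pretrained heatmap network
# from this repo; a single conv layer stands in here so the sketch stays
# self-contained. 17 joint heatmaps is my assumption.
backbone = nn.Conv2d(3, 17, kernel_size=3, padding=1)

criterion = nn.MSELoss()  # heatmap MSE -- my guess at the pretraining loss
optimizer = torch.optim.Adam(backbone.parameters(), lr=1e-4)  # Adam, 1e-4 (paper)

# `loader` would yield (image, target_heatmap) batches mixed from MPII and
# Human3.6M; one random batch stands in here.
loader = [(torch.randn(2, 3, 64, 64), torch.randn(2, 17, 64, 64))]

for epoch in range(10):  # "for 10 epochs" as quoted from the paper
    for images, target_heatmaps in loader:
        pred_heatmaps = backbone(images)
        loss = criterion(pred_heatmaps, target_heatmaps)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```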

I found that pretraining is very important: when I trained the volumetric model without loading pretrained weights into the 2D backbone, the relative MPJPE increased drastically from ~20 to ~30. Thus, 2D backbone pretraining seems essential.
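
For context, this is roughly how I initialize the 2D backbone from a pretrained checkpoint in the normal run; the checkpoint path, the key layout, and the stand-in module are placeholders from my own setup:

```python
import torch
import torch.nn as nn

# Stand-in for the volumetric model's 2D backbone (placeholder module).
backbone = nn.Conv2d(3, 17, kernel_size=3, padding=1)

# Placeholder checkpoint path; some checkpoints wrap the weights in a
# "state_dict" key, so unwrap it if present.
checkpoint = torch.load("pretrained_2d_backbone.pth", map_location="cpu")
state_dict = checkpoint.get("state_dict", checkpoint)

# strict=False so keys belonging only to the volumetric part do not raise;
# printing the mismatches shows whether the 2D weights were actually loaded.
missing, unexpected = backbone.load_state_dict(state_dict, strict=False)
print("missing keys:", missing)
print("unexpected keys:", unexpected)
```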

Looking forward to hearing from you. Thank you!
