Composition constraint not satisfied? #71
I'm digging a bit deeper into the code. Could it be that the `ax_client.get_trials_data_frame()` dataframe does not show the updated x5 (`1.0 - (x1 + x2 + x3 + x4)`)? In that case, the sum would appear not to add up to one in the dataframe even though the composition constraint is in fact correctly applied.
Thanks for catching this! @AndrewFalkowski, could you update the tutorial so that x5 is "hidden" from Ax's perspective? (I.e., it shouldn't appear in the search space, and it gets calculated outside of Ax before the parameters are passed into the objective function.)
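For concreteness, here is a minimal sketch of that pattern, assuming the tutorial's five-fraction search space with strength and biodegradability objectives. The experiment name, bounds, and the `evaluate` function are illustrative placeholders, not taken from the actual notebook:

```python
from ax.service.ax_client import AxClient
from ax.service.utils.instantiation import ObjectiveProperties

ax_client = AxClient(random_seed=12345)
ax_client.create_experiment(
    name="polymer_mobo",  # hypothetical name
    # only x1..x4 are exposed to Ax; x5 is implied by the composition constraint
    parameters=[
        {"name": f"x{i}", "type": "range", "bounds": [0.0, 1.0]}
        for i in range(1, 5)
    ],
    # keep the exposed fractions from exceeding 1 so that x5 = 1 - sum >= 0
    parameter_constraints=["x1 + x2 + x3 + x4 <= 1.0"],
    objectives={
        "strength": ObjectiveProperties(minimize=False),
        "biodegradability": ObjectiveProperties(minimize=False),
    },
)

params, trial_index = ax_client.get_next_trial()
# x5 is computed outside of Ax, so it never appears in the search space
params["x5"] = 1.0 - sum(params[f"x{i}"] for i in range(1, 5))
raw_data = evaluate(params)  # hypothetical objective function
ax_client.complete_trial(trial_index=trial_index, raw_data=raw_data)
```

With this setup, `get_trials_data_frame()` will only ever show x1 through x4, and the composition constraint is satisfied by construction.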
Great! I really like the tutorials with examples from the materials field, thanks a lot!
Hi,
Hi @henkjanvanmanen, I just finished updating the tutorial (sorry for the delay). This should now be self-consistent. Could you have a look and let me know if this makes sense now?
@henkjanvanmanen, facebook/Ax#727 might also provide some helpful context.
Hi @sgbaird, thanks for updating the MOBO tutorial. Looking at the updated notebook, the `ax_client` instance now never sees/uses the x5 parameter, correct? I ran the updated notebook both in Colab and in my local environment (both with the same AxClient `random_seed` of 12345). The output from Colab is shown below (the output from my local environment is quite similar). What I notice is that the location of the Pareto front differs from the image shown on the Honegumi site: the Bayesian optimization now finds lower biodegradability values (max. around 11) compared to before (max. about 16). What could explain this? And do you see this as well?
Since the search space changed, the optimization won't be identical. Also, it's a single search campaign, so there's stochasticity in the performance. If you want to dig in further, it would probably be best to do two things:

1. Run repeat campaigns with different random seeds to average out the run-to-run variation (see the sketch below).
2. Compare against a large quasi-random (Sobol) sampling baseline.

This gets into the topic of robust benchmarking. Given that the objective function is made up, I'm not sure it's worth spending a lot of time on this specific task.
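A rough sketch of the repeat-campaign idea, where `run_campaign` is a hypothetical helper (not in the tutorial) that rebuilds the AxClient with the given seed, runs the 35 trials, and returns, e.g., the best biodegradability found:

```python
import numpy as np

# hypothetical helper; each call runs one full 35-trial campaign
best_values = [run_campaign(seed=s) for s in range(10)]
print(f"best biodegradability: {np.mean(best_values):.2f} +/- {np.std(best_values):.2f}")
```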
Large Sobol sampling via Ax example: https://github.com/sparks-baird/matsci-opt-benchmarks/blob/main/scripts%2Fcrabnet_hyperparameter%2Fcrabnet_hyperparameter_submitit.py#L75-L78 |
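A simplified sketch of a Sobol-only baseline via Ax's Service API, along the lines of the linked script (the experiment setup itself is assumed to match the tutorial's):

```python
from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models
from ax.service.ax_client import AxClient

# Sobol-only generation strategy; num_trials=-1 means this step never ends
gs = GenerationStrategy(
    steps=[
        GenerationStep(model=Models.SOBOL, num_trials=-1, model_kwargs={"seed": 12345})
    ]
)
ax_client = AxClient(generation_strategy=gs)
# ...create_experiment(...) as in the tutorial, then request/complete as many
# trials as desired (e.g., thousands) to estimate the achievable front.
```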
Hello,
I like the Honegumi philosophy and decided to try and reproduce the results from the tutorial *Multi-Objective Optimization of Polymers for Strength and Biodegradability* using the Google Colab link (mobo.ipynb) provided on the Honegumi website.
When running `ax_client.get_trials_data_frame()` after the 35 trials (Sobol and BoTorch) of the tutorial and inspecting this dataframe, I noticed that the composition constraint (the sum of the x1-x5 feature values should equal 1.0) was not satisfied for the large majority of trials.
Can someone confirm that this is indeed the case?
Could there be an issue with the way the composition constraint is currently defined/coded?
Thanks for any feedback/help!
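For reference, the check described above can be done with a one-liner on the trials dataframe. The column names x1..x5 assume the original (pre-fix) notebook, where all five fractions were Ax parameters:

```python
# sum the five composition fractions per trial and inspect the distribution
df = ax_client.get_trials_data_frame()
sums = df[[f"x{i}" for i in range(1, 6)]].sum(axis=1)
print(sums.describe())  # each row should sum to (numerically close to) 1.0
```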