[DEV] Adding julia solvers: GD, SGD, SAGA #1
base: main
Conversation
@ngazagna can you show the figures you obtain here? |
Thanks a lot @ngazagna for moving the PR! 😃 I will take a look ASAP. It seems that the tests are not run on PRs; we also need to fix the CI. |
It seems there is a bug, as the objective function should reach the same values. |
There is indeed a problem with the stopping criterion, where pg_i is the i-th component of the projected gradient. EDIT: I guess this post is relevant. |
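For context, a minimal sketch of one common criterion of this kind (an assumption on my part, since the original snippet was lost from the thread): declare convergence when the largest projected-gradient component falls below the tolerance.

```python
# Hedged illustration, not the actual benchopt code: stop when
# max_i |pg_i| <= tol, with pg the projected gradient at the iterate.
import numpy as np

def has_converged(pg, tol):
    # sup-norm of the projected gradient against the tolerance
    return np.max(np.abs(pg)) <= tol
```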
It does not explain why it does not reach the same optimal value: the tol values are sampled independently for all solvers. |
Could you push a CI-trigger commit to make the tests run on the PR? I guess Alex is referring to the second plot, where GD/SGD/SAGA seem to plateau at different values. |
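To make the complaint concrete, here is a hedged sketch (a hypothetical helper, not the benchopt test suite) of the property being discussed: every solver's objective curve should bottom out at the same value, up to a tolerance.

```python
# Hypothetical check that all solvers plateau at the same objective
# value. `curves` maps a solver name to its array of objective values.
import numpy as np

def plateau_values_agree(curves, rtol=1e-6):
    finals = {name: float(np.min(vals)) for name, vals in curves.items()}
    best = min(finals.values())
    # True for the solvers that reach the best value up to rtol
    return {name: np.isclose(val, best, rtol=rtol)
            for name, val in finals.items()}
```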
@ngazagna I just did an update to fix the failing test, so don't forget to pull it. I will try to run the bench and see if I can figure out why this does not seem to converge toward the same value. |
Relies on benchopt/benchopt#78 for the Julia dependencies and … |
It seems there is no more technical bug :) Now, the failing test is because SGD does not find the optimal point (which is expected). Maybe we should set some flag to detect stochastic solvers and skip this test? |
Don't try to be too smart. If you really want this, I would add an attribute to the solver class saying whether the test can be skipped. |
That was my idea of a flag: something like `approximate_solver` that skips the optimality test. |
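A minimal sketch of what that flag could look like (the attribute name and the exact import path are assumptions; this is not the merged benchopt API):

```python
# Hypothetical sketch: a class attribute that the benchopt test suite
# could inspect to skip the exact-optimality check for stochastic solvers.
from benchopt import BaseSolver  # exact import path may differ by version


class Solver(BaseSolver):
    name = 'SGD'
    # Assumed flag (not an official benchopt attribute in this thread):
    # mark the solver as only approximately reaching the optimum.
    approximate_solver = True
```

The test would then look up something like `getattr(solver_class, 'approximate_solver', False)` before asserting that the final objective matches the reference value.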
Sorry, I did not get that Alex was pointing out an issue related to SGD/SAGA. I guess we will face this problem as soon as algorithms need parameter tuning. The other (expensive) option is to do an extensive grid search (instead of the theoretical step size used here): try and retry until SGD converges. |
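For illustration, a hedged sketch of that expensive option (all names here are hypothetical, not benchopt API): rerun SGD over a grid of step sizes and keep the one with the best final objective.

```python
# Hypothetical grid search over the SGD step size instead of the
# theoretical one. `run_sgd` and `objective` are assumed callables
# supplied by the benchmark, not real benchopt functions.
import numpy as np

def best_step_size(run_sgd, objective, step_sizes, n_iter=10_000):
    best_step, best_obj = None, np.inf
    for step in step_sizes:
        w = run_sgd(step_size=step, n_iter=n_iter)  # final iterate
        obj = objective(w)
        if obj < best_obj:
            best_step, best_obj = step, obj
    return best_step, best_obj

# e.g. a log-spaced grid around the theoretical step 1 / L:
# candidates = np.logspace(-2, 2, 9) / L
```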
Forget about SGD. It is not clear that Julia-SAGA converges either. Maybe there are not enough samples for this benchmark? |
Co-authored-by: tomMoral <[email protected]>