Expose recorder function for DynamicAutodiff.jl #377
Conversation
Benchmark Results
Benchmark Plots: A plot of the benchmark results has been uploaded as an artifact to the workflow run for this PR.

Pull Request Test Coverage Report for Build 12223650046

Warning: This coverage report may be inaccurate. This pull request's base commit is no longer the HEAD commit of its target branch. This means it includes changes from outside the original pull request, including, potentially, unrelated coverage changes.

Details

💛 - Coveralls
Turns out Zygote.jl is awful at second-order differentiation, even for tiny operators.

Is there any reason to use Enzyme or Mooncake for this?
It is probably overkill, as all I'm doing is:

```julia
function (d::OperatorDerivative{F,1,1})(x) where {F}
    return ForwardDiff.derivative(d.op, x)
end
function (d::OperatorDerivative{F,2,1})(x, y) where {F}
    return ForwardDiff.derivative(Fix{2}(d.op, y), x)
end
function (d::OperatorDerivative{F,2,2})(x, y) where {F}
    return ForwardDiff.derivative(Fix{1}(d.op, x), y)
end
```

This is nice because we can stack it as many times as we want to get higher-order derivatives (though many packages struggle with complex higher-order differentiation).

I've had a lot of issues getting Enzyme into SR (https://github.com/EnzymeAD/Enzyme.jl/issues?q=is:issue%20author:MilesCranmer – though I have appreciated the support from the authors), so I want to avoid making it a direct dependency until it is more stable. Mooncake I haven't tried yet, but the README says it is still under development, so I probably want to avoid it too. On the other hand, ForwardDiff.jl is already an indirect dependency via Optim.jl, so I feel pretty safe about using it. What do you think?
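To illustrate the stacking idea above, here is a minimal sketch (the `deriv` helper is hypothetical, not part of this PR): wrapping a unary operator in `ForwardDiff.derivative` and then wrapping the result again gives the second derivative.

```julia
using ForwardDiff

# Hypothetical helper: wrap a unary operator in its first derivative.
deriv(f) = x -> ForwardDiff.derivative(f, x)

df = deriv(sin)    # first derivative: behaves like cos
ddf = deriv(df)    # second derivative: behaves like -sin

df(0.0)   # ≈ 1.0  (cos(0))
ddf(0.0)  # ≈ 0.0  (-sin(0))
```

ForwardDiff handles the nesting via its dual-number tags, which is why this composes cleanly where reverse-mode tools often struggle.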
That's totally true, yea
[Diff since v1.1.0](v1.1.0...v1.2.0)

**Merged pull requests:**
- fix: add missing `condition_mutation_weights!` to fix #378 (#379) (@MilesCranmer)

**Closed issues:**
- [BUG]: `nested_constraints` incompatible with `TemplateExpression` (#378)
Update: since the derivatives are general to any DynamicExpressions.jl expression, I'm going to register a separate package, DynamicAutodiff.jl, and then import that into SymbolicRegression.jl.
This makes it so you can put derivative operators in the template structure when doing structured searches. This works with a derivative operator `D` that computes the derivative with respect to the *i*-th argument, and it can be nested arbitrarily to get higher-order derivatives. However, note that nesting it multiple times can take a while to compile, since it will compile n^d operators for d-th order derivatives. (Runtime performance should be unaffected.) Here's an example:
This imposes the structure that, for some symbolic $f$ and $g$, the evaluation uses $$f(x, y) - \frac{\partial f(x, y)}{\partial x} + g(y)$$
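To make the imposed structure concrete, here is a hedged sketch with hand-picked `f` and `g` standing in for whatever symbolic expressions the search would find, with the partial derivative computed via ForwardDiff (these definitions are illustrative, not the package's API):

```julia
using ForwardDiff

# Placeholder expressions for the symbolic f and g:
f(x, y) = x^2 * y
g(y) = y^3

# Derivative of f with respect to its 1st argument, mimicking D(f, 1):
df_dx(x, y) = ForwardDiff.derivative(t -> f(t, y), x)

# The template structure: f(x, y) - ∂f/∂x + g(y)
structure(x, y) = f(x, y) - df_dx(x, y) + g(y)

# f(2, 3) = 12, ∂f/∂x at (2, 3) = 2·2·3 = 12, g(3) = 27
structure(2.0, 3.0)  # 12 - 12 + 27 = 27.0
```

The same pattern nests: replacing `f` inside `df_dx` with another derivative closure yields higher-order terms in the structure.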
This also helps support the feature requests from @gm89uk and @DeaglanBartlett. Interested to hear whether you run into any issues with it before I merge!