
[BUG]: Cannot predict with MultitargetSRRegressor, ERROR: MethodError: no method matching zero(::@NamedTuple{a::Vector{Float64}, b::Vector{Float64}, c::Vector{Float64}}) The function zero exists, but no method is defined for this combination of argument types. #383

Closed
Klipp-Linding-Lab opened this issue Dec 10, 2024 · 3 comments
Labels: bug (Something isn't working)

@Klipp-Linding-Lab

What happened?

Dear all, we cannot execute this code, nor our own code written for a similar purpose. The example below is copied verbatim from the API documentation.

using MLJ
MultitargetSRRegressor = @load MultitargetSRRegressor pkg=SymbolicRegression
X = (a=rand(100), b=rand(100), c=rand(100))
Y = (y1=(@. cos(X.c) * 2.1 - 0.9), y2=(@. X.a * X.b + X.c))
model = MultitargetSRRegressor(binary_operators=[+, -, *], unary_operators=[exp], niterations=100)
mach = machine(model, X, Y)
fit!(mach)
y_hat = predict(mach, X)

View the equations used:

r = report(mach)
for (output_index, (eq, i)) in enumerate(zip(r.equation_strings, r.best_idx))
println("Equation used for ", output_index, ": ", eq[i])
end

Version

1.11.2

Operating System

Linux

Interface

Julia REPL

Relevant log output

julia> y_hat = predict(mach, X)
       # View the equations used:
ERROR: MethodError: no method matching zero(::@NamedTuple{a::Vector{Float64}, b::Vector{Float64}, c::Vector{Float64}})
The function `zero` exists, but no method is defined for this combination of argument types.

Closest candidates are:
  zero(::Type{Union{}}, Any...)
   @ Base number.jl:310
  zero(::Type{MutableArithmetics.Zero})
   @ MutableArithmetics ~/.julia/packages/MutableArithmetics/BLlgj/src/rewrite.jl:35
  zero(::Type{Dates.Date})
   @ Dates /usr/share/julia/stdlib/v1.11/Dates/src/types.jl:459
  ...

Stacktrace:
 [1] alg_cache(alg::Vern7{…}, u::@NamedTuple{…}, rate_prototype::@NamedTuple{…}, ::Type{…}, ::Type{…}, ::Type{…}, uprev::@NamedTuple{…}, uprev2::@NamedTuple{…}, f::ODEFunction{…}, t::Float64, dt::Float64, reltol::Float64, p::Machine{…}, calck::Bool, ::Val{…})
   @ OrdinaryDiffEqVerner ~/.julia/packages/OrdinaryDiffEqVerner/m0ylj/src/verner_caches.jl:96
 [2] __init(prob::ODEProblem{…}, alg::Vern7{…}, timeseries_init::Tuple{}, ts_init::Tuple{}, ks_init::Tuple{}, recompile::Type{…}; saveat::Vector{…}, tstops::Tuple{}, d_discontinuities::Tuple{}, save_idxs::Nothing, save_everystep::Bool, save_on::Bool, save_start::Bool, save_end::Nothing, callback::Nothing, dense::Bool, calck::Bool, dt::Float64, dtmin::Float64, dtmax::Float64, force_dtmin::Bool, adaptive::Bool, gamma::Rational{…}, abstol::Float64, reltol::Float64, qmin::Rational{…}, qmax::Int64, qsteady_min::Int64, qsteady_max::Int64, beta1::Nothing, beta2::Nothing, qoldinit::Rational{…}, controller::Nothing, fullnormalize::Bool, failfactor::Int64, maxiters::Int64, internalnorm::typeof(DiffEqBase.ODE_DEFAULT_NORM), internalopnorm::typeof(opnorm), isoutofdomain::typeof(DiffEqBase.ODE_DEFAULT_ISOUTOFDOMAIN), unstable_check::typeof(DiffEqBase.ODE_DEFAULT_UNSTABLE_CHECK), verbose::Bool, timeseries_errors::Bool, dense_errors::Bool, advance_to_tstop::Bool, stop_at_next_tstop::Bool, initialize_save::Bool, progress::Bool, progress_steps::Int64, progress_name::String, progress_message::typeof(DiffEqBase.ODE_DEFAULT_PROG_MESSAGE), progress_id::Symbol, userdata::Nothing, allow_extrapolation::Bool, initialize_integrator::Bool, alias_u0::Bool, alias_du0::Bool, initializealg::OrdinaryDiffEqCore.DefaultInit, kwargs::@Kwargs{})
   @ OrdinaryDiffEqCore ~/.julia/packages/OrdinaryDiffEqCore/H25Bn/src/solve.jl:360
 [3] __solve(::ODEProblem{…}, ::Vern7{…}; kwargs::@Kwargs{…})
   @ OrdinaryDiffEqCore ~/.julia/packages/OrdinaryDiffEqCore/H25Bn/src/solve.jl:6
 [4] solve_call(_prob::ODEProblem{…}, args::Vern7{…}; merge_callbacks::Bool, kwargshandle::Nothing, kwargs::@Kwargs{…})
   @ DiffEqBase ~/.julia/packages/DiffEqBase/HW4ge/src/solve.jl:632
 [5] solve_up(prob::ODEProblem{…}, sensealg::QuadratureAdjoint{…}, u0::@NamedTuple{…}, p::Machine{…}, args::Vern7{…}; kwargs::@Kwargs{…})
   @ DiffEqBase ~/.julia/packages/DiffEqBase/HW4ge/src/solve.jl:1120
 [6] solve(prob::ODEProblem{…}, args::Vern7{…}; sensealg::QuadratureAdjoint{…}, u0::Nothing, p::Nothing, wrap::Val{…}, kwargs::@Kwargs{…})
   @ DiffEqBase ~/.julia/packages/DiffEqBase/HW4ge/src/solve.jl:1036
 [7] predict(θ::Machine{…}, X::@NamedTuple{…}, T::Vector{…})
   @ Main ~/JuliaCode/Phospho_work_multi.jl:79
 [8] predict(θ::Machine{…}, X::@NamedTuple{…})
   @ Main ~/JuliaCode/Phospho_work_multi.jl:78
 [9] top-level scope
   @ REPL[57]:1
Some type information was truncated. Use `show(err)` to see complete types.

julia> r = report(mach)
(best_idx = [12, 3],
 equations = Vector{Expression{Float64, DynamicExpressions.NodeModule.Node{Float64}, @NamedTuple{operators::DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(-), typeof(*)}, Tuple{typeof(exp)}}, variable_names::Vector{String}}}}[[0.881348865751391, 1.3756828183029748 - c, (c * -0.9719064318724064) + 1.3617952137309295, (exp(c) * -0.5878680910661597) + 1.8809092039980162, ((c * -0.9809376280537772) * c) + 1.1931951498025708, (((c * -0.906876499286865) - 0.07757180782560917) * c) + 1.2079970263208184, ((c * (-1.0995189493322781 - (c * -0.13000357499675755))) * c) + 1.20103431524694, ((c * c) * (-1.1287057437085772 - (exp(c) * -0.05985652480674373))) + 1.2003258712015616, ((c * (-1.048886141802872 - (c * (c * -0.0838688737571194)))) * c) + 1.1999494876771033, ((c * (-1.0519066959939267 - (((c * -0.07901736953345984) + -0.007620780644438593) * c))) * c) + 1.2000182742759145, (c * ((-1.0520028270192574 - ((((c * -0.07945100368209045) + -0.007917907493167769) * c) * 0.9926495499429491)) * c)) + 1.2000205805924693, ((c * (-1.0495894647085022 - (c * ((exp(exp(c)) * 0.00024345834052550238) + (c * -0.08785656137392589))))) * c) + 1.1999906355972207, (((c * -1.0467501249516413) - (((exp(c + c) * 0.002105260850929585) + (c * (c * -0.09691448575689704))) * c)) * c) + 1.1999846245675594], [c, c + 0.2483967687211474, c + (a * b)]],
 equation_strings = [["0.881348865751391", "1.3756828183029748 - c", "(c * -0.9719064318724064) + 1.3617952137309295", "(exp(c) * -0.5878680910661597) + 1.8809092039980162", "((c * -0.9809376280537772) * c) + 1.1931951498025708", "(((c * -0.906876499286865) - 0.07757180782560917) * c) + 1.2079970263208184", "((c * (-1.0995189493322781 - (c * -0.13000357499675755))) * c) + 1.20103431524694", "((c * c) * (-1.1287057437085772 - (exp(c) * -0.05985652480674373))) + 1.2003258712015616", "((c * (-1.048886141802872 - (c * (c * -0.0838688737571194)))) * c) + 1.1999494876771033", "((c * (-1.0519066959939267 - (((c * -0.07901736953345984) + -0.007620780644438593) * c))) * c) + 1.2000182742759145", "(c * ((-1.0520028270192574 - ((((c * -0.07945100368209045) + -0.007917907493167769) * c) * 0.9926495499429491)) * c)) + 1.2000205805924693", "((c * (-1.0495894647085022 - (c * ((exp(exp(c)) * 0.00024345834052550238) + (c * -0.08785656137392589))))) * c) + 1.1999906355972207", "(((c * -1.0467501249516413) - (((exp(c + c) * 0.002105260850929585) + (c * (c * -0.09691448575689704))) * c)) * c) + 1.1999846245675594"], ["c", "c + 0.2483967687211474", "c + (a * b)"]],
 losses = [[0.07312460305530863, 0.003716344408720649, 0.0036583028508886477, 0.0007973156443596637, 3.198335137167272e-5, 6.113176053502374e-6, 3.9148436746139914e-7, 2.0797653690360404e-8, 1.9244315475448397e-9, 1.6390605075791748e-10, 1.576166129196469e-10, 4.662930288906626e-11, 4.380522004124344e-11], [0.11494901030186083, 0.053248055568929614, 0.0]],
 complexities = [[1, 3, 5, 6, 7, 9, 11, 12, 13, 15, 17, 19, 22], [1, 3, 5]],
 scores = [[36.04365338911715, 1.4897121889644036, 0.007870580688856953, 1.5235039759207538, 3.216035143211375, 0.8273845236461245, 1.3741280723447382, 2.9351054077812564, 2.380209552843019, 1.2315462503456258, 0.019563909324637165, 0.6089682114621419, 0.020825390521924444], [36.04365338911715, 0.384763676286292, 18.021826694558577]],)

julia> for (output_index, (eq, i)) in enumerate(zip(r.equation_strings, r.best_idx))
           println("Equation used for ", output_index, ": ", eq[i])
       end
Equation used for 1: ((c * (-1.0495894647085022 - (c * ((exp(exp(c)) * 0.00024345834052550238) + (c * -0.08785656137392589))))) * c) + 1.1999906355972207
Equation used for 2: c + (a * b)

Extra Info

Our own code gives:
ERROR: MethodError: no method matching zero(::@NamedTuple{data::@NamedTuple{u1::Vector{…}, u2::Vector{…}, u3::Vector{…}, u4::Vector{…}}, idx::Vector{Int64}})
The function `zero` exists, but no method is defined for this combination of argument types.

Closest candidates are:
zero(::Type{Union{}}, Any...)
@ Base number.jl:310
zero(::Type{MutableArithmetics.Zero})
@ MutableArithmetics ~/.julia/packages/MutableArithmetics/BLlgj/src/rewrite.jl:35
zero(::Type{Dates.Date})
@ Dates /usr/share/julia/stdlib/v1.11/Dates/src/types.jl:459

Even if we try something like:

predict(mach, (data=X1x, idx=[4,4,5,3]))
ERROR: MethodError: no method matching zero(::@NamedTuple{data::@NamedTuple{u1::Vector{…}, u2::Vector{…}, u3::Vector{…}, u4::Vector{…}}, idx::Vector{Int64}})
The function zero exists, but no method is defined for this combination of argument types.

Our environment is:
(@v1.11) pkg> status
Status ~/.julia/environments/v1.11/Project.toml
⌃ [8ce10254] Bumper v0.6.0
[b0b7db55] ComponentArrays v0.15.19
[2445eb08] DataDrivenDiffEq v1.5.0
[5b588203] DataDrivenSparse v0.1.2
[0c46a032] DifferentialEquations v7.15.0
[98e50ef6] JuliaFormatter v1.0.62
[23fbe1c1] Latexify v0.16.5
[d3d80556] LineSearches v7.3.0
[b2108857] Lux v1.4.1
[add582a8] MLJ v0.20.7
[961ee093] ModelingToolkit v9.57.0
[7f7a1694] Optimization v4.0.5
[36348300] OptimizationOptimJL v0.4.1
[42dfb2eb] OptimizationOptimisers v0.3.6
[1dea7af3] OrdinaryDiffEq v6.90.1
[91a5bcdd] Plots v1.40.9
[1ed8b502] SciMLSensitivity v7.71.2
[860ef19b] StableRNGs v1.0.2
[8254be44] SymbolicRegression v1.3.0
[d1185830] SymbolicUtils v3.7.2
[e88e6eb3] Zygote v0.6.73

@Klipp-Linding-Lab added the bug label on Dec 10, 2024
@MilesCranmer
Owner

It looks like you defined your own function named predict in your script? The predict that SymbolicRegression.jl uses comes from MLJ. Your stack trace contains OrdinaryDiffEqCore frames, which SymbolicRegression.jl does not depend on, and the innermost predict frames point at your own file (~/JuliaCode/Phospho_work_multi.jl), so the call is hitting your local predict rather than MLJ's.
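To illustrate the diagnosis above, here is a minimal sketch of the name clash (the function bodies are hypothetical, not the reporter's actual code): a `predict` defined in a script creates `Main.predict`, and unqualified calls then resolve to it instead of MLJ's generic.

```julia
# Hypothetical script-local helper, mimicking the shadowing in the report.
# Defining this creates Main.predict:
predict(θ, X, T) = "script's own predict"   # e.g. an ODE-solving helper
predict(θ, X)    = predict(θ, X, 0:0.1:1.0)

using MLJ   # MLJ also exports `predict`; the local definition shadows it here

# predict(mach, X)      # dispatches to Main.predict -> MethodError on Machine
# MLJ.predict(mach, X)  # module-qualified call reaches MLJ's generic
```

Renaming the script's helper, or always qualifying the call as `MLJ.predict(mach, X)`, avoids the clash.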

@Klipp-Linding-Lab
Author

Dear Miles

Of course; how embarrassing ;) This explains a lot. Many thanks, and sorry for the bug report.

best wishes

@MilesCranmer
Owner

No worries :)

@MilesCranmer closed this as not planned on Dec 12, 2024