Fix CI docs build #822
Conversation
Kept throwing errors when building docs
Files generated when running docs/make.jl
Created during doc building
The Ascher BVDAE example fails on LinearSolve v3.46.0 but works with v3.45.0. MWE script (run as, e.g., `julia mwe.jl 3.46.0`; output in the comment below):

```julia
using Pkg
Pkg.activate(temp = true)
ENV["JULIA_PKG_PRECOMPILE_AUTO"] = false
Pkg.add(name = "LinearSolve", version = ARGS[1])
Pkg.add("BoundaryValueDiffEq")

import BoundaryValueDiffEq as BVP

Pkg.status()

# Semi-explicit BVDAE right-hand side; the last equation is algebraic via the mass matrix.
function f!(du, u, p, t)
    e = 2.7
    du[1] = (1 + u[2] - sin(t)) * u[4] + cos(t)
    du[2] = cos(t)
    du[3] = u[4]
    du[4] = (u[1] - sin(t)) * (u[4] - e^t)
end

# Boundary conditions for the three differential variables.
function bc!(res, u, p, t)
    res[1] = u[1]
    res[2] = u[3] - 1
    res[3] = u[2] - sin(1.0)
end

u0 = [0.0, 0.0, 0.0, 0.0]
tspan = (0.0, 1.0)

fun = BVP.BVPFunction(f!, bc!, mass_matrix = [1 0 0 0; 0 1 0 0; 0 0 1 0; 0 0 0 0])
prob = BVP.BVProblem(fun, u0, tspan)
sol = BVP.solve(prob,
    BVP.Ascher4(; zeta = [0.0, 0.0, 1.0],
        jac_alg = BVP.BVPJacobianAlgorithm(BVP.AutoForwardDiff()));
    dt = 0.01)
```
The problems from DiffEqProblemLibrary.jl appear to be the older ones; I think a new release is needed for the split libraries (ODEProblemLibrary, SDEProblemLibrary, DDEProblemLibrary, DAEProblemLibrary, BVProblemLibrary).
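For reference, a minimal sketch of checking what the docs manifest currently resolves for those libraries and trying to bump them; the `docs` path and the assumption that they are direct dependencies of the docs environment are mine:

```julia
using Pkg

# Assumption: the docs environment lives in docs/ and lists these libraries directly.
Pkg.activate("docs")

libs = ["ODEProblemLibrary", "SDEProblemLibrary", "DDEProblemLibrary",
        "DAEProblemLibrary", "BVProblemLibrary"]

Pkg.status(libs)   # currently resolved versions
Pkg.update(libs)   # try to move them to their latest registered releases
```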
What's the error? We should make the docs build fail on error.
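A sketch of one way to do that, assuming the docs are built with Documenter.jl 1.x (where failing on errors is the default unless relaxed); `sitename` here is a placeholder, and the real docs/make.jl carries its own modules/pages/format settings:

```julia
using Documenter

makedocs(;
    sitename = "SciMLDocs",  # placeholder
    warnonly = false,        # do not downgrade doctest/@example/cross-reference errors to warnings
)
```

On the older Documenter 0.27 series, the equivalent switch was `strict = true`.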
Running with 3.46.0 gives:

```
$ julia mwe.jl 3.46.0
Activating new project at `/tmp/jl_eW8Fpj`
Resolving package versions...
Installed LinearSolve ─ v3.46.0
Updating `/tmp/jl_eW8Fpj/Project.toml`
⌃ [7ed4a6bd] + LinearSolve v3.46.0
Updating `/tmp/jl_eW8Fpj/Manifest.toml`
[... Pkg output ...]
Resolving package versions...
Updating `/tmp/jl_eW8Fpj/Project.toml`
[764a87c0] + BoundaryValueDiffEq v5.18.0
Updating `/tmp/jl_eW8Fpj/Manifest.toml`
[... Pkg output ...]
Precompiling BoundaryValueDiffEq finished.
15 dependencies successfully precompiled in 425 seconds. 198 already precompiled.
Status `/tmp/jl_eW8Fpj/Project.toml`
[764a87c0] BoundaryValueDiffEq v5.18.0
⌃ [7ed4a6bd] LinearSolve v3.46.0
Info Packages marked with ⌃ have new versions available and may be upgradable.
┌ Error: BLAS/LAPACK dgetrf: Matrix is singular
│ Details: U(1,1) is exactly zero. The factorization has been completed, but U is singular and division by U will produce infinity.
│ Return code (info): 1
│ matrix_size: (303, 303)
│ rhs_type: Vector{Float64}
│ rhs_size: (303,)
│ memory_usage_MB: 0.7
│ matrix_type: Matrix{Float64}
│ element_type: Float64
└ @ LinearSolve ~/.julia/packages/LinearSolve/V1Dfx/src/mkl.jl:267
ERROR: LoadError: BLAS/LAPACK dgetrf: Matrix is singular
Details: U(1,1) is exactly zero. The factorization has been completed, but U is singular and division by U will produce infinity.
Return code (info): 1
matrix_size: (303, 303)
rhs_type: Vector{Float64}
rhs_size: (303,)
memory_usage_MB: 0.7
matrix_type: Matrix{Float64}
element_type: Float64
Stacktrace:
[1] emit_message(message::String, level::Base.CoreLogging.LogLevel, file::String, line::Int64, _module::Module)
@ SciMLLogging ~/.julia/packages/SciMLLogging/3uoc4/src/utils.jl:57
[2] macro expansion
@ ~/.julia/packages/SciMLLogging/3uoc4/src/utils.jl:147 [inlined]
[3] solve!(cache::LinearSolve.LinearCache{Matrix{Float64}, Vector{Float64}, Vector{Float64}, SciMLBase.NullParameters, LinearSolve.DefaultLinearSolver, [types elided]...)
@ LinearSolve ~/.julia/packages/LinearSolve/V1Dfx/src/mkl.jl:267
[4] solve!
@ ~/.julia/packages/LinearSolve/V1Dfx/src/mkl.jl:239 [inlined]
[5] macro expansion
@ ~/.julia/packages/LinearSolve/V1Dfx/src/default.jl:514 [inlined]
[6] solve!(::LinearSolve.LinearCache{Matrix{Float64}, Vector{Float64}, Vector{Float64}, SciMLBase.NullParameters, LinearSolve.DefaultLinearSolver, [types elided]...)
@ LinearSolve ~/.julia/packages/LinearSolve/V1Dfx/src/default.jl:503
[7] solve!
@ ~/.julia/packages/LinearSolve/V1Dfx/src/default.jl:503 [inlined]
[8] #solve!#21
@ ~/.julia/packages/LinearSolve/V1Dfx/src/common.jl:441 [inlined]
[9] solve!
@ ~/.julia/packages/LinearSolve/V1Dfx/src/common.jl:440 [inlined]
[10] #_#1
@ ~/.julia/packages/NonlinearSolveBase/mJsVc/ext/NonlinearSolveBaseLinearSolveExt.jl:24 [inlined]
[11] LinearSolveJLCache
@ ~/.julia/packages/NonlinearSolveBase/mJsVc/ext/NonlinearSolveBaseLinearSolveExt.jl:14 [inlined]
[12] solve!(cache::NonlinearSolveBase.NewtonDescentCache{Vector{Float64}, Nothing, [types elided]...)
@ NonlinearSolveBase ~/.julia/packages/NonlinearSolveBase/mJsVc/src/descent/newton.jl:119
[13] solve! (repeats 2 times)
@ ~/.julia/packages/NonlinearSolveBase/mJsVc/src/descent/newton.jl:94 [inlined]
[14] step!(cache::NonlinearSolveFirstOrder.GeneralizedFirstOrderAlgorithmCache{Vector{Float64}, Vector{Float64}, Vector{Float64}, SciMLBase.NullParameters, [types elided]...)
@ NonlinearSolveFirstOrder ~/.julia/packages/NonlinearSolveFirstOrder/inFag/src/solve.jl:266
[15] step!
@ ~/.julia/packages/NonlinearSolveFirstOrder/inFag/src/solve.jl:244 [inlined]
[16] #step!#180
@ ~/.julia/packages/NonlinearSolveBase/mJsVc/src/solve.jl:559 [inlined]
[17] step!
@ ~/.julia/packages/NonlinearSolveBase/mJsVc/src/solve.jl:553 [inlined]
[18] solve!(cache::NonlinearSolveFirstOrder.GeneralizedFirstOrderAlgorithmCache{Vector{Float64}, Vector{Float64}, Vector{Float64}, SciMLBase.NullParameters, [types elided]...)
@ NonlinearSolveBase ~/.julia/packages/NonlinearSolveBase/mJsVc/src/solve.jl:287
[19] #__solve#155
@ ~/.julia/packages/NonlinearSolveBase/mJsVc/src/solve.jl:273 [inlined]
[20] __solve
@ ~/.julia/packages/NonlinearSolveBase/mJsVc/src/solve.jl:270 [inlined]
[21] macro expansion
@ ~/.julia/packages/NonlinearSolveBase/mJsVc/src/solve.jl:478 [inlined]
[22] __generated_polysolve(::SciMLBase.NonlinearProblem{Vector{Float64}, true, SciMLBase.NullParameters, [types elided]...)
@ NonlinearSolveBase ~/.julia/packages/NonlinearSolveBase/mJsVc/src/solve.jl:423
[23] __generated_polysolve
@ ~/.julia/packages/NonlinearSolveBase/mJsVc/src/solve.jl:423 [inlined]
[24] #__solve#166
@ ~/.julia/packages/NonlinearSolveBase/mJsVc/src/solve.jl:396 [inlined]
[25] __perform_ascher_iteration(cache::BoundaryValueDiffEqAscher.AscherCache{true, Float64, SciMLBase.BVProblem{Vector{Float64}, Tuple{Float64, Float64}, [types elided]...)
@ BoundaryValueDiffEqAscher ~/.julia/packages/BoundaryValueDiffEqAscher/CPeel/src/ascher.jl:162
[27] #__solve#34
@ ~/.julia/packages/BoundaryValueDiffEqCore/hFduL/src/BoundaryValueDiffEqCore.jl:38 [inlined]
[28] __solve
@ ~/.julia/packages/BoundaryValueDiffEqCore/hFduL/src/BoundaryValueDiffEqCore.jl:35 [inlined]
[29] #solve_call#23
@ ~/.julia/packages/DiffEqBase/aB45d/src/solve.jl:127 [inlined]
[30] solve_call
@ ~/.julia/packages/DiffEqBase/aB45d/src/solve.jl:84 [inlined]
[31] solve_up(prob::SciMLBase.BVProblem{Vector{Float64}, Tuple{Float64, Float64}, true, Nothing, SciMLBase.NullParameters, SciMLBase.BVPFunction{true, [types elided]...)
@ DiffEqBase ~/.julia/packages/DiffEqBase/aB45d/src/solve.jl:563
[32] solve_up
@ ~/.julia/packages/DiffEqBase/aB45d/src/solve.jl:540 [inlined]
[33] #solve#29
@ ~/.julia/packages/DiffEqBase/aB45d/src/solve.jl:530 [inlined]
[34] top-level scope
@ ~/mwe.jl:25
[35] include(mod::Module, _path::String)
@ Base ./Base.jl:306
[36] exec_options(opts::Base.JLOptions)
@ Base ./client.jl:317
[37] _start()
@ Base ./client.jl:550
in expression starting at mwe.jl:25
```
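For context on what that error means at the LinearSolve level (this sketch is mine, not from the MWE): the log shows the default dense LU path (`dgetrf`) hitting an exactly singular matrix inside the Ascher Newton iteration. A standalone way to see how a given LinearSolve version reports a singular system, independent of the BVP machinery:

```julia
using LinearSolve

# Deliberately rank-deficient 2x2 system.
A = [1.0 2.0;
     2.0 4.0]
b = [1.0, 2.0]

prob = LinearProblem(A, b)
try
    sol = solve(prob)  # default algorithm choice
    @show sol.retcode  # versions that don't throw should report a failure return code here
catch err
    @show err          # versions that throw (as in the log above) land here instead
end
```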
Created SciML/LinearSolve.jl#830 for the failing BVP example.
A local build was successful, and the LinearSolve.jl issue (SciML/LinearSolve.jl#830) got resolved... somehow, so I'm marking this as ready for review 👍
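For anyone else verifying locally, a sketch of a minimal local docs build; the repository layout (docs/Project.toml plus docs/make.jl at the root) is an assumption:

```julia
using Pkg

# Run from the repository root.
Pkg.activate("docs")
Pkg.instantiate()

# Errors raised here should mirror what the CI docs build reports.
include(joinpath("docs", "make.jl"))
```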
Checklist
Additional context
Fixes errors in the Buildkite CI docs build.
Currently in draft mode: the example in tutorials/bvp_example.md (Ascher method for solving BVDAEs) fails on the newest LinearSolve.jl (v3.47.0), but when pinned to LinearSolve.jl v3.40.0 it does produce results. Not sure what the underlying issue is yet.
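If the LinearSolve regression resurfaces, one possible stopgap is to pin LinearSolve in the docs environment until the upstream fix lands. This is a sketch: the exact bound (v3.45.0, the version the MWE above succeeded with) and the assumption that LinearSolve is a direct dependency of the docs project are mine.

```julia
using Pkg

Pkg.activate("docs")                  # assumes the docs environment lives in docs/
Pkg.compat("LinearSolve", "=3.45.0")  # pin to the last version known to work for the Ascher example
Pkg.resolve()
```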