
@blegat (Member) commented Nov 27, 2025

I did this some months ago in an attempt to see if we could get allocation-free iterations after jump-dev/MathOptInterface.jl#2740.
Before this PR, Ipopt.jl relies on calling MOI.eval_objective(model, x), which accesses model.nlp_data.evaluator. That access is type-unstable because the type of the evaluator field is not concrete:
https://github.com/jump-dev/MathOptInterface.jl/blob/034a5272dad29a233783cc49ed7e15195450dd96/src/nlp.jl#L176
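To illustrate the problem, here is a minimal sketch using hypothetical `AbstractEvaluator`/`MyEvaluator`/`Model` types (not MOI's actual structs) of why a non-concrete field type forces dynamic dispatch on every call:

```julia
# Hypothetical stand-ins for MOI's evaluator hierarchy.
abstract type AbstractEvaluator end
struct MyEvaluator <: AbstractEvaluator end
eval_objective(::MyEvaluator, x) = sum(x)

mutable struct Model
    evaluator::AbstractEvaluator  # abstract field type, like nlp_data.evaluator
end

# Julia cannot infer the concrete evaluator type from the field, so this
# call is a dynamic dispatch (and may allocate) on every iteration:
unstable_eval(model::Model, x) = eval_objective(model.evaluator, x)

model = Model(MyEvaluator())
unstable_eval(model, [1.0, 2.0])  # returns 3.0, but via dynamic dispatch
```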
So the fix in this PR is to give Ipopt closures that capture the evaluator, not the model.
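A minimal sketch of that closure fix, again with hypothetical types rather than MOI's real API: a function barrier receives the evaluator as an argument, so the returned closure captures it at its concrete type, and later calls dispatch statically.

```julia
# Hypothetical stand-ins for MOI's evaluator hierarchy.
abstract type AbstractEvaluator end
struct MyEvaluator <: AbstractEvaluator end
eval_objective(::MyEvaluator, x) = sum(x)

mutable struct Model
    evaluator::AbstractEvaluator  # abstract field, as in the model
end

# Function barrier: `evaluator` arrives here with its concrete type, so the
# closure specializes on typeof(evaluator).
make_eval_f(evaluator) = x -> eval_objective(evaluator, x)

model = Model(MyEvaluator())
# One dynamic dispatch here at setup time, none per iteration afterwards:
eval_f = make_eval_f(model.evaluator)
eval_f([1.0, 2.0])  # returns 3.0
```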
Even with that fix, we're still not allocation-free, because there is a type instability in accessing these fields:

eval_f::Function
eval_g::Function
eval_grad_f::Function
eval_jac_g::Function

We can add the types of these functions as parameters to IpoptProblem, but then we need to modify this type-assert:
prob = unsafe_pointer_to_objref(user_data)::IpoptProblem

and I'm not sure how to make that work with @ccallable:

Ipopt.jl/src/C_wrapper.jl, lines 190 to 191 at 077843b:

```julia
eval_f_cb = @cfunction(
    _Eval_F_CB,
```
As discussed during the JuMP-dev hackathon, maybe #514 could help.

