Use measures that are not of the form `f(y, yhat)` but `f(fitresult)` #202
There is the possibility of using a custom selection heuristic.
Thanks for responding!
Sorry, I may not have been clear enough there. I meant internal optimization in the sense of candidate-model fitting (i.e. model-parameter optimization; e.g. fitting an NN's parameters to the data using backprop, just in my case an Evolutionary Algorithm is used instead of gradient descent).
Thank you, I had somehow missed that. As I understand it, the default selection heuristic covers my simpler use case, whereas more sophisticated strategies would require more work. I will probably stick with the simpler route for now. I'll look into this a bit more and then comment on what I find out.
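To make the "selection heuristic" idea concrete: conceptually, it is a rule that maps the tuning history to the winning candidate. The following is a hypothetical plain-Julia sketch, not the MLJTuning API; the `HistoryEntry` struct and its field names are made up for illustration.

```julia
# Hypothetical sketch (not the MLJTuning API): a selection heuristic is
# essentially a function from the tuning history to the best candidate.
# Each history entry is assumed to carry the candidate's id, its resampled
# measurement, and extra information recorded during fitting.

struct HistoryEntry
    model_id::Int        # stands in for the candidate model
    measurement::Float64 # the usual f(y, yhat)-style loss
    inner_cost::Float64  # e.g. best cost achieved by an inner optimizer
end

# Select by the inner optimizer's cost instead of the resampled measurement:
best_by_inner_cost(history) = argmin(entry -> entry.inner_cost, history)

history = [
    HistoryEntry(1, 0.30, 12.5),
    HistoryEntry(2, 0.25, 9.1),
    HistoryEntry(3, 0.28, 10.7),
]

best = best_by_inner_cost(history)
println(best.model_id)  # → 2
```

The point is that swapping the selection criterion only requires changing the function passed over the history, not the resampling machinery itself.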
I was able to adjust this using MLJ:
```julia
using MLJ  # also requires the DecisionTree interface package to be installed

DTRegressor = @load DecisionTreeRegressor pkg = DecisionTree verbosity = 0

N = 300
X, y = rand(N, 3), rand(N)
X = MLJ.table(X)

model = DTRegressor()

space = [
    range(model, :max_depth; lower=1, upper=5),
    range(
        model,
        :min_samples_split;
        lower=ceil(0.001 * N),
        upper=ceil(0.05 * N),
    ),
]

# Callback extracting extra information from each candidate's fitted parameters.
function myextra(model, fparams)
    # fparams = fitted_params(fitted_params(resampling_machine).machine)
    return fparams.tree
end

modelt = TunedModel(;
    model=model,
    resampling=CV(; nfolds=3),
    tuning=LatinHypercube(; gens=30),
    range=space,
    measure=mae,
    n=2,
    userextras=myextra,  # custom keyword from my modified TunedModel, not stock MLJTuning
)

macht = machine(modelt, X, y)
MLJ.fit!(macht; verbosity=1000)

display(report(macht).history[1].userextras)
```

However, I now have the problem that this only yields a single evaluation of the user extras per candidate. I guess I'll have to pass things through differently.
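The "single evaluation per candidate" problem above amounts to wanting the callback invoked once per CV fold and all results collected. The sketch below is a hypothetical illustration in plain Julia of that pattern, not MLJ internals; `cv_with_extras`, `fit_fold`, and `userextra` are invented names.

```julia
# Hypothetical sketch (not MLJ internals): run a user-supplied callback once
# per CV fold and collect every result, instead of keeping only the last one.

function cv_with_extras(fit_fold, userextra, nfolds)
    extras = Vector{Any}(undef, nfolds)
    for k in 1:nfolds
        fitresult = fit_fold(k)           # train on fold k's training split
        extras[k] = userextra(fitresult)  # record extra info for this fold
    end
    return extras
end

# Toy usage: "fitting" returns the fold index squared; the extra is its double.
extras = cv_with_extras(k -> k^2, fr -> 2fr, 3)
println(extras)  # → Any[2, 8, 18]
```

With this shape, the tuning history could store a vector of extras per candidate (one entry per fold) rather than a single value.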
My bad, I just noticed that there already is something like this built in.
Would you be interested in a PR that introduces this? I'd argue that I'm probably not the only one who wants to log and later access additional metrics during optimization. Or is this too niche in your opinion? I'm also fine with keeping this in my personal fork for now if you don't find it useful enough. 🙂 Thanks a lot for your help!
I'd be happy to review such a PR. I don't have a great idea for the name, though.
I think this issue is resolved, albeit through a different means than originally posed. Closing for now. |
Hi, thank you for developing and maintaining this part of MLJ! 🙂

I was wondering how one would go about the following:

I created a custom MLJ model type (let's call it `CustomModel`) which internally uses a custom optimizer (actually, an Evolutionary Algorithm, but this is not important) with a custom objective function `cost`.¹ I'd now like to perform hyperparameter optimization with respect to `cost`. At that, `cost` is computed by the inner optimizer anyway (since it is optimizing for it), and I can make the `fitresult`s of `CustomModel` contain the highest `cost` value achieved by the inner optimizer. What I'd like to do is provide, instead of e.g. `measure=mae`, something like `measure=custom`, where `custom` accepts `fitresult`s (or something along those lines).

I looked mostly into `tuned_models.jl` and `resampling.jl`, and (presumably since I'm not familiar enough with the code) I only saw ways to achieve this that look like a lot of work. Maybe there is another way? So far, it looks to me like writing my own version of `TunedModel` would be less work than changing the existing code. Maybe you can give me a hint as to what I'm missing? Did no one run into this so far?

Thank you for your time!

Footnotes

1. Let's assume that `cost` is actually a decent measure for the inner optimizer. For example, `cost` does not only include predictive performance but also model complexity, and it is therefore not of the form `cost(y, yhat)` but more like `cost(y, yhat, complexity, otherstuff)`. ↩
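To illustrate the distinction the issue is about, here is a hypothetical plain-Julia sketch of the two measure shapes: the usual `f(y, yhat)` kind versus one that reads a value the inner optimizer already stored in the fitresult. The `FitResult` struct and its field names are made up for this example.

```julia
# The usual kind of measure: compares predictions against ground truth.
my_mae(y, yhat) = sum(abs.(y .- yhat)) / length(y)

# The kind asked for here: reads a value the inner optimizer already
# computed and stored in the fitresult (struct and field names are made up).
struct FitResult
    params::Vector{Float64}
    best_cost::Float64  # best objective value found by the inner optimizer
end

cost_measure(fitresult::FitResult) = fitresult.best_cost

fr = FitResult([0.1, 0.2], 3.5)
println(cost_measure(fr))  # → 3.5
```

The second form never touches `y` or `yhat`, which is exactly why it does not fit the `measure=` slot of a stock `TunedModel`.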