[ENH] skforecast integration for time series hyperparameter tuning
#208
base: main
Conversation
src/hyperactive/experiment/integrations/skforecast_forecasting.py
Thanks, great!

- please improve docstrings, see above, and also include defaults
- please add `get_test_params` with sensible settings to test the experiment and the forecaster
- please fix code quality issues, use `pre-commit`
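For illustration, a `get_test_params` along these lines could satisfy the request; the class name and the constructor keys (`regressor`, `lags`, `steps`, `initial_train_size`, `metric`) are assumptions about the experiment's interface, not the actual hyperactive API:

```python
from sklearn.linear_model import LinearRegression


class SkforecastExperimentSketch:
    """Illustrative stand-in for the experiment class under review."""

    @classmethod
    def get_test_params(cls):
        """Return a list of kwargs dicts usable to construct test instances.

        Only sklearn objects and plain values appear here, so no
        skforecast import is needed to collect the test parameters
        (dependency isolation).
        """
        # one small, fast configuration so CI runs stay cheap
        params = {
            "regressor": LinearRegression(),
            "lags": 3,
            "steps": 2,
            "initial_train_size": 10,
            "metric": "mean_absolute_error",
        }
        return [params]
```

Returning a list allows several configurations to be tested later without changing the calling convention.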
skforecast integration for time series hyperparameter tuning
Hi @fkiraly!! Added and tested pre-commit on the changed files, kindly verify this.
fkiraly
left a comment
For the tests to run, you need to add skforecast to the Python environment - I would add it to `sktime-integration` in `pyproject.toml`, that might be easiest.
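As a sketch, the change could look like the fragment below; the exact extra name and the existing contents of the repository's `pyproject.toml` may differ, so this is illustrative only:

```toml
[project.optional-dependencies]
# hypothetical depset; merge with whatever the extra already lists
sktime-integration = [
    "sktime",
    "skforecast",
]
```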
Hi @fkiraly!! Committed the changes as you suggested.
Hi,
Hi @JoaquinAmatRodrigo @fkiraly!!!
It needs to be triggered by one of the repository's maintainers.
Hi @JoaquinAmatRodrigo, @SimonBlanke, and @fkiraly — I’d appreciate your suggestions on how to proceed here. The issue is that

What I have done so far

1. Fixes for the
fkiraly
left a comment
The integration works, nice!

The remaining issues relate to dependency isolation, you have to do it in two places:

- the `get_test_params` function, see above
- in the tags (I think you also need to add a few others)

Besides this, the "higher is better" property needs to be inferred from the metric and set as a tag in `__init__`.
@JoaquinAmatRodrigo, is there a programmatic way to do this?
We do not have a programmatic strategy for this. So far, all the regression metrics that skforecast allows to be passed as a string are intended to be minimised:

```python
"mean_squared_error": mean_squared_error,
"mean_absolute_error": mean_absolute_error,
"mean_absolute_percentage_error": mean_absolute_percentage_error,
"mean_squared_log_error": mean_squared_log_error,
"mean_absolute_scaled_error": mean_absolute_scaled_error,
"root_mean_squared_scaled_error": root_mean_squared_scaled_error,
"median_absolute_error": median_absolute_error,
"symmetric_mean_absolute_percentage_error": symmetric_mean_absolute_percentage_error,
```

If the user passes a custom function as a metric, they need to indicate whether it is a maximisation or minimisation.
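Since every string metric listed above is loss-style, the "higher is better" tag could be inferred with a simple rule. A minimal sketch, assuming a hypothetical helper name and a `greater_is_better` hint for custom callables (neither is part of the actual skforecast or hyperactive API):

```python
# Metric strings accepted by skforecast; all are loss-style (minimised),
# per the mapping listed above.
_MINIMISED_METRICS = {
    "mean_squared_error",
    "mean_absolute_error",
    "mean_absolute_percentage_error",
    "mean_squared_log_error",
    "mean_absolute_scaled_error",
    "root_mean_squared_scaled_error",
    "median_absolute_error",
    "symmetric_mean_absolute_percentage_error",
}


def infer_higher_is_better(metric, greater_is_better=None):
    """Infer the 'higher is better' property for a metric.

    metric: a skforecast metric string, or a custom callable.
    greater_is_better: required hint when metric is a callable,
        since skforecast has no programmatic way to tell.
    """
    if isinstance(metric, str):
        if metric in _MINIMISED_METRICS:
            return False
        raise ValueError(f"unknown skforecast metric string: {metric!r}")
    if greater_is_better is None:
        raise ValueError(
            "custom metric callables must declare greater_is_better"
        )
    return bool(greater_is_better)
```

The string branch could then set the tag in `__init__`, and custom callables would fail fast unless the user declares the direction explicitly.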
Hi @JoaquinAmatRodrigo!! Thanks for the clarification.
Looks like a great solution. You might want to double-check the keywords; libraries usually use 'maximize' or 'minimize'. Not sure which one is used in this library.
Thanks @JoaquinAmatRodrigo!! I checked. @fkiraly and @SimonBlanke, any suggestions?
fkiraly
left a comment
Why are we making substantial changes to the CI and the depsets? Looks unnecessary. Did an AI suggest this?
Please revert the changes.
I think adding a catch-all all_integrations depset makes sense for testing, but not sure about the rest.
Hi @fkiraly!!!

What you say in the comment you mention is not consistent with the actual changes in the

If you are using AI, please watch what it is doing.
Hi @fkiraly, the rest is the same as what I wrote in the comments. As for the changes you mentioned, all tests are currently passing, btw.



Summary
This PR adds a full integration with skforecast, allowing Hyperactive to optimize hyperparameters of skforecast forecasting models using any of its optimization algorithms.
Implementation Details
SkforecastExperiment (`skforecast_forecasting.py`)

- Inherits from `BaseExperiment`.
- Runs `skforecast.model_selection.backtesting_forecaster` inside `_evaluate()` to perform time-series cross-validation for each parameter set.
- Applies each candidate's hyperparameters via `set_params()` before every evaluation.

SkforecastOptCV (`skforecast_opt_cv.py`)

- sklearn-style estimator (inherits from `BaseEstimator`).
- Works with `ForecasterRecursive` and other compatible skforecast forecasters.
- `fit()`: builds a `SkforecastExperiment` with user settings (`steps`, `initial_train_size`, `metric`, etc.) and runs the optimizer.
- `predict()`: delegates to `best_forecaster_.predict()` for easy forecasting after optimization.

Configuration

- skforecast added to `pyproject.toml` under the `integrations` extra.

Verification

- Example script `skforecast_example.py` showing a HillClimbing search with `ForecasterRecursive` + `RandomForestRegressor`.

Closes

Fixes #199
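The per-candidate evaluation flow described above (apply `set_params()`, then backtest) can be sketched with a toy stand-in for `backtesting_forecaster`. Everything here is illustrative, not the actual integration code: `toy_backtest`, `evaluate_candidates`, and the expanding-window logic are simplified stand-ins, and a plain sklearn `Ridge` replaces a skforecast forecaster:

```python
from sklearn.linear_model import Ridge


def toy_backtest(model, y, initial_train_size, steps):
    """Toy stand-in for skforecast's backtesting_forecaster:
    refit on an expanding window and return mean absolute error."""
    errors = []
    for split in range(initial_train_size, len(y) - steps + 1, steps):
        train, test = y[:split], y[split:split + steps]
        # use the time index as the single feature
        X_train = [[i] for i in range(len(train))]
        model.fit(X_train, train)
        X_test = [[len(train) + i] for i in range(len(test))]
        preds = model.predict(X_test)
        errors += [abs(p - t) for p, t in zip(preds, test)]
    return sum(errors) / len(errors)


def evaluate_candidates(model, y, param_grid, initial_train_size=8, steps=2):
    """Score each candidate: set_params(), then backtest (lower is better)."""
    scores = {}
    for params in param_grid:
        model.set_params(**params)  # apply the candidate before evaluating
        key = tuple(sorted(params.items()))
        scores[key] = toy_backtest(model, y, initial_train_size, steps)
    return scores


# a small linear series and two candidate configurations
y = [float(i) for i in range(16)]
scores = evaluate_candidates(Ridge(), y, [{"alpha": 0.1}, {"alpha": 10.0}])
```

An optimizer would then pick the candidate with the lowest score, matching the loss-style metrics discussed earlier in the thread.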