
Conversation

@Omswastik-11

@Omswastik-11 Omswastik-11 commented Nov 20, 2025

Summary

This PR adds a full integration with skforecast, allowing Hyperactive to optimize hyperparameters of skforecast forecasting models using any of its optimization algorithms.

Implementation Details

SkforecastExperiment (skforecast_forecasting.py)

  • Inherits from BaseExperiment.
  • Uses skforecast.model_selection.backtesting_forecaster inside _evaluate() to perform time-series cross-validation for each parameter set.
  • Clones the forecaster and applies the candidate parameters with set_params() before every evaluation.
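The clone-then-set_params step can be sketched as follows. This is an illustrative sketch only: it uses a plain sklearn Ridge in place of a skforecast forecaster, and the helper name evaluate_candidate is hypothetical; the real _evaluate() additionally runs skforecast's backtesting_forecaster to score the candidate, which is not shown here.

```python
# Illustrative sketch of the clone-and-set_params evaluation pattern.
# `evaluate_candidate` is a hypothetical name; the real _evaluate() also
# scores the clone with skforecast's backtesting_forecaster.
from sklearn.base import clone
from sklearn.linear_model import Ridge

def evaluate_candidate(estimator, params):
    """Clone the estimator and apply candidate params, leaving the
    original untouched."""
    candidate = clone(estimator)
    candidate.set_params(**params)
    return candidate

base = Ridge(alpha=1.0)
candidate = evaluate_candidate(base, {"alpha": 0.5})
print(base.alpha, candidate.alpha)  # the original keeps alpha=1.0
```

Cloning before set_params is what keeps repeated evaluations independent: each parameter set is applied to a fresh copy, never to the user's original forecaster.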

SkforecastOptCV (skforecast_opt_cv.py)

  • sklearn-style estimator (inherits from BaseEstimator).

  • Works with ForecasterRecursive and other compatible skforecast forecasters.

  • fit():

    • Builds a SkforecastExperiment with user settings (steps, initial_train_size, metric, etc.).
    • Runs Hyperactive’s optimizer to search for the best hyperparameters.
    • Refits the best forecaster on all available data.
  • predict():

    • Delegates to best_forecaster_.predict() for easy forecasting after optimization.
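The fit/predict control flow described above can be sketched with the search and scoring stubbed out. Everything named here (OptCVSketch, DummyForecaster) is hypothetical; the real SkforecastOptCV delegates the search to Hyperactive and the evaluation to skforecast.

```python
# Hypothetical sketch of the SkforecastOptCV fit/predict flow.
# The search step is stubbed (it just picks the first candidate) to
# show only the control flow: search -> refit winner -> delegate predict.
from sklearn.base import BaseEstimator, clone

class DummyForecaster(BaseEstimator):
    """Stand-in for a skforecast forecaster."""
    def __init__(self, alpha=1.0):
        self.alpha = alpha
    def fit(self, y, exog=None):
        self.last_ = y[-1]
        return self
    def predict(self, steps):
        return [self.last_] * steps

class OptCVSketch(BaseEstimator):
    def __init__(self, forecaster, param_grid):
        self.forecaster = forecaster
        self.param_grid = param_grid

    def fit(self, y, exog=None):
        # 1. a real implementation would build the experiment and run
        #    the optimizer; here we stub it by taking the first candidate
        self.best_params_ = {k: v[0] for k, v in self.param_grid.items()}
        # 2. refit the best forecaster on all available data
        self.best_forecaster_ = clone(self.forecaster)
        self.best_forecaster_.set_params(**self.best_params_)
        self.best_forecaster_.fit(y, exog)
        return self

    def predict(self, steps):
        # delegate to the refit forecaster
        return self.best_forecaster_.predict(steps)

opt = OptCVSketch(DummyForecaster(), {"alpha": [0.5, 1.0]}).fit([1, 2, 3])
print(opt.predict(2))  # forecasts from the refit winner
```

The point of the skeleton is the contract: after fit(), best_params_ and best_forecaster_ are populated, and predict() never touches the search machinery again.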

Configuration

  • Added skforecast as an optional dependency in pyproject.toml under the integrations extra.
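For illustration, the extra might look like the fragment below; the exact contents of the integrations extra in this PR are assumed, not quoted.

```toml
# illustrative pyproject.toml fragment; the real extra may list more packages
[project.optional-dependencies]
integrations = [
    "skforecast",
]
```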

Verification

  • Added skforecast_example.py, showing a HillClimbing search with ForecasterRecursive + RandomForestRegressor.
  • Added unit tests verifying parameter handling, experiment execution, and the integration flow.

Closes

Fixes #199

@Omswastik-11 Omswastik-11 changed the title ENH] Add skforecast integration for time series hyperparameter tuning [ENH] Add skforecast integration for time series hyperparameter tuning Nov 20, 2025
@fkiraly fkiraly added enhancement New feature or request module:integrations Integrations for applying optimization to other libraries labels Nov 22, 2025
Copy link
Collaborator

@fkiraly fkiraly left a comment

Thanks, great!

  • please improve docstrings, see above, and also include defaults
  • please add get_test_params with sensible settings to test the experiment and the forecaster
  • please fix code quality issues, use pre-commit

@fkiraly fkiraly changed the title [ENH] Add skforecast integration for time series hyperparameter tuning [ENH] skforecast integration for time series hyperparameter tuning Nov 22, 2025
@Omswastik-11 Omswastik-11 requested a review from fkiraly November 22, 2025 17:07
@Omswastik-11
Author

Omswastik-11 commented Nov 22, 2025

Hi @fkiraly!
Modified the docstrings as you suggested.

Added get_test_params() and verified that all the tests pass.

Ran pre-commit on the changed files.
[screenshot: pre-commit results]

Kindly verify this.

Collaborator

@fkiraly fkiraly left a comment

For the tests to run, you need to add skforecast to the python environment - I would add it to sktime-integration in pyproject.toml, that might be easiest.

@Omswastik-11 Omswastik-11 requested a review from fkiraly November 23, 2025 08:51
@Omswastik-11
Author

Omswastik-11 commented Nov 23, 2025

Hi @fkiraly! Committed the changes as you suggested:

  1. Added skforecast to sktime-integration.
  2. https://github.com/SimonBlanke/Hyperactive/actions/runs/19598614085/job/56149302140?pr=208
    Modified the CI to avoid a disk-storage issue in the CI build.
  3. Checked SkforecastExperiment against all tests.
[screenshot: test results]

@JoaquinAmatRodrigo

Hi,
I’ve taken a look at the code related to skforecast, and it looks good. Thanks for the work, @Omswastik-11!

@Omswastik-11
Author

Omswastik-11 commented Nov 26, 2025

Hi @JoaquinAmatRodrigo @fkiraly !!!
Thanks 👍.
Could you trigger the workflow to check that everything passes?

@JoaquinAmatRodrigo

It needs to be triggered by one of the repository's maintainers.

@Omswastik-11
Author

Hi @JoaquinAmatRodrigo, @SimonBlanke, and @fkiraly — I’d appreciate your suggestions on how to proceed here.

The issue is that skforecast currently does not support Python 3.14, which affects the new integration.
I am a bit unsure about two things:

  1. Where exactly should skforecast be declared?
    – Only in pyproject.toml under optional extras?
    – Or in the _tags?
  2. When should test:vm be enabled or disabled?

What I have done so far

1. Fixes for the test_examples workflow

Example test run:
https://github.com/SimonBlanke/Hyperactive/actions/runs/19608701548/job/56518055022?pr=208

I checked the examples/ directory and noticed that many dependencies (e.g., torch, tensorflow) were not actually needed.

Changes:

pyproject.toml

  • Added a new test_examples optional dependency group including:

    • pytest
    • integrations used in examples (sklearn, sktime, skforecast)
    • optuna

Makefile

  • Added a new target for running example tests.

test.yml

  • Updated the test-examples job to install only the required libraries.

2. Handling the Python 3.14 compatibility issue

skforecast fails to install on Python 3.14, so I added conditional version constraints:

skforecast; python_version < "3.14"

This has been added to all optional dependency groups that require skforecast.

Related CI runs:
https://github.com/SimonBlanke/Hyperactive/actions/runs/19608701548/job/56518055064?pr=208
https://github.com/SimonBlanke/Hyperactive/actions/runs/19608701548/job/56518055077?pr=208


Local testing:

[screenshot: local test results]

Thanks!

Collaborator

@fkiraly fkiraly left a comment

The integration works, nice!

The remaining issues relate to dependency isolation, you have to do it in two places:

  • the get_test_params function, see above
  • in the tags (I think you also need to add a few others)

Besides this, the "higher is better" property needs to be inferred from the metric and set as a tag in __init__.

@JoaquinAmatRodrigo, is there a programmatic way to do this?

@Omswastik-11 Omswastik-11 requested a review from fkiraly November 27, 2025 12:01
@JoaquinAmatRodrigo

> @JoaquinAmatRodrigo, is there a programmatic way to do this?

We do not have a programmatic strategy for this. So far, all the regression metrics that Skforecast allows to be passed as a string are intended to be minimised.

{
    "mean_squared_error": mean_squared_error,
    "mean_absolute_error": mean_absolute_error,
    "mean_absolute_percentage_error": mean_absolute_percentage_error,
    "mean_squared_log_error": mean_squared_log_error,
    "mean_absolute_scaled_error": mean_absolute_scaled_error,
    "root_mean_squared_scaled_error": root_mean_squared_scaled_error,
    "median_absolute_error": median_absolute_error,
    "symmetric_mean_absolute_percentage_error": symmetric_mean_absolute_percentage_error,
}

If the user passes a custom function as a metric, they need to indicate whether it is a maximisation or minimisation.
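Given that every built-in string metric is minimised, one way to derive the "higher is better" property is a simple lookup with a mandatory override for custom callables. This is a hypothetical helper sketch (the name infer_higher_is_better is not from the PR), not Hyperactive's or skforecast's actual API.

```python
# Hypothetical helper: infer the optimisation direction from the metric.
# All of skforecast's built-in string metrics are error measures, so the
# direction is "minimise" unless the user explicitly says otherwise.
_MINIMISED_METRICS = {
    "mean_squared_error",
    "mean_absolute_error",
    "mean_absolute_percentage_error",
    "mean_squared_log_error",
    "mean_absolute_scaled_error",
    "root_mean_squared_scaled_error",
    "median_absolute_error",
    "symmetric_mean_absolute_percentage_error",
}

def infer_higher_is_better(metric, higher_is_better=None):
    """Return True if larger metric values are better."""
    if higher_is_better is not None:  # explicit user override wins
        return higher_is_better
    if isinstance(metric, str):
        if metric in _MINIMISED_METRICS:
            return False
        raise ValueError(f"unknown metric string: {metric!r}")
    # custom callable with no override: force the user to be explicit
    raise ValueError("pass higher_is_better for custom metric callables")

print(infer_higher_is_better("mean_absolute_error"))  # False
```

This mirrors the rule stated above: string metrics default to minimisation, and a custom function without an explicit direction is rejected rather than guessed.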

@Omswastik-11
Author

Omswastik-11 commented Nov 27, 2025

Hi @JoaquinAmatRodrigo! Thanks for the clarification.
Does that mean it would be better to pass the metric in the constructor, e.g. metric='mse' with scoring='lower' as the default? Then the user could pass their own metric function as metric and set scoring to 'lower' or 'higher' depending on whether the metric should be minimized or maximized.

@JoaquinAmatRodrigo

JoaquinAmatRodrigo commented Nov 27, 2025

Looks like a great solution. You might want to double-check the keywords; libraries usually use 'maximize' or 'minimize'. I am not sure which ones this library uses.

@Omswastik-11
Author

Thanks @JoaquinAmatRodrigo!
In my recent changes I added a boolean parameter named higher_is_better.
ca013a4

I checked sktime_forecasting.py; it uses a single scoring parameter covering both the metric and the higher-or-lower direction, but since we have a separate parameter for the metric here, I just added a boolean parameter.
https://github.com/SimonBlanke/Hyperactive/blob/main/src/hyperactive/experiment/integrations/sktime_forecasting.py#L171

@fkiraly and @SimonBlanke any suggestions ?
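The boolean-parameter approach can be wired into a tag at construction time roughly as follows. This is a generic sketch: the class name, the tag key, and the plain-dict tag mechanism are all stand-ins, not Hyperactive's actual base-class API.

```python
# Generic sketch: derive an "optimisation direction" tag in __init__
# from a higher_is_better boolean. The plain _tags dict stands in for
# whatever tag mechanism the real base class provides.
class ExperimentSketch:
    _tags = {"property:higher_or_lower_is_better": "lower"}

    def __init__(self, metric="mean_absolute_error", higher_is_better=False):
        self.metric = metric
        self.higher_is_better = higher_is_better
        # per-instance copy so the class-level default is not mutated
        self._tags = dict(type(self)._tags)
        self._tags["property:higher_or_lower_is_better"] = (
            "higher" if higher_is_better else "lower"
        )

exp = ExperimentSketch(metric="my_score", higher_is_better=True)
print(exp._tags["property:higher_or_lower_is_better"])  # higher
```

Copying the class-level dict before mutating it is the important detail: otherwise one instance's direction would leak into every other instance of the class.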

Collaborator

@fkiraly fkiraly left a comment

Why are we making substantial changes to the CI and the depsets? Looks unnecessary. Did an AI suggest this?

Please revert the changes.

I think adding a catch-all all_integrations depset makes sense for testing, but not sure about the rest.

@Omswastik-11
Author

Hi @fkiraly!
Have a look at this; I explained why I made the changes:
#208 (comment)

@fkiraly
Collaborator

fkiraly commented Nov 28, 2025

Have a look at this; I explained why I made the changes

What you say in the comment you mention is not consistent with the actual changes in the pyproject.

If you are using AI, please watch what it is doing.

@Omswastik-11
Author

Omswastik-11 commented Nov 28, 2025

Hi @fkiraly
Sorry for the issue; it seems I forgot to remove the initial changes, namely the following:

  - name: Free Disk Space (Ubuntu)
    if: runner.os == 'Linux'
    run: |
      sudo rm -rf /usr/share/dotnet
      sudo rm -rf /usr/local/lib/android
      sudo rm -rf /opt/ghc
      sudo rm -rf /opt/hostedtoolcache/CodeQL
      sudo docker image prune --all --force

The rest is the same as I wrote in the comments.

As for the changes you mentioned in pyproject.toml, this is the commit where I made them:
3d92878
Can you pin down where I made the mistakes?

All tests are currently passing, by the way:
https://github.com/SimonBlanke/Hyperactive/pull/208/checks?sha=ca013a4c83ce583a8fa0766dd6416cb7e2953e5c

@Omswastik-11 Omswastik-11 requested a review from fkiraly November 28, 2025 07:05