- Uses task-specific VAEs to replay prior knowledge during continual learning on Split CIFAR-100.
- Covers both incremental training and post-hoc unlearning of selected tasks.
- Historic experiment summaries are stored in `results/`.
- `src/generative_model_unlearning/` – package with data preparation, generative models, and runners.
- `results/` – accuracy logs from earlier experiments.
- `requirements.txt` – dependencies shared by training scripts.
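As a rough illustration of the replay idea above, here is a minimal sketch of one training step with generative replay. All names here (`TinyVAEDecoder`, `train_step_with_replay`) are illustrative stand-ins, not the repository's actual API:

```python
# Illustrative sketch of one generative-replay training step; names are
# hypothetical, not the repository's actual API.
import torch
import torch.nn as nn

class TinyVAEDecoder(nn.Module):
    """Stand-in for a per-task VAE: only the decoder side matters for replay."""
    def __init__(self, latent_dim=8, out_dim=3 * 32 * 32):
        super().__init__()
        self.latent_dim = latent_dim
        self.net = nn.Linear(latent_dim, out_dim)

    def decode(self, z):
        return torch.sigmoid(self.net(z))

def train_step_with_replay(solver, old_solver, vaes, real_x, real_y,
                           optimizer, replay_per_task=16):
    """One step: loss on current-task data plus loss on pseudo-data sampled
    from each earlier task's frozen VAE, labelled by the previous solver,
    so old tasks are rehearsed without storing their data."""
    ce = nn.CrossEntropyLoss()
    loss = ce(solver(real_x), real_y)
    for vae in vaes:  # one frozen VAE per earlier task
        with torch.no_grad():
            z = torch.randn(replay_per_task, vae.latent_dim)
            fake_x = vae.decode(z)
            fake_y = old_solver(fake_x).argmax(dim=1)  # pseudo-labels
        loss = loss + ce(solver(fake_x), fake_y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```
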
```bash
cd repositories/Generative-Continual-Learning-PyTorch
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt
export PYTHONPATH=src
```

- Train with generative replay:

```bash
python -m generative_model_unlearning.run_project_c
```

- Run the unlearning experiments:

```bash
export PYTHONPATH=src
python -m generative_model_unlearning.generative_unlearning
```
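For intuition about what the unlearning runner does, here is one plausible scheme, shown purely as an illustration (the actual method in `generative_unlearning.py` may differ): drop the forgotten task's VAE from the replay set and fine-tune the solver on data replayed from the remaining tasks, so the forgotten task stops being rehearsed.

```python
# Hypothetical sketch of post-hoc unlearning via replay; not the
# repository's actual algorithm.
import copy
import torch
import torch.nn as nn

class StubDecoder(nn.Module):
    """Stand-in for a trained per-task VAE (decoder side only)."""
    def __init__(self, latent_dim=8, out_dim=3 * 32 * 32):
        super().__init__()
        self.latent_dim = latent_dim
        self.net = nn.Linear(latent_dim, out_dim)

    def decode(self, z):
        return torch.sigmoid(self.net(z))

def unlearn_task(solver, vaes_by_task, forget_task, optimizer,
                 steps=3, batch=16):
    """Fine-tune the solver on replay from every task except forget_task."""
    kept = [v for t, v in vaes_by_task.items() if t != forget_task]
    teacher = copy.deepcopy(solver).eval()  # frozen pseudo-labeller
    ce = nn.CrossEntropyLoss()
    for _ in range(steps):
        loss = torch.zeros(())
        for vae in kept:
            with torch.no_grad():
                z = torch.randn(batch, vae.latent_dim)
                x = vae.decode(z)
                y = teacher(x).argmax(dim=1)  # labels from the old solver
            loss = loss + ce(solver(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return solver
```
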
One-line: research codebase for continual-learning experiments using task-specific generative replay (VAEs) on Split CIFAR-100, including post-hoc unlearning experiments.

This repository contains the code used to run and reproduce experiments on generative-model-aided continual learning and targeted unlearning of previously learned tasks.
- Code: `src/generative_model_unlearning/`
- Data: `data/` (CIFAR-100 stored under `data/cifar-100-python/`)
- Results and logs: `results/`
- Python deps: `requirements.txt`
- Python 3.10+ recommended
- PyTorch 2.0+ and torchvision
Install dependencies (recommended inside a virtualenv):
```bash
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```

Set `PYTHONPATH` so tests and runners can import the package in `src/`:

```bash
export PYTHONPATH=$(pwd)/src
```

- `src/generative_model_unlearning/` — core package
  - `data_setup.py` — CIFAR-100 loading and dataset helpers
  - `generative_models.py` — VAE and related generator code
  - `generative_unlearning.py` — scripts to run selective unlearning experiments
  - `run_project_c.py` — example training/run entrypoint
- `data/` — external dataset files (not tracked in git)
- `results/` — experiment outputs (not tracked in git)
- `requirements.txt` — pinned runtime dependencies
If you add new data or large model checkpoints, place them under the data/ or results/ directories; these are ignored by git by default.
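For orientation, the per-task generators in `generative_models.py` are VAEs; a minimal VAE of the same flavour looks roughly like the sketch below. The architecture, dimensions, and names here are illustrative, not the module's actual code:

```python
# Minimal VAE sketch (illustrative; not the code in generative_models.py).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MiniVAE(nn.Module):
    def __init__(self, in_dim=3 * 32 * 32, hidden=256, latent_dim=32):
        super().__init__()
        self.latent_dim = latent_dim
        self.enc = nn.Linear(in_dim, hidden)
        self.mu = nn.Linear(hidden, latent_dim)
        self.logvar = nn.Linear(hidden, latent_dim)
        self.dec = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.ReLU(), nn.Linear(hidden, in_dim)
        )

    def encode(self, x):
        h = F.relu(self.enc(x))
        return self.mu(h), self.logvar(h)

    def reparameterize(self, mu, logvar):
        # Sample z ~ N(mu, sigma^2) with gradients flowing through mu/logvar.
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def decode(self, z):
        return torch.sigmoid(self.dec(z))

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = self.reparameterize(mu, logvar)
        return self.decode(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # Reconstruction term plus KL divergence to the unit Gaussian prior.
    bce = F.binary_cross_entropy(recon, x, reduction="sum")
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return bce + kld
```
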
Train a model (example):

```bash
# from repository root
export PYTHONPATH=$(pwd)/src
python -m generative_model_unlearning.run_project_c
```

Run the unlearning demo:

```bash
export PYTHONPATH=$(pwd)/src
python -m generative_model_unlearning.generative_unlearning
```

Add or change command-line args in the `if __name__ == '__main__'` blocks of the modules above to customize dataset roots, epochs, or save locations.
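If you would rather drive these settings from the command line than edit the `__main__` blocks, a standard `argparse` pattern could be dropped in. The flag names below are suggestions, not existing options:

```python
# Suggested argparse pattern for the entrypoints; flags are hypothetical.
import argparse

def build_parser():
    parser = argparse.ArgumentParser(description="Generative-replay training")
    parser.add_argument("--data-root", default="data",
                        help="directory containing cifar-100-python/")
    parser.add_argument("--epochs", type=int, default=10,
                        help="training epochs per task")
    parser.add_argument("--save-dir", default="results",
                        help="where to write accuracy logs and checkpoints")
    return parser

if __name__ == "__main__":
    args = build_parser().parse_args()
    print(f"data={args.data_root} epochs={args.epochs} save={args.save_dir}")
```
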
- Follow the Python packaging layout: keep code under `src/` to simplify imports.
- Create a virtual environment (see above).
- Run unit or smoke tests (not currently provided) by creating a `tests/` directory and using `pytest`.
Suggested small improvements you can add:

- Add a `setup.cfg` and `pyproject.toml` for tooling (ruff, black, pytest).
- Add a lightweight test that loads the dataset and runs a single training step to catch API regressions.
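The suggested smoke test could start as simply as this. It uses random CIFAR-100-shaped tensors and a stand-in model so it runs without the dataset; wire in the real loader and model from the package once a `tests/` directory exists:

```python
# tests/test_smoke.py -- minimal regression guard: one forward/backward pass.
import torch
import torch.nn as nn

def test_single_training_step():
    # Stand-in classifier; replace with the package's actual model.
    model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 100))
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    x = torch.rand(8, 3, 32, 32)       # CIFAR-100-shaped batch
    y = torch.randint(0, 100, (8,))    # labels for 100 classes
    loss = nn.CrossEntropyLoss()(model(x), y)
    loss.backward()
    optimizer.step()
    assert torch.isfinite(loss)
```
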