General features #1 (Open)
michaelflppv wants to merge 4 commits into main from general-features
Changes from all 4 commits:

  0956fbc  Update CMakeLists.txt and install.py for improved build configuration…  (michaelflppv)
  510e6e4  Add Makefile for build automation and update README.md for usage inst…  (michaelflppv)
  20d9481  Add Makefile for build automation and update README.md for usage inst…  (michaelflppv)
  d702e4c  Update README.md to add instructions for downloading TU datasets and …  (michaelflppv)
Makefile (new file, +100 lines):

```makefile
SHELL := /bin/bash
.SHELLFLAGS := -eu -o pipefail -c

POETRY ?= poetry
PYTHON ?= python
POETRY_RUN ?= $(POETRY) run
RUN_PY ?= $(POETRY_RUN) $(PYTHON)
CMAKE ?= cmake
GEDLIB_BUILD_DIR ?= gedlib/build
GEDLIB_LIB ?= gxl
GXL_DATASET ?= aids
GXL_OPTIONS := aids imdb proteins
DATA_ROOT ?= data
TUD_ROOT ?= $(DATA_ROOT)/_tud
DATASETS ?= AIDS IMDB-BINARY PROTEINS

.PHONY: help gedlib-install gedlib-configure gedlib-build gedlib-all gedlib-test \
	convert-gxl convert-json convert-txt lower-bound lower-bound-validate \
	exact-ged gedlib-run gedlib-edit-path simgnn-edit-path apply-edit-path \
	gedlib-validate-path simgnn-validate-path generate-gxl-collection \
	generate-json-pairs simgnn-train simgnn-eval test-python install-datasets

help: ## Show available make targets
	@printf "Available targets:\n"
	@grep -E '^[a-zA-Z0-9_.-]+:.*?##' $(MAKEFILE_LIST) | sort | awk 'BEGIN {FS = ":.*?##"} {printf "  %-24s %s\n", $$1, $$2}'

gedlib-install: ## Run GEDLIB install script with cleanup and docs (set GEDLIB_LIB=gxl|... as needed)
	cd gedlib && $(PYTHON) install.py --clean --doc --lib $(GEDLIB_LIB)

gedlib-configure: ## Configure GEDLIB CMake build directory
	$(CMAKE) -S gedlib -B $(GEDLIB_BUILD_DIR)

gedlib-build: gedlib-configure ## Build GEDLIB via CMake
	$(CMAKE) --build $(GEDLIB_BUILD_DIR)

gedlib-all: ## Install and build GEDLIB
	$(MAKE) gedlib-install
	$(MAKE) gedlib-build

gedlib-test: ## Run GEDLIB ctest suite after building
	cd $(GEDLIB_BUILD_DIR) && ctest

convert-gxl: ## Convert a dataset to GXL/XML for GEDLIB (set GXL_DATASET=$(GXL_OPTIONS))
	cd src/converters/gxl_xml && \
	if [ ! -f preprocess_$(GXL_DATASET).py ]; then \
		echo "Unknown dataset '$(GXL_DATASET)'. Choose from: $(GXL_OPTIONS)"; \
		exit 1; \
	fi; \
	$(RUN_PY) preprocess_$(GXL_DATASET).py

convert-json: ## Convert datasets to JSON pairs for SimGNN
	cd src/converters/json && $(RUN_PY) preprocess_all.py

convert-txt: ## Convert datasets to TXT graph pairs
	cd src/converters/txt && $(RUN_PY) preprocess_all.py

lower-bound: ## Estimate lower bounds for graph pairs
	cd heuristics && $(RUN_PY) estimate_lower_bound.py

lower-bound-validate: ## Validate lower bound estimations
	cd heuristics && $(RUN_PY) validate_lower_bounds.py

exact-ged: ## Compute exact GED using AStar-BMao parser
	cd src/c++_parsers && $(RUN_PY) astar_exact_ged.py

gedlib-run: ## Run GEDLIB parser for approximate GED
	cd src/c++_parsers && $(RUN_PY) gedlib_parser.py

gedlib-edit-path: ## Extract GEDLIB edit paths
	cd src/c++_parsers && $(RUN_PY) gedlib_edit_path.py

simgnn-edit-path: ## Extract SimGNN edit paths
	cd SimGNN/src && $(RUN_PY) simgnn_extract_edit_path.py

apply-edit-path: ## Apply edit paths to simulate edits
	cd src/edit_path_test && $(RUN_PY) apply_edit_path.py

gedlib-validate-path: ## Validate GEDLIB edit paths
	cd src/edit_path_test/test && $(RUN_PY) gedlib_validate_edit_path.py

simgnn-validate-path: ## Validate SimGNN edit paths
	cd SimGNN/src && $(RUN_PY) simgnn_validate_edit_path.py

generate-gxl-collection: ## Generate synthetic GXL collection for edit-path testing
	cd src/edit_path_test/generate_synthetic_graphs && $(RUN_PY) generate_gxl_collection.py

generate-json-pairs: ## Generate synthetic JSON pairs for edit-path testing
	cd src/edit_path_test/generate_synthetic_graphs && $(RUN_PY) generate_json_pairs.py

simgnn-train: ## Train SimGNN model
	cd SimGNN/src && $(RUN_PY) main.py

simgnn-eval: ## Evaluate SimGNN model
	cd SimGNN/src && $(RUN_PY) simgnn_evaluate.py

test-python: ## Run pytest suite
	$(RUN_PY) -m pytest

install-datasets: ## Download TU datasets (any torch_geometric TUDataset name) into data/<dataset>
	$(RUN_PY) scripts/install_datasets.py --datasets $(DATASETS) --download-root $(TUD_ROOT) --target-root $(DATA_ROOT)
```
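A side note on the self-documenting help target: it lists targets by grepping for `##` comments and splitting them with awk. A standalone sketch of the same pipeline, run against a throwaway sample makefile. It uses a greedy `.*##` separator rather than the Makefile's non-greedy `.*?##` (which not every awk implementation accepts; with one `##` per line the result is the same), and `$1`/`$2` appear unescaped here because this is plain shell, not a make recipe (where they would be `$$1`/`$$2`).

```shell
# Sample makefile standing in for $(MAKEFILE_LIST).
cat > /tmp/help_demo.mk <<'EOF'
build: ## Compile the project
	@echo building
test: ## Run the test suite
	@echo testing
EOF

# Extract "target  description" pairs from the ## comments.
grep -E '^[a-zA-Z0-9_.-]+:.*##' /tmp/help_demo.mk | sort \
  | awk 'BEGIN {FS = ":.*##"} {printf "  %-24s %s\n", $1, $2}'
```

Recipe lines never match because they start with a tab, so only the `target: ## description` headers survive the grep.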
Review comment:

The new install-datasets target calls $(RUN_PY) scripts/install_datasets.py, but there is no scripts/install_datasets.py anywhere in the repository (checked with find . -name install_datasets.py). Running make install-datasets as documented will therefore fail immediately with python: can't open file 'scripts/install_datasets.py', leaving the advertised dataset download workflow unusable.
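For illustration only, here is a minimal sketch of what such a scripts/install_datasets.py could look like, given the flags the Makefile passes. Only torch_geometric.datasets.TUDataset is a real API; the helper names, directory layout, and overall structure are assumptions, not the repository's actual (missing) code.

```python
# Hypothetical sketch of scripts/install_datasets.py (an assumption, not
# the repository's code). Downloads TU datasets with torch_geometric and
# copies them into <target-root>/<dataset> as the Makefile advertises.
import argparse
import shutil
from pathlib import Path


def target_dir(target_root: str, dataset: str) -> Path:
    """Map a TUDataset name like IMDB-BINARY to data/imdb-binary."""
    return Path(target_root) / dataset.lower()


def install(datasets, download_root, target_root):
    # Imported lazily so argument parsing works even without
    # torch_geometric installed.
    from torch_geometric.datasets import TUDataset

    for name in datasets:
        # Instantiating TUDataset downloads and processes the raw files.
        TUDataset(root=download_root, name=name)
        src = Path(download_root) / name
        shutil.copytree(src, target_dir(target_root, name), dirs_exist_ok=True)


def main(argv=None):
    parser = argparse.ArgumentParser(description="Download TU datasets")
    parser.add_argument("--datasets", nargs="+", required=True)
    parser.add_argument("--download-root", default="data/_tud")
    parser.add_argument("--target-root", default="data")
    args = parser.parse_args(argv)
    install(args.datasets, args.download_root, args.target_root)

# The Makefile would invoke this as:
#   python scripts/install_datasets.py --datasets AIDS IMDB-BINARY PROTEINS \
#       --download-root data/_tud --target-root data
```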