Commit 5b13326

Update gradient urls to SDK 3.1 (need cherry-pick) (#418)

payoto authored and Ian Hales committed
1 parent 97e22b1 · commit 5b13326

File tree

4 files changed: +22 −22 lines changed

gnn/cluster_gcn/tensorflow2/README.md

Lines changed: 11 additions & 11 deletions

@@ -3,7 +3,7 @@ Cluster graph convolutional networks for node classification, using cluster samp
 Run our Cluster GCN training on arXiv dataset on Paperspace.
 <br>
-[![Gradient](https://assets.paperspace.io/img/gradient-badge.svg)](https://ipu.dev/3UYkV6d)
+[![Gradient](https://assets.paperspace.io/img/gradient-badge.svg)](https://ipu.dev/3CHtqfy)

 | Framework | domain | Model | Datasets | Tasks| Training| Inference | Reference |
 |-------------|-|------|-------|-------|-------|---|---|
@@ -29,7 +29,7 @@ If no path is provided, then follow these steps:
 1. Navigate to your Poplar SDK root directory

 2. Enable the Poplar SDK with:
-```bash
+```bash
 cd poplar-<OS version>-<SDK version>-<hash>
 . enable.sh
 ```
@@ -83,10 +83,10 @@ these datasets can be selected in the config by setting `dataset_type`.

 ### PPI (Protein-protein interactions) dataset <a name='ppi' ></a>

-The [PPI dataset](https://paperswithcode.com/dataset/ppi) depicts protein roles in various protein-protein
+The [PPI dataset](https://paperswithcode.com/dataset/ppi) depicts protein roles in various protein-protein
 interaction (PPI) graphs. Each graph in the datasets corresponds to a different human tissue. Positional gene sets are
-used, motif gene sets and immunological signatures as features and gene ontology sets as multi-class binary labels
-(121 in total). The dataset contains in total 56944 nodes, 818716 edges and node feature size 50. The preprocessed PPI
+used, motif gene sets and immunological signatures as features and gene ontology sets as multi-class binary labels
+(121 in total). The dataset contains in total 56944 nodes, 818716 edges and node feature size 50. The preprocessed PPI
 datasets can be downloaded from [Stanford GraphSAGE](https://snap.stanford.edu/graphsage).

 ### Reddit dataset <a name='reddit' ></a>
@@ -121,9 +121,9 @@ for test. To use this dataset, simply use the train_products.json config, the da
 The [ogbn-mag dataset](https://ogb.stanford.edu/docs/nodeprop/#ogbn-mag) is a directed heterogeneous graph that is
 a subset of Microsoft Academic Graph (MAG). The node types are papers, authors, institutions and fields of study.
 These are connected by four edge types, author affiliated with institution, author writes a paper, paper cites a paper,
-and paper has a topic of a field of study. Each paper has a 128-dimensional node feature vector, that
+and paper has a topic of a field of study. Each paper has a 128-dimensional node feature vector, that
 encodes the title and abstract, similar to ogbn-arxiv. The task is to predict the venue of each paper. The train
-portion of the dataset is all papers published until 2017, the papers published in 2018 are the validation
+portion of the dataset is all papers published until 2017, the papers published in 2018 are the validation
 set, and papers published in 2019 are the test set. To use this dataset, simply use the train_mag.json config,
 the dataset will be downloaded automatically.
@@ -144,12 +144,12 @@ which uses PCA to reduce the feature size. There a script is provided to pre-pro
 download the pre-processed data directly, which you can download from
 [DeepMind’s cloud storage](https://storage.googleapis.com/deepmind-ogb-lsc/mag/data/preprocessed/merged_feat_from_paper_feat_pca_129.npy).
 Note that the dataset is licensed under ODC-BY.
-The path for the downloaded `.npy` file can be given in the `train_mag240.json` config under the `pca_features_path`
-parameter, which is expected relative to the data path.
+The path for the downloaded `.npy` file can be given in the `train_mag240.json` config under the `pca_features_path`
+parameter, which is expected relative to the data path.
 The other parts of the dataset, for example the edges, nodes and labels, will be downloaded automatically when running
 the application. The dataset is around 200Gb so can take some time to download (a few hours to a day). The path of this
-can be given in the `train_mag240.json` config under `data_path`, or with the `--data-path` argument in the command
-line.
+can be given in the `train_mag240.json` config under `data_path`, or with the `--data-path` argument in the command
+line.
 For example, the following configuration will load the data from or download to directory
 `/graph-datasets/ogb-lsc-mag240`, and will attempt to load the PCA features from file
 `/graph-datasets/ogb-lsc-mag240/mag240m_kddcup2021/merged_feat_from_paper_feat_pca_129.npy`:
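The configuration described in that hunk could be sketched as the following fragment (a hypothetical excerpt of `train_mag240.json`; only the two keys named in the text are shown, and `pca_features_path` is given relative to `data_path` as the README requires):

```json
{
  "data_path": "/graph-datasets/ogb-lsc-mag240",
  "pca_features_path": "mag240m_kddcup2021/merged_feat_from_paper_feat_pca_129.npy"
}
```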

gnn/tgn/pytorch/README.md

Lines changed: 3 additions & 3 deletions

@@ -4,7 +4,7 @@ Temporal graph networks for link prediction in dynamic graphs, based on [`exampl
 Run our TGN on paperspace.
 <br>
-[![Gradient](https://assets.paperspace.io/img/gradient-badge.svg)](https://ipu.dev/3uUI2nt)
+[![Gradient](https://assets.paperspace.io/img/gradient-badge.svg)](https://ipu.dev/3CG1WqL)

 | Framework | domain | Model | Datasets | Tasks| Training| Inference | Reference |
 |-------------|-|------|-------|-------|-------|---|---|
@@ -28,13 +28,13 @@ If no path is provided, then follow these steps:
 1. Navigate to your Poplar SDK root directory

 2. Enable the Poplar SDK with:
-```bash
+```bash
 cd poplar-<OS version>-<SDK version>-<hash>
 . enable.sh
 ```

 3. Additionally, enable PopArt with:
-```bash
+```bash
 cd popart-<OS version>-<SDK version>-<hash>
 . enable.sh
 ```
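Steps 2 and 3 in this hunk both amount to sourcing an `enable.sh` from a component directory of the unpacked SDK. A minimal sketch that automates both at once (the `SDK_ROOT` default and the glob patterns are assumptions for illustration, not paths shipped by the SDK; the loop simply does nothing if no matching directory exists):

```shell
#!/bin/sh
# Source every enable.sh found under the SDK root (poplar first, then popart).
# SDK_ROOT is a placeholder: point it at your unpacked Poplar SDK directory.
SDK_ROOT="${SDK_ROOT:-$HOME/sdks/poplar_sdk}"

enabled=0
for d in "$SDK_ROOT"/poplar-* "$SDK_ROOT"/popart-*; do
  if [ -f "$d/enable.sh" ]; then
    # Sourcing (not executing) is required so the script can modify this
    # shell's environment, as in the README's ". enable.sh" steps.
    . "$d/enable.sh"
    enabled=$((enabled + 1))
  fi
done
echo "sourced $enabled enable script(s)"
```

Because the scripts must run in the current shell, this sketch would itself be sourced rather than executed in a subshell.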

nlp/bert/pytorch/README.md

Lines changed: 4 additions & 4 deletions

@@ -3,11 +3,11 @@ Bidirectional Encoder Representations from Transformers for NLP pre-training and
 Run our BERT-L Fine-tuning on SQuAD dataset on Paperspace.
 <br>
-[![Gradient](https://assets.paperspace.io/img/gradient-badge.svg)](https://ipu.dev/3WiyZIC)
+[![Gradient](https://assets.paperspace.io/img/gradient-badge.svg)](https://ipu.dev/3GTWwK7)

 | Framework | domain | Model | Datasets | Tasks| Training| Inference | Reference |
 |-------------|-|------|-------|-------|-------|---|---|
-| Pytorch | NLP | BERT | WIKI-103 | Next sentence prediction, Masked language modelling, Question/Answering ||| [BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding](https://arxiv.org/abs/1810.04805v2) |
+| Pytorch | NLP | BERT | WIKI-103 | Next sentence prediction, Masked language modelling, Question/Answering ||| [BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding](https://arxiv.org/abs/1810.04805v2) |


 ## Instructions summary
@@ -29,13 +29,13 @@ If no path is provided, then follow these steps:
 1. Navigate to your Poplar SDK root directory

 2. Enable the Poplar SDK with:
-```bash
+```bash
 cd poplar-<OS version>-<SDK version>-<hash>
 . enable.sh
 ```

 3. Additionally, enable PopArt with:
-```bash
+```bash
 cd popart-<OS version>-<SDK version>-<hash>
 . enable.sh
 ```

vision/vit/pytorch/README.md

Lines changed: 4 additions & 4 deletions

@@ -3,7 +3,7 @@ Vision Transformer for image recognition, optimised for Graphcore's IPU. Based
 Run our ViT on Paperspace.
 <br>
-[![Gradient](https://assets.paperspace.io/img/gradient-badge.svg)](https://ipu.dev/3uTF5Uj)
+[![Gradient](https://assets.paperspace.io/img/gradient-badge.svg)](https://ipu.dev/3W2Ru39)

 | Framework | domain | Model | Datasets | Tasks| Training| Inference | Reference |
 |-------------|-|------|-------|-------|-------|---|-------|
@@ -29,13 +29,13 @@ If no path is provided, then follow these steps:
 1. Navigate to your Poplar SDK root directory

 2. Enable the Poplar SDK with:
-```bash
+```bash
 cd poplar-<OS version>-<SDK version>-<hash>
 . enable.sh
 ```

 3. Additionally, enable PopArt with:
-```bash
+```bash
 cd popart-<OS version>-<SDK version>-<hash>
 . enable.sh
 ```
@@ -132,7 +132,7 @@ python validation.py --config b16_imagenet1k_valid
 ```
 ### Employing automatic loss scaling (ALS) for half precision training

-ALS is a feature in the Poplar SDK which brings stability to training large models in half precision, especially when gradient accumulation and reduction across replicas also happen in half precision.
+ALS is a feature in the Poplar SDK which brings stability to training large models in half precision, especially when gradient accumulation and reduction across replicas also happen in half precision.

 NB. This feature expects the `poptorch` training option `accumulationAndReplicationReductionType` to be set to `poptorch.ReductionType.Mean`, and for accumulation by the optimizer to be done in half precision (using `accum_type=torch.float16` when instantiating the optimizer), or else it may lead to unexpected behaviour.
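The problem that loss scaling addresses can be shown with a small stdlib-only sketch (a generic illustration of half-precision underflow, not the Poplar SDK's ALS implementation; the gradient value and scale factor are made up for the demonstration, and ALS's contribution is choosing such a factor automatically):

```python
import struct

def to_fp16(x: float) -> float:
    # Round-trip a Python float through IEEE-754 half precision ('e' format),
    # mimicking storage of a gradient in float16.
    return struct.unpack('e', struct.pack('e', x))[0]

grad = 1e-8        # a gradient smaller than float16's smallest subnormal (~6e-8)
scale = 2.0 ** 16  # example loss scale factor

unscaled = to_fp16(grad)                 # underflows to 0.0: the gradient is lost
rescued = to_fp16(grad * scale) / scale  # survives the float16 round-trip

print(unscaled, rescued)
```

Scaling the loss before backpropagation shifts all gradients into float16's representable range; dividing by the same factor before the optimizer step (or using a mean reduction, as the NB above requires) recovers the original magnitudes.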