Commit 9eb7875
Fix freezing modules in Ghost Clipping (#729)
Summary:
Freezing modules with ghost clipping throws an error because the corresponding per-sample norms are not calculated. Fix: keep in memory the list of all parameters and check whether the corresponding `requires_grad` is True when calculating norms.
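A minimal sketch of the norm-side fix, with hypothetical helper names (not the actual Opacus API): keep the full parameter list captured at creation time, and skip frozen parameters at norm-computation time by checking `requires_grad`.

```python
import torch
from torch import nn

def per_sample_norms(all_params, per_sample_grads):
    """Compute per-sample L2 norms, skipping frozen parameters.

    `all_params` is the full parameter list captured once at model-creation
    time; `per_sample_grads` holds one (batch, *param_shape) tensor per
    parameter. Hypothetical helper, not the Opacus implementation.
    """
    norms = []
    for p, g in zip(all_params, per_sample_grads):
        if not p.requires_grad:
            continue  # frozen parameter: no per-sample norm is computed
        norms.append(g.flatten(start_dim=1).norm(2, dim=1))
    return norms
```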
Further, unfreezing modules (with and without ghost clipping) wasn't supported because the hooks aren't present for the corresponding modules. Fix: rewrite `requires_grad_` to add the hook.
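The unfreezing fix can be sketched as wrapping `requires_grad_` so that re-enabling gradients also registers the missing hooks. Here `add_hooks` is an assumed callback standing in for the grad-sample hook registration, not an Opacus function:

```python
import torch
from torch import nn

def patch_requires_grad_(module, add_hooks):
    # Sketch: shadow `requires_grad_` on this instance so that unfreezing
    # (requires_grad=True) re-registers hooks that were never attached
    # while the module was frozen. `add_hooks` is an assumed callback.
    original = module.requires_grad_  # bound method, captured before shadowing

    def requires_grad_(requires_grad: bool = True):
        original(requires_grad)
        if requires_grad:
            add_hooks(module)
        return module

    module.requires_grad_ = requires_grad_
    return module
```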
Facebook
We initially used `trainable_parameters(module)` to traverse the list of trainable modules upon norm computation. It was slow because `trainable_parameters(module)` is a generator that traverses the neural network graph on every call.
We replaced it with a list of trainable parameters fixed at model-creation time. This is what led to issues with freezing modules, as this list is not updated.
Fix: use a **list of all parameters** -- not a generator, so no traversal happens. Further, we check `requires_grad` when calculating the per-sample norm to determine whether to compute it. This is the same check the (non-private) [optimizer](https://github.com/pytorch/pytorch/blob/5725462cd8679dd1dea8a469b1bf2e71f226b664/torch/optim/optimizer.py#L963) uses to determine which parameters are frozen.
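To illustrate the generator-vs-list distinction on a toy model (not Opacus code): the list is materialized once, and the `requires_grad` check moves to use time, so freezing a layer after creation is still picked up.

```python
import torch
from torch import nn

# Toy two-layer model, not Opacus code.
model = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 2))

def trainable_parameters(module):
    # Generator version: every call re-traverses the module graph (slow path).
    return (p for p in module.parameters() if p.requires_grad)

# Fixed list of ALL parameters, captured once at creation time.  The
# requires_grad check happens at use time, so freezing a module later
# is still reflected correctly.
all_params = list(model.parameters())
model[0].requires_grad_(False)  # freeze the first layer after creation
trainable_now = [p for p in all_params if p.requires_grad]
```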
Differential Revision: D686564591
File tree: 2 files changed in opacus/grad_sample (+21, -2 lines)
Lines changed: 7 additions & 2 deletions