docs/installation/manual.md
16 additions & 16 deletions
@@ -69,34 +69,34 @@ The following commands vary depending on the version of Invoke being installed a
  - If you have an Nvidia 20xx series GPU or older, use `invokeai[xformers]`.
  - If you have an Nvidia 30xx series GPU or newer, or do not have an Nvidia GPU, use `invokeai`.

- 7. Determine the `PyPI` index URL to use for installation, if any. This is necessary to get the right version of torch installed.
+ 7. Determine the torch backend to use for installation, if any. This is necessary to get the right version of torch installed. This is achieved by using [UV's built-in torch support](https://docs.astral.sh/uv/guides/integration/pytorch/#automatic-backend-selection).

  === "Invoke v5.12 and later"

- - If you are on Windows or Linux with an Nvidia GPU, use `https://download.pytorch.org/whl/cu128`.
- - If you are on Linux with no GPU, use `https://download.pytorch.org/whl/cpu`.
- - If you are on Linux with an AMD GPU, use `https://download.pytorch.org/whl/rocm6.2.4`.
- - **In all other cases, do not use an index.**
+ - If you are on Windows or Linux with an Nvidia GPU, use `--torch-backend=cu128`.
+ - If you are on Linux with no GPU, use `--torch-backend=cpu`.
+ - If you are on Linux with an AMD GPU, use `--torch-backend=rocm6.3`.
+ - **In all other cases, do not use a torch backend.**

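To make the new v5.12+ instruction concrete: the chosen value is passed straight to uv at install time, and uv resolves the matching `torch` wheels. A minimal sketch, assuming a recent uv release that supports `--torch-backend`, an activated virtual environment, and a bare `invokeai` specifier for brevity (the manual's full install command appears in a later step):

```sh
# Windows/Linux with an Nvidia GPU, Invoke v5.12 or later: request CUDA 12.8 torch wheels
uv pip install invokeai --torch-backend=cu128

# Alternatively, let uv inspect the installed driver and pick the backend itself;
# this is the "automatic backend selection" described in the linked uv guide
uv pip install invokeai --torch-backend=auto
```

The explicit values in this tab (`cu128`, `cpu`, `rocm6.3`) simply pin that selection instead of relying on detection.
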
  === "Invoke v5.10.0 to v5.11.0"

- - If you are on Windows or Linux with an Nvidia GPU, use `https://download.pytorch.org/whl/cu126`.
- - If you are on Linux with no GPU, use `https://download.pytorch.org/whl/cpu`.
- - If you are on Linux with an AMD GPU, use `https://download.pytorch.org/whl/rocm6.2.4`.
+ - If you are on Windows or Linux with an Nvidia GPU, use `--torch-backend=cu126`.
+ - If you are on Linux with no GPU, use `--torch-backend=cpu`.
+ - If you are on Linux with an AMD GPU, use `--torch-backend=rocm6.2.4`.
  - **In all other cases, do not use an index.**
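
For comparison, the lines removed in these pre-5.12 blocks selected the torch build by pointing the installer at a PyTorch index URL rather than by naming a backend. A rough sketch of the two styles side by side for the v5.10.x range, with an illustrative version pin; the exact flag spelling is an assumption (`--extra-index-url` is the pip-compatible alternative), and the real manual's command may carry additional options:

```sh
# Old style: pull torch from the cu126 PyTorch index, used alongside PyPI
uv pip install "invokeai==5.10.0" --index https://download.pytorch.org/whl/cu126

# New style after this change: express the same intent as a torch backend
uv pip install "invokeai==5.10.0" --torch-backend=cu126
```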

  === "Invoke v5.0.0 to v5.9.1"

- - If you are on Windows with an Nvidia GPU, use `https://download.pytorch.org/whl/cu124`.
- - If you are on Linux with no GPU, use `https://download.pytorch.org/whl/cpu`.
- - If you are on Linux with an AMD GPU, use `https://download.pytorch.org/whl/rocm6.1`.
+ - If you are on Windows with an Nvidia GPU, use `--torch-backend=cu124`.
+ - If you are on Linux with no GPU, use `--torch-backend=cpu`.
+ - If you are on Linux with an AMD GPU, use `--torch-backend=rocm6.1`.
  - **In all other cases, do not use an index.**

  === "Invoke v4"

- - If you are on Windows with an Nvidia GPU, use `https://download.pytorch.org/whl/cu124`.
- - If you are on Linux with no GPU, use `https://download.pytorch.org/whl/cpu`.
- - If you are on Linux with an AMD GPU, use `https://download.pytorch.org/whl/rocm5.2`.
+ - If you are on Windows with an Nvidia GPU, use `--torch-backend=cu124`.
+ - If you are on Linux with no GPU, use `--torch-backend=cpu`.
+ - If you are on Linux with an AMD GPU, use `--torch-backend=rocm5.2`.
  - **In all other cases, do not use an index.**

  8. Install the `invokeai` package. Substitute the package specifier and version.

@@ -105,10 +105,10 @@ The following commands vary depending on the version of Invoke being installed a
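
The second hunk, whose body is not included in this extract, updates the concrete install commands for step 8 to pass the backend flag instead of an index URL. A minimal sketch of such a command under the new scheme, assuming Invoke v5.12 or later on Linux with an Nvidia GPU, a recent uv, and an illustrative version pin; the real manual may add further flags:

```sh
# Step 8, sketched: combine the package specifier from step 6 with the torch
# backend chosen in step 7 (use "invokeai[xformers]==..." for 20xx-series or older GPUs)
uv pip install "invokeai==5.12.0" --torch-backend=cu128
```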