|
1 | | -# Text generation web UI - Modified for macOS and Apple Silicon 2024-05-10 Edition |
| 1 | +# Text generation web UI - Modified for macOS and Apple Silicon 2024-09-15 Edition |
2 | 2 |
|
3 | 3 | ## This is the original oobabooga text generation webui modified to run on macOS |
4 | 4 |
|
5 | 5 | This is a dev release; the documentation is being reworked, and there will probably be changes before the final release.
6 | 6 |
|
7 | | -This is a development version and I have not added many changes I had planned. Please ||feel|| free to use at your own risk as there may be bugs not yet found. |
| 7 | +This is a development version, and it does not yet include many of the changes I had planned. Please *feel* free to use it at your own risk, as there may be bugs not yet found.
8 | 8 |
|
9 | 9 | Items added in this version:
10 | 10 | * Added ElevenLabs extension back |
@@ -41,30 +41,13 @@ There are CUDA issues to work out, and I'd like to find a better way around this |
41 | 41 |
|
42 | 42 | | Hardware | Memory | macOS Name | Version | |
43 | 43 | |----------------------------------|--------|------------|---------| |
44 | | -| MacBook Pro 16" M2 Max Processor | 96GB | Sonoma | 14.5 | |
| 44 | +| MacBook Pro 16" M2 Max Processor | 96GB | Sonoma | 14.6.1 | |
45 | 45 |
|
46 | | -- [Text generation web UI - Modified for macOS and Apple Silicon 2024-05-10 Edition](#text-generation-web-ui---modified-for-macos-and-apple-silicon-2024-05-10-edition) |
| 46 | +- [Text generation web UI - Modified for macOS and Apple Silicon 2024-09-15 Edition](#text-generation-web-ui---modified-for-macos-and-apple-silicon-2024-09-15-edition) |
47 | 47 | - [This is the original oobabooga text generation webui modified to run on macOS](#this-is-the-original-oobabooga-text-generation-webui-modified-to-run-on-macos) |
48 | 48 | - [All the features of the UI will run on macOS and have been tested on the following configurations, using only llama.cpp](#all-the-features-of-the-ui-will-run-on-macos-and-have-been-tested-on-the-following-configurations-using-only-llamacpp) |
49 | 49 | - [Features](#features) |
50 | | - - [Installation process](#installation-process) |
51 | | - - [Install Miniconda](#install-miniconda) |
52 | | - - [Download the miniconda installer](#download-the-miniconda-installer) |
53 | | - - [Startup Options](#startup-options) |
54 | | - - [Basic settings](#basic-settings) |
55 | | - - [Model loader](#model-loader) |
56 | | - - [Accelerate/transformers](#acceleratetransformers) |
57 | | - - [bitsandbytes 4-bit](#bitsandbytes-4-bit) |
58 | | - - [llama.cpp](#llamacpp) |
59 | | - - [ExLlamav2](#exllamav2) |
60 | | - - [AutoGPTQ](#autogptq) |
61 | | - - [GPTQ-for-LLaMa](#gptq-for-llama) |
62 | | - - [HQQ](#hqq) |
63 | | - - [DeepSpeed](#deepspeed) |
64 | | - - [RoPE (for llama.cpp, ExLlamaV2, and transformers)](#rope-for-llamacpp-exllamav2-and-transformers) |
65 | | - - [Gradio](#gradio) |
66 | | - - [API](#api) |
67 | | - - [Multimodal](#multimodal) |
| 50 | + - [Installation process Overview](#installation-process-overview) |
68 | 51 | - [Documentation](#documentation) |
69 | 52 | - [Downloading models](#downloading-models) |
70 | 53 | - [Contributing](#contributing) |
@@ -103,168 +86,8 @@ oobabooga's goal is to become the [AUTOMATIC1111/stable-diffusion-webui](https:/ |
103 | 86 |
|
104 | 87 | **Updated Installation Instructions** for libraries in the [oobabooga-macOS Quickstart](https://github.com/unixwzrd/oobabooga-macOS/blob/main/macOS_Apple_Silicon_QuickStart.m1) and the longer [Building Apple Silicon Support](https://github.com/unixwzrd/oobabooga-macOS/blob/main/macOS-Install.md) |
105 | 88 |
|
106 | | -```bash |
107 | | -#!/bin/bash |
108 | | -## These instructions assume you are using the Bash shell. I also suggest getting a copy
109 | | -## of iTerm2; it will make your life better, as it is much better than the default terminal
110 | | -## on macOS.
111 | | -## |
112 | | -## If you are using zsh, do this first. Do it even if you are already running bash;
113 | | -## it will not hurt anything.
114 | | - |
115 | | -## This will give you a login shell with bash. |
116 | | -exec bash -l |
117 | | - |
118 | | -cd "${HOME}" |
119 | | - |
120 | | -umask 022 |
121 | | - |
122 | | -### Choose a target directory for everything to be put into; I'm using "${HOME}/projects/ai-projects". You
123 | | -### may use whatever you wish. This must be exported because we will exec a new login shell later.
124 | | -export TARGET_DIR="${HOME}/projects/ai-projects" |
125 | | - |
126 | | -mkdir -p "${TARGET_DIR}" |
127 | | -cd "${TARGET_DIR}" |
128 | | - |
129 | | -# This will add to your PATH and DYLD_LIBRARY_PATH if they aren't already set up.
130 | | -# export PATH=${HOME}/local/bin:$PATH
131 | | -# export DYLD_LIBRARY_PATH=${HOME}/local/lib:$DYLD_LIBRARY_PATH |
132 | | - |
133 | | -### Be sure to add ${HOME}/local/bin to your path. **Add it to your .profile, .bashrc, etc.**
134 | | -export PATH=${HOME}/local/bin:${PATH} |
| 89 | + **Updated Long Version of oobabooga-macOS Installation Instructions** for libraries in the [Building Apple Silicon Support for oobabooga text-generation-webui](https://github.com/unixwzrd/oobabooga-macOS/blob/main/macOS-Install.md) |
135 | 90 |
|
136 | | -### The following sed script will add it permanently to your .bashrc if it's not already there.
137 | | -sed -i.bak ' |
138 | | - /export PATH=/ { |
139 | | - h; s|$|:${HOME}/local/bin| |
140 | | - } |
141 | | - ${ |
142 | | - x; /./ { x; q0 } |
143 | | - x; s|.*|export PATH=${HOME}/local/bin:\$PATH|; h |
144 | | - } |
145 | | - /export DYLD_LIBRARY_PATH=/ { |
146 | | - h; s|$|:${HOME}/local/lib| |
147 | | - } |
148 | | - ${ |
149 | | - x; /./ { x; q0 } |
150 | | - x; s|.*|export DYLD_LIBRARY_PATH=${HOME}/local/lib:\$DYLD_LIBRARY_PATH|; h
151 | | - } |
152 | | -' ~/.bashrc && source ~/.bashrc |
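If the sed incantation above feels opaque, the same idea can be sketched more simply. This is a hypothetical helper (the name `append_once` is mine, not part of the project): it appends a literal line to a shell rc file only when that exact line is not already present.

```bash
#!/bin/bash
# append_once LINE FILE: append LINE to FILE only if that exact line is
# not already there (grep -qxF = quiet, whole-line, fixed-string match).
append_once() {
    line="$1"; file="$2"
    grep -qxF "$line" "$file" 2>/dev/null || printf '%s\n' "$line" >> "$file"
}

# Example usage against your shell startup file:
# append_once 'export PATH=${HOME}/local/bin:$PATH' "${HOME}/.bashrc"
# append_once 'export DYLD_LIBRARY_PATH=${HOME}/local/lib:$DYLD_LIBRARY_PATH' "${HOME}/.bashrc"
```

Because the match is on the exact line, running it twice never duplicates the entry, which is the behavior the sed script above is trying to achieve.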
153 | | - |
154 | | -## Install Miniconda |
155 | | - |
156 | | -### Download the miniconda installer |
157 | | -curl https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-arm64.sh -o miniconda.sh |
158 | | - |
159 | | -### Run the installer in non-destructive mode in order to preserve any existing installation. |
160 | | -sh miniconda.sh -b -u |
161 | | -. "${HOME}/miniconda3/bin/activate" |
162 | | - |
163 | | -conda init $(basename "${SHELL}") |
164 | | -conda update -n base -c defaults conda -y |
165 | | - |
166 | | -#### Get a new login shell now that conda has been added to your shell profile.
167 | | -exec bash -l |
168 | | - |
169 | | -umask 022 |
170 | | - |
171 | | -#### Just in case your login startup scripts do something like change to another directory,
172 | | -#### get back into the target directory for the build.
173 | | -cd "${TARGET_DIR}" |
174 | | - |
175 | | -#### Set the name of the VENV to whatever you wish. It is used later, when the procedure
176 | | -#### creates a startup script that sources the Conda environment and activates the VENV named here.
177 | | -export MACOS_LLAMA_ENV="macOS-llama-env" |
178 | | - |
179 | | -#### Create the base Python 3.10 and the llama-env VENV. |
180 | | -conda create -n ${MACOS_LLAMA_ENV} python=3.10 -y |
181 | | -conda activate ${MACOS_LLAMA_ENV} |
182 | | - |
183 | | -## Build and install CMake |
184 | | - |
185 | | -### Clone the CMake repository, build, and install CMake |
186 | | -git clone https://github.com/Kitware/CMake.git |
187 | | -cd CMake |
188 | | -git checkout tags/v3.29.3 |
189 | | -mkdir build |
190 | | -cd build |
191 | | - |
192 | | -### This will configure the installation of cmake to be in your home directory under local, rather than /usr/local |
193 | | -../bootstrap --prefix=${HOME}/local |
194 | | -make -j |
195 | | -make -j test |
196 | | -make install |
197 | | - |
198 | | -### Verify the installation |
199 | | -which cmake # Should print $HOME/local/bin/cmake
200 | | -### Verify you are running cmake v3.29.3
201 | | -cmake --version |
202 | | -cd "${TARGET_DIR}" |
203 | | - |
204 | | - |
205 | | -## Clone my oobabooga fork and check out the main branch
206 | | -git clone https://github.com/unixwzrd/text-generation-webui-macos.git textgen-macOS |
207 | | -cd textgen-macOS |
208 | | -git checkout main |
209 | | -pip install -r requirements.txt |
210 | | - |
211 | | -## llamacpp-python |
212 | | -export CMAKE_ARGS="-DLLAMA_METAL=on" |
213 | | -export FORCE_CMAKE=1 |
214 | | -export PATH=${HOME}/local/bin:$PATH # Ensure the cmake built above is used
215 | | -pip install llama-cpp-python --force-reinstall --no-cache --no-binary :all: --compile --no-deps --no-build-isolation |
216 | | - |
217 | | -## Pip install from daily build |
218 | | -pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cpu --force-reinstall --no-deps |
219 | | - |
220 | | -## NumPy Rebuild with Pip |
221 | | -export CFLAGS="-I/System/Library/Frameworks/vecLib.framework/Headers -Wl,-framework -Wl,Accelerate -framework Accelerate" |
222 | | -pip install numpy==1.26.* --force-reinstall --no-deps --no-cache --no-binary :all: --no-build-isolation --compile -Csetup-args=-Dblas=accelerate -Csetup-args=-Dlapack=accelerate -Csetup-args=-Duse-ilp64=true |
223 | | - |
224 | | -## CTransformers |
225 | | -export CFLAGS="-I/System/Library/Frameworks/vecLib.framework/Headers -Wl,-framework -Wl,Accelerate -framework Accelerate" |
226 | | -export CT_METAL=1 |
227 | | -pip install ctransformers --no-binary :all: --no-deps --no-build-isolation --compile --force-reinstall |
228 | | - |
229 | | -### Unset all the stuff we set while building. |
230 | | -unset CMAKE_ARGS FORCE_CMAKE CFLAGS CT_METAL |
231 | | - |
232 | | - |
233 | | -## This will create a startup script which should be clickable in the Finder.
234 | | - |
235 | | -### Set the startup options you wish to use |
236 | | - |
237 | | -# Add any startup options you wish to use here:
238 | | -START_OPTIONS= |
239 | | -#START_OPTIONS="--verbose " |
240 | | -#START_OPTIONS="--verbose --listen" |
241 | | - |
242 | | -cat <<_EOT_ > start-webui.sh |
243 | | -#!/bin/bash |
244 | | -
|
245 | | -# >>> conda initialize >>> |
246 | | -__conda_setup="\$('${HOME}/miniconda3/bin/conda' 'shell.bash' 'hook' 2> /dev/null)"
247 | | -if [ \$? -eq 0 ]; then
248 | | - eval "\$__conda_setup"
249 | | -else
250 | | - if [ -f "${HOME}/miniconda3/etc/profile.d/conda.sh" ]; then
251 | | - . "${HOME}/miniconda3/etc/profile.d/conda.sh"
252 | | - else
253 | | - export PATH="${HOME}/miniconda3/bin:\$PATH"
254 | | - fi |
255 | | -fi |
256 | | -unset __conda_setup |
257 | | -# <<< conda initialize <<< |
258 | | -
|
259 | | -cd "${TARGET_DIR}/textgen-macOS" |
260 | | -
|
261 | | -conda activate ${MACOS_LLAMA_ENV} |
262 | | -
|
263 | | -python server.py ${START_OPTIONS} |
264 | | -_EOT_ |
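One subtlety in the heredoc above: with an unquoted delimiter (`<<_EOT_`), the shell expands `$VAR` and `$(...)` while *generating* the script, so anything that must run later (such as the conda hook) needs a backslash escape, while `${TARGET_DIR}` and `${MACOS_LLAMA_ENV}` are deliberately left unescaped so their values are baked in. A minimal self-contained illustration:

```bash
#!/bin/bash
# With an unquoted heredoc delimiter, $NAME expands at generation time,
# while an escaped \$NAME is written out literally for the generated script.
NAME="expanded-now"
cat <<EOF
$NAME
\$NAME
EOF
```

The first line of output is the current value of `NAME`; the second is the literal text `$NAME`, ready to be expanded whenever the generated script eventually runs.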
265 | | - |
266 | | - |
267 | | -chmod +x start-webui.sh``` |
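After the steps above finish, a quick sanity check can confirm the key Python packages actually import in the active environment. This is a hypothetical helper (the name `check_imports` is mine, and it assumes `python3` resolves to the interpreter in your activated VENV):

```bash
#!/bin/bash
# check_imports MODULE...: report which Python modules import cleanly
# using the python3 first on PATH (your activated conda VENV).
check_imports() {
    for pkg in "$@"; do
        if python3 -c "import ${pkg}" 2>/dev/null; then
            echo "${pkg}: ok"
        else
            echo "${pkg}: MISSING"
        fi
    done
}

# Against the packages installed above:
# check_imports llama_cpp torch numpy ctransformers
```

Any line reporting `MISSING` points at the corresponding pip step above to revisit.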
268 | 91 | <details> |
269 | 92 | <summary> |
270 | 93 | <b>List of command-line flags</b> |
|