From ea27a775b8e149892c11b5f71ba2aec8a4543bf2 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E8=AE=B8=E6=B1=9F=E6=B3=BD?=
Date: Mon, 25 Mar 2024 13:47:54 +0800
Subject: [PATCH 1/6] =?UTF-8?q?feat:=20=E8=BF=BD=E5=8A=A0miniconda?=
 =?UTF-8?q?=E5=AE=89=E8=A3=85=E6=96=87=E6=A1=A3?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 .gitignore                                    |  1 +
 ...\243\205\347\256\241\347\220\206python.md" | 70 +++++++++++++++++++
 2 files changed, 71 insertions(+)
 create mode 100644 "docs/FAQ/miniconda\345\256\211\350\243\205\347\256\241\347\220\206python.md"

diff --git a/.gitignore b/.gitignore
index 9334f368..1b52a651 100644
--- a/.gitignore
+++ b/.gitignore
@@ -2,6 +2,7 @@
 __pycache__/
 *.py[cod]
 *$py.class
+.idea
 
 # C extensions
 *.so

diff --git "a/docs/FAQ/miniconda\345\256\211\350\243\205\347\256\241\347\220\206python.md" "b/docs/FAQ/miniconda\345\256\211\350\243\205\347\256\241\347\220\206python.md"
new file mode 100644
index 00000000..18338f1a
--- /dev/null
+++ "b/docs/FAQ/miniconda\345\256\211\350\243\205\347\256\241\347\220\206python.md"
@@ -0,0 +1,70 @@
+# 说明
+本文介绍如何安装miniconda
+并且基于miniconda安装python环境
+
+# 官方介绍
+https://docs.anaconda.com/free/miniconda/
+## windows安装
+```bash
+curl https://repo.anaconda.com/miniconda/Miniconda3-latest-Windows-x86_64.exe -o miniconda.exe
+start /wait "" miniconda.exe /S
+del miniconda.exe
+```
+## mac安装
+```bash
+mkdir -p ~/miniconda3
+curl https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-arm64.sh -o ~/miniconda3/miniconda.sh
+bash ~/miniconda3/miniconda.sh -b -u -p ~/miniconda3
+rm -rf ~/miniconda3/miniconda.sh
+
+```
+安装完毕之后, 以下命令针对 bash 和 zsh shell 进行初始化
+```bash
+~/miniconda3/bin/conda init bash
+~/miniconda3/bin/conda init zsh
+```
+## linux安装
+```bash
+mkdir -p ~/miniconda3
+wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ~/miniconda3/miniconda.sh
+bash ~/miniconda3/miniconda.sh -b -u -p ~/miniconda3
+rm -rf ~/miniconda3/miniconda.sh
+```
+安装完毕之后, 以下命令针对 bash 和 zsh shell 进行初始化
+```bash
+~/miniconda3/bin/conda init bash
+~/miniconda3/bin/conda init zsh
+```
+
+
+# 使用MiniConda管理python环境
+- 安装python3.10.13
+假设我需要有一个环境叫myenv(你也可以叫其他名字), 并且指定python版本为3.10.13
+```bash
+conda create --name myenv python=3.10.13
+```
+
+- 激活环境, 并安装依赖文件requirements.txt
+```bash
+conda activate myenv
+pip install -r requirements.txt
+```
+ps: 如果遇到下载超时或者失败, 更换源
+```bash
+pip install -r requirements.txt -i https://pypi.tuna.tsinghua.edu.cn/simple
+```
+
+- 解除激活环境
+```bash
+conda deactivate
+```
+
+- 删除环境
+```bash
+conda remove --name myenv --all
+```
+
+
+
+
+

From 74b0541e6a26aaf78a36161537258fad891017f8 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E8=AE=B8=E6=B1=9F=E6=B3=BD?=
Date: Mon, 25 Mar 2024 13:48:28 +0800
Subject: [PATCH 2/6] =?UTF-8?q?feat:=20=E8=BF=BD=E5=8A=A0readme.md?=
 =?UTF-8?q?=E6=95=99=E7=A8=8B?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 openai-translator/README-CN.md     | 55 +++++++++++++---------
 openai-translator/README.md        | 59 +++++++++++++++---------------
 openai-translator/requirements.txt | 45 ++++++++++++++++++-----
 3 files changed, 91 insertions(+), 68 deletions(-)

diff --git a/openai-translator/README-CN.md b/openai-translator/README-CN.md
index 00d890c1..693c2559 100644
--- a/openai-translator/README-CN.md
+++ b/openai-translator/README-CN.md
@@ -44,24 +44,37 @@ OpenAI 翻译器目前还处于早期开发阶段,我正在积极地添加更
 
 ### 环境准备
 
-1.克隆仓库 `git clone git@github.com:DjangoPeng/openai-translator.git`。
-
-2.OpenAI-翻译器 需要 Python 3.6 或更高版本。使用 `pip install -r requirements.txt` 安装依赖项。
-
-3.设置您的 OpenAI API 密钥(`$OPENAI_API_KEY`)或 ChatGLM 模型 URL(`$GLM_MODEL_URL`)。您可以将其添加到环境变量中,或者在 config.yaml 文件中指定。
+1. 克隆仓库代码
+```bash
+git clone git@github.com:DjangoPeng/openai-translator.git
+```
 
-### 使用示例
+2. 准备 python 环境
+- 版本要求 **python >= 3.10.13**
+  - 无独立环境请移步 [miniconda安装管理python](../docs/FAQ/miniconda%E5%AE%89%E8%A3%85%E7%AE%A1%E7%90%86python.md)
+- 安装依赖(确保你已经使用miniConda创建了python环境myenv, 并且激活了myenv环境)
+```bash
+pip install -r requirements.txt
+```
 
-您可以通过指定配置文件或提供命令行参数来使用 OpenAI-翻译器。
+3. 启动翻译程序
 
-#### 使用配置文件
+下列启动方式多选一即可
+- 命令行方式启动(推荐), 使用OpenAI模型
+```bash
+# 把您的 OPENAI_API_KEY 替换为你具体的API_KEY
+export OPENAI_API_KEY="sk-xxx"
+python ai_translator/main.py --model_type OpenAIModel --openai_api_key $OPENAI_API_KEY --file_format markdown --book tests/test.pdf --openai_model gpt-3.5-turbo
+```
+即可看到结果
+![sample_out](images/sample_image_1.png)
 
+- yaml配置文件方式启动, 使用OpenAI模型
 根据您的设置调整 `config.yaml` 文件:
-
 ```yaml
 OpenAIModel:
     model: "gpt-3.5-turbo"
-    api_key: "your_openai_api_key"
+    api_key: "sk-xxx"
 
 GLMModel:
     model_url: "your_chatglm_model_url"
@@ -71,29 +84,13 @@ common:
     book: "test/test.pdf"
     file_format: "markdown"
 ```
-
-然后命令行直接运行:
-
-```bash
-python ai_translator/main.py
-```
-
-![sample_out](images/sample_image_1.png)
-
-#### 使用命令行参数
-
-您也可以直接在命令行上指定设置。这是使用 OpenAI 模型的例子:
-
+执行命令
 ```bash
-# 将您的 api_key 设置为环境变量
-export OPENAI_API_KEY="sk-xxx"
 python ai_translator/main.py --model_type OpenAIModel --openai_api_key $OPENAI_API_KEY --file_format markdown --book tests/test.pdf --openai_model gpt-3.5-turbo
 ```
 
-这是使用 GLM 模型的例子:
-
+- 命令行方式启动, 使用GLM模型
 ```bash
-# 将您的 GLM 模型 URL 设置为环境变量
 export GLM_MODEL_URL="http://xxx:xx"
 python ai_translator/main.py --model_type GLMModel --glm_model_url $GLM_MODEL_URL --book tests/test.pdf
 ```

diff --git a/openai-translator/README.md b/openai-translator/README.md
index 11a42c20..a41b7dc9 100644
--- a/openai-translator/README.md
+++ b/openai-translator/README.md
@@ -43,26 +43,38 @@ The OpenAI Translator is still in its early stages of development, and I'm activ
 
 ## Getting Started
 
-### Environment Setup
+### Quick Start
 
-1.Clone the repository `git clone git@github.com:DjangoPeng/openai-translator.git`.
-
-2.The `OpenAI-Translator` requires Python 3.6 or later. Install the dependencies with `pip install -r requirements.txt`.
-
-3.Set up your OpenAI API key(`$OPENAI_API_KEY`) or ChatGLM Model URL(`$GLM_MODEL_URL`). You can either add it to your environment variables or specify it in the config.yaml file.
-
-### Usage
+1. Clone the repository:
+```bash
+git clone git@github.com:DjangoPeng/openai-translator.git
+```
 
-You can use OpenAI-Translator either by specifying a configuration file or by providing command-line arguments.
+2. Prepare the Python environment:
+- Required Python version: python >= 3.10.13
+  - If you don’t have a separate environment, please refer to [Install and Manage Python with Miniconda](../docs/FAQ/miniconda%E5%AE%89%E8%A3%85%E7%AE%A1%E7%90%86python.md)
+- Install dependencies (make sure you have created a Python environment named myenv with Miniconda and activated it)
+```bash
+pip install -r requirements.txt
+```
 
-#### Using a configuration file:
+3. Start the translation program
+Choose one of the following methods to start:
 
-Adapt `config.yaml` file with your settings:
+- Command-line startup (recommended), using the OpenAI model:
+```bash
+# Replace 'sk-xxx' with your actual OPENAI_API_KEY
+export OPENAI_API_KEY="sk-xxx"
+python ai_translator/main.py --model_type OpenAIModel --openai_api_key $OPENAI_API_KEY --file_format markdown --book tests/test.pdf --openai_model gpt-3.5-turbo
+```
+You will see the result:
+![sample_out](images/sample_image_1.png)
+- YAML configuration file startup, using the OpenAI model. Adjust the `config.yaml` file according to your settings:
 
 ```yaml
 OpenAIModel:
     model: "gpt-3.5-turbo"
-    api_key: "your_openai_api_key"
+    api_key: "sk-xxx"
 
 GLMModel:
     model_url: "your_chatglm_model_url"
@@ -73,31 +85,18 @@ common:
     file_format: "markdown"
 ```
 
-Then run the tool:
-
+Execute the command:
 ```bash
-python ai_translator/main.py
+python ai_translator/main.py --config config.yaml --model_type OpenAIModel
 ```
 
-![sample_out](images/sample_image_1.png)
-
-#### Using command-line arguments:
-
-You can also specify the settings directly on the command line. Here's an example of how to use the OpenAI model:
-
+- Command-line startup, using the GLM model:
 ```bash
-# Set your api_key as an env variable
-export OPENAI_API_KEY="sk-xxx"
-python ai_translator/main.py --model_type OpenAIModel --openai_api_key $OPENAI_API_KEY --file_format markdown --book tests/test.pdf --openai_model gpt-3.5-turbo
+export GLM_MODEL_URL="http://xxx:xx"
+python ai_translator/main.py --model_type GLMModel --glm_model_url $GLM_MODEL_URL --book tests/test.pdf
 ```
 
-And an example of how to use the GLM model:
-```bash
-# Set your GLM Model URL as an env variable
-export GLM_MODEL_URL="http://xxx:xx"
-python ai_translator/main.py --model_type GLMModel --glm_model_url $GLM_MODEL_URL --book tests/test.pdf
-```
 
 
 ## License
 

diff --git a/openai-translator/requirements.txt b/openai-translator/requirements.txt
index 3ad8bd4c..0016508a 100644
--- a/openai-translator/requirements.txt
+++ b/openai-translator/requirements.txt
@@ -1,9 +1,36 @@
-pdfplumber
-simplejson
-requests
-PyYAML
-pillow
-reportlab
-pandas
-loguru
-openai
\ No newline at end of file
+annotated-types==0.6.0
+anyio==4.3.0
+certifi==2024.2.2
+cffi==1.16.0
+chardet==5.2.0
+charset-normalizer==3.3.2
+cryptography==42.0.5
+distro==1.9.0
+exceptiongroup==1.2.0
+h11==0.14.0
+httpcore==1.0.4
+httpx==0.27.0
+idna==3.6
+loguru==0.7.2
+numpy==1.26.4
+openai==1.14.2
+pandas==2.2.1
+pdfminer.six==20231228
+pdfplumber==0.11.0
+pillow==10.2.0
+pycparser==2.21
+pydantic==2.6.4
+pydantic_core==2.16.3
+pypdfium2==4.28.0
+python-dateutil==2.9.0.post0
+pytz==2024.1
+PyYAML==6.0.1
+reportlab==4.1.0
+requests==2.31.0
+simplejson==3.19.2
+six==1.16.0
+sniffio==1.3.1
+tqdm==4.66.2
+typing_extensions==4.10.0
+tzdata==2024.1
+urllib3==2.2.1
\ No newline at end of file
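Patch 2 pins exact dependency versions and assumes the `myenv` interpreter from the Miniconda guide in patch 1. As a quick way to confirm the environment before moving on, a minimal sanity check (not part of the patches, and assuming `myenv` is activated with `requirements.txt` installed) might look like this:

```python
# Illustrative post-install sanity check; not part of the patch series.
# Assumes the activated myenv environment with requirements.txt installed.
import sys

import openai      # pinned to 1.14.2 in requirements.txt
import pdfplumber  # pinned to 0.11.0
import yaml        # provided by the pinned PyYAML package

assert sys.version_info >= (3, 10), "activate the myenv (python 3.10.13) environment first"
print("python:", sys.version.split()[0])
print("openai:", openai.__version__)
```
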
From fd8b27ba0a5d929b6cb0f2682471bcc52f65b730 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E8=AE=B8=E6=B1=9F=E6=B3=BD?=
Date: Mon, 25 Mar 2024 13:48:58 +0800
Subject: [PATCH 3/6] =?UTF-8?q?hotfix:=20=E8=BF=BD=E5=8A=A0=E4=BF=AE?=
 =?UTF-8?q?=E5=A4=8D=E6=89=A7=E8=A1=8C=E8=BF=87=E7=A8=8B=E4=B8=AD=E5=91=BD?=
 =?UTF-8?q?=E4=BB=A4=E8=A1=8C=E5=8F=82=E6=95=B0=E4=B8=8D=E5=A5=BD=E7=94=A8?=
 =?UTF-8?q?=E9=97=AE=E9=A2=98?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 openai-translator/ai_translator/main.py                  | 2 ++
 openai-translator/ai_translator/utils/argument_parser.py | 2 --
 openai-translator/config.yaml                            | 4 ++--
 3 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/openai-translator/ai_translator/main.py b/openai-translator/ai_translator/main.py
index 6b8e0c9b..7b02699b 100644
--- a/openai-translator/ai_translator/main.py
+++ b/openai-translator/ai_translator/main.py
@@ -18,6 +18,8 @@
     api_key = args.openai_api_key if args.openai_api_key else config['OpenAIModel']['api_key']
     model = OpenAIModel(model=model_name, api_key=api_key)
+    if args.model_type == 'OpenAIModel' and model_name and not api_key:
+        raise Exception("--openai_model and --openai_api_key is required when using OpenAIModel")
 
     pdf_file_path = args.book if args.book else config['common']['book']
     file_format = args.file_format if args.file_format else config['common']['file_format']
 

diff --git a/openai-translator/ai_translator/utils/argument_parser.py b/openai-translator/ai_translator/utils/argument_parser.py
index 95681dc1..ed1932cf 100644
--- a/openai-translator/ai_translator/utils/argument_parser.py
+++ b/openai-translator/ai_translator/utils/argument_parser.py
@@ -14,6 +14,4 @@ def __init__(self):
 
     def parse_arguments(self):
         args = self.parser.parse_args()
-        if args.model_type == 'OpenAIModel' and not args.openai_model and not args.openai_api_key:
-            self.parser.error("--openai_model and --openai_api_key is required when using OpenAIModel")
         return args

diff --git a/openai-translator/config.yaml b/openai-translator/config.yaml
index 2b8bc837..2114618d 100644
--- a/openai-translator/config.yaml
+++ b/openai-translator/config.yaml
@@ -1,9 +1,9 @@
 OpenAIModel:
     model: "gpt-3.5-turbo"
-    api_key: "your_openai_api_key"
+    api_key: "sk-xxx"
 
 GLMModel:
-    model_url: "your_chatglm_model_url"
+    model_url: "your_chatglm_model_url like http://xxx:xx"
     timeout: 300
 
 common:
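The hotfix above moves the OpenAI argument check out of `argument_parser.py` and into `main.py`, after the command-line values have been merged with the `config.yaml` defaults. A minimal sketch of that merge-then-validate idea (not the project's actual code; the function name and dictionary shapes are assumed for illustration) might look like this:

```python
# Sketch of the merge-then-validate pattern behind the hotfix; illustrative only.
def resolve_openai_settings(args: dict, config: dict) -> tuple[str, str]:
    """args holds CLI values (None when a flag was omitted); config mirrors config.yaml."""
    model_name = args.get("openai_model") or config["OpenAIModel"]["model"]
    api_key = args.get("openai_api_key") or config["OpenAIModel"]["api_key"]
    # Validating after the merge accepts a key supplied only in config.yaml,
    # which the old pre-merge check in argument_parser.py used to reject.
    if args.get("model_type") == "OpenAIModel" and not api_key:
        raise ValueError("an OpenAI API key is required: pass --openai_api_key or set it in config.yaml")
    return model_name, api_key


if __name__ == "__main__":
    sample_config = {"OpenAIModel": {"model": "gpt-3.5-turbo", "api_key": "sk-xxx"}}
    print(resolve_openai_settings({"model_type": "OpenAIModel", "openai_api_key": None}, sample_config))
```
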
From fbd9b2c84cce2670528ee7961bdcdd37e06eec74 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E8=AE=B8=E6=B1=9F=E6=B3=BD?=
Date: Mon, 25 Mar 2024 13:57:41 +0800
Subject: [PATCH 4/6] =?UTF-8?q?feat:=20=E8=BF=BD=E5=8A=A0FAQ?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 ...\227\256openAI\346\216\245\345\217\243.md" | 30 +++++++++++++++++++
 1 file changed, 30 insertions(+)
 create mode 100644 "docs/FAQ/\345\246\202\344\275\225\350\256\277\351\227\256openAI\346\216\245\345\217\243.md"

diff --git "a/docs/FAQ/\345\246\202\344\275\225\350\256\277\351\227\256openAI\346\216\245\345\217\243.md" "b/docs/FAQ/\345\246\202\344\275\225\350\256\277\351\227\256openAI\346\216\245\345\217\243.md"
new file mode 100644
index 00000000..1110766f
--- /dev/null
+++ "b/docs/FAQ/\345\246\202\344\275\225\350\256\277\351\227\256openAI\346\216\245\345\217\243.md"
@@ -0,0 +1,30 @@
+# 如何访问openAI接口
+
+## 解法1. 使用国内代理api
+使用国内代理 api, 将域名 api.openai.com 替换为 api.openai-proxy.com 即可.
+风险提示: 由于是第三方代理, 只建议作为测试学习用, 不要作为生产用, 以防 key 被盗用.
+```python
+client = OpenAI(
+    base_url='https://api.openai-proxy.com/v1'
+)
+```
+
+## 解法2. 自购代理, 使用socks5代理
+适用于已经购买了 socks5 代理的情况.
+如果代理已设置为全局代理, 可以直接使用; 否则需要查看代理的本地端口, 例如 18080.
+
+先安装库
+```bash
+pip install PySocks
+```
+
+在代码开头设置socks5代理
+```python
+import socket
+import socks
+socks.set_default_proxy(socks.SOCKS5, "127.0.0.1", 18080)
+socket.socket = socks.socksocket
+```
+
+
+
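The 解法1 snippet in the FAQ above omits the import and the API key for brevity. A minimal, self-contained sketch (not part of the patch itself, and assuming the `openai==1.14.2` SDK pinned in patch 2 plus an `OPENAI_API_KEY` environment variable) might look like this; the proxy base URL is the one from the FAQ:

```python
# Complete version of the 解法1 snippet; illustrative only.
# Assumes: openai==1.14.2 and OPENAI_API_KEY set in the environment.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],        # never hard-code the key
    base_url="https://api.openai-proxy.com/v1",  # third-party proxy: test/learning use only
)

resp = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```

For 解法2, an alternative to patching `socket` globally is to hand the SDK a SOCKS-capable `httpx` client via its `http_client` parameter, which keeps the proxy scoped to the OpenAI client only; note that this needs the `httpx[socks]` extra, which is not in the pinned requirements.
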
From 734594501cb2a41f6e99d589194114f1a1e707fa Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E8=AE=B8=E6=B1=9F=E6=B3=BD?=
Date: Mon, 25 Mar 2024 14:02:10 +0800
Subject: [PATCH 5/6] =?UTF-8?q?feat:=20=E8=BF=BD=E5=8A=A0=E9=83=A8?=
 =?UTF-8?q?=E5=88=86readme.md?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 ...25\350\256\277\351\227\256openAI\346\216\245\345\217\243.md" | 2 ++
 1 file changed, 2 insertions(+)

diff --git "a/docs/FAQ/\345\246\202\344\275\225\350\256\277\351\227\256openAI\346\216\245\345\217\243.md" "b/docs/FAQ/\345\246\202\344\275\225\350\256\277\351\227\256openAI\346\216\245\345\217\243.md"
index 1110766f..566231de 100644
--- "a/docs/FAQ/\345\246\202\344\275\225\350\256\277\351\227\256openAI\346\216\245\345\217\243.md"
+++ "b/docs/FAQ/\345\246\202\344\275\225\350\256\277\351\227\256openAI\346\216\245\345\217\243.md"
@@ -1,8 +1,10 @@
 # 如何访问openAI接口
+本文仅供学习使用. 商用请使用正规代理商提供的服务
 
 ## 解法1. 使用国内代理api
 使用国内代理 api, 将域名 api.openai.com 替换为 api.openai-proxy.com 即可.
 风险提示: 由于是第三方代理, 只建议作为测试学习用, 不要作为生产用, 以防 key 被盗用.
+
 ```python
 client = OpenAI(
     base_url='https://api.openai-proxy.com/v1'

From db42f488bd0d87a279984714ee8235b49a963410 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E8=AE=B8=E6=B1=9F=E6=B3=BD?=
Date: Tue, 26 Mar 2024 11:02:11 +0800
Subject: [PATCH 6/6] =?UTF-8?q?feat:=20=E8=BF=BD=E5=8A=A0=E9=83=A8?=
 =?UTF-8?q?=E5=88=86readme.md?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 openai-translator/README-CN.md | 5 +++++
 openai-translator/README.md    | 2 ++
 2 files changed, 7 insertions(+)

diff --git a/openai-translator/README-CN.md b/openai-translator/README-CN.md
index 693c2559..c88b4e0c 100644
--- a/openai-translator/README-CN.md
+++ b/openai-translator/README-CN.md
@@ -66,6 +66,9 @@ pip install -r requirements.txt
 export OPENAI_API_KEY="sk-xxx"
 python ai_translator/main.py --model_type OpenAIModel --openai_api_key $OPENAI_API_KEY --file_format markdown --book tests/test.pdf --openai_model gpt-3.5-turbo
 ```
+
+ps: windows 系统请使用 `set` 命令替换 `export` 命令
+
 即可看到结果
 ![sample_out](images/sample_image_1.png)
 
@@ -95,6 +98,8 @@ python ai_translator/main.py --config config.yaml --model_type OpenAIModel
 ```
 
 - 命令行方式启动, 使用GLM模型
 ```bash
 export GLM_MODEL_URL="http://xxx:xx"
 python ai_translator/main.py --model_type GLMModel --glm_model_url $GLM_MODEL_URL --book tests/test.pdf
 ```
+ps: windows 系统请使用 `set` 命令替换 `export` 命令
+
 ## 许可证
 
 该项目采用 GPL-3.0 许可证。有关详细信息,请查看 [LICENSE](LICENSE) 文件。

diff --git a/openai-translator/README.md b/openai-translator/README.md
index a41b7dc9..dbacc9d0 100644
--- a/openai-translator/README.md
+++ b/openai-translator/README.md
@@ -62,6 +62,7 @@ pip install -r requirements.txt
 Choose one of the following methods to start:
 
 - Command-line startup (recommended), using the OpenAI model:
+ps: On Windows, use the `set` command instead of the `export` command
 ```bash
 # Replace 'sk-xxx' with your actual OPENAI_API_KEY
 export OPENAI_API_KEY="sk-xxx"
 python ai_translator/main.py --model_type OpenAIModel --openai_api_key $OPENAI_API_KEY --file_format markdown --book tests/test.pdf --openai_model gpt-3.5-turbo
 ```
 
@@ -91,6 +92,7 @@ python ai_translator/main.py --config config.yaml --model_type OpenAIModel
 ```
 
 - Command-line startup, using the GLM model:
+ps: On Windows, use the `set` command instead of the `export` command
 ```bash
 export GLM_MODEL_URL="http://xxx:xx"
 python ai_translator/main.py --model_type GLMModel --glm_model_url $GLM_MODEL_URL --book tests/test.pdf