|
| 1 | +{ |
| 2 | + "cells": [ |
| 3 | + { |
| 4 | + "cell_type": "markdown", |
| 5 | + "id": "6d251ced", |
| 6 | + "metadata": {}, |
| 7 | + "source": [ |
| 8 | + "# How to use LangChain and Azure OpenAI with Python\n", |
| 9 | + "\n", |
| 10 | + "\n", |
| 11 | +    "LangChain is an open-source framework for developing applications powered by large language models (LLMs). <br>\n", |
| 12 | + "\n", |
| 13 | +    "This guide demonstrates how to set up and use the Azure OpenAI API with LangChain.\n", |
| 14 | + " " |
| 15 | + ] |
| 16 | + }, |
| 17 | + { |
| 18 | + "cell_type": "markdown", |
| 19 | + "id": "9d0ee335", |
| 20 | + "metadata": {}, |
| 21 | + "source": [ |
| 22 | + "## Set Up\n", |
| 23 | + "The following libraries must be installed to use LangChain with Azure OpenAI.<br>" |
| 24 | + ] |
| 25 | + }, |
| 26 | + { |
| 27 | + "cell_type": "code", |
| 28 | + "execution_count": 16, |
| 29 | + "id": "35289cea", |
| 30 | + "metadata": {}, |
| 31 | + "outputs": [], |
| 32 | + "source": [ |
| 33 | + "#%pip install --upgrade openai\n", |
| 34 | + "#%pip install langchain" |
| 35 | + ] |
| 36 | + }, |
| 37 | + { |
| 38 | + "cell_type": "markdown", |
| 39 | + "id": "ba880453", |
| 40 | + "metadata": {}, |
| 41 | + "source": [ |
| 42 | +    "## API Configuration\n", |
| 43 | +    "\n", |
| 44 | +    "After installing the necessary libraries, the API must be configured. The code below shows how to configure the API directly in your Python environment.\n" |
| 45 | + ] |
| 46 | + }, |
| 47 | + { |
| 48 | + "cell_type": "code", |
| 49 | + "execution_count": 25, |
| 50 | + "id": "a9752fda", |
| 51 | + "metadata": { |
| 52 | + "scrolled": true |
| 53 | + }, |
| 54 | + "outputs": [], |
| 55 | + "source": [ |
| 56 | + "from langchain.llms import AzureOpenAI \n", |
| 57 | + "import openai\n", |
| 58 | + "import json\n", |
| 59 | + "import os\n", |
| 60 | + "\n", |
| 61 | + "# Load config values\n", |
| 62 | + "with open(r'config.json') as config_file:\n", |
| 63 | + " config_details = json.load(config_file)\n", |
| 64 | + "\n", |
| 65 | + "# Setting up the deployment name\n", |
| 66 | + "deployment_name = config_details['DEPLOYMENT_NAME']\n", |
| 67 | + "\n", |
| 68 | +    "# The API type; for Azure OpenAI this is always `azure`\n", |
| 69 | + "openai.api_type = \"azure\"\n", |
| 70 | + "\n", |
| 71 | + "# The API key for your Azure OpenAI resource.\n", |
| 72 | + "openai.api_key = os.getenv(\"OPENAI_API_KEY\")\n", |
| 73 | + "\n", |
| 74 | + "# The base URL for your Azure OpenAI resource. e.g. \"https://<your resource name>.openai.azure.com\"\n", |
| 75 | + "openai.api_base = config_details['OPENAI_API_BASE']\n", |
| 76 | + "\n", |
| 77 | +    "# Currently the Chat Completion API has the following versions available: 2023-07-01-preview\n", |
| 78 | + "openai.api_version = config_details['OPENAI_API_VERSION']" |
| 79 | + ] |
| 80 | + }, |
| 81 | + { |
| 82 | + "cell_type": "markdown", |
| 83 | + "id": "ab4779e5", |
| 84 | + "metadata": {}, |
| 85 | + "source": [ |
| 86 | + "## Deployed Model Setup\n", |
| 87 | + "\n", |
| 88 | + "Azure OpenAI allows users to create and manage their own model deployments. To call the API, you must specify the deployment you want to use by passing in the deployment name and model name. " |
| 89 | + ] |
| 90 | + }, |
| 91 | + { |
| 92 | + "cell_type": "code", |
| 93 | + "execution_count": 33, |
| 94 | + "id": "6ea3e99f", |
| 95 | + "metadata": {}, |
| 96 | + "outputs": [ |
| 97 | + { |
| 98 | + "data": { |
| 99 | + "text/plain": [ |
| 100 | + "\"\\n\\n\\n\\n\\nThe world is a beautiful place,\\nThe colors are so bright and true,\\nAnd I feel so free and free,\\nWhen I'm away from here.\\n\\nThe sky is so blue,\\nAnd the sun is so warm,\\nAnd I feel so free and free,\\nWhen I'm away from here.\"" |
| 101 | + ] |
| 102 | + }, |
| 103 | + "execution_count": 33, |
| 104 | + "metadata": {}, |
| 105 | + "output_type": "execute_result" |
| 106 | + } |
| 107 | + ], |
| 108 | + "source": [ |
| 109 | + "# Create an instance of Azure OpenAI\n", |
| 110 | + "\n", |
| 111 | + "# Replace the deployment name and model name with your own\n", |
| 112 | + "llm = AzureOpenAI(\n", |
| 113 | + " deployment_name= deployment_name,\n", |
| 114 | + " model_name=\"text-davinci-002\", \n", |
| 115 | + ")\n", |
| 116 | + "\n", |
| 117 | + "# Run the LLM\n", |
| 118 | + "llm(\"Write me a poem\")" |
| 119 | + ] |
| 120 | + }, |
| 121 | + { |
| 122 | + "cell_type": "markdown", |
| 123 | + "id": "dc7ea2d4", |
| 124 | + "metadata": {}, |
| 125 | + "source": [ |
| 126 | + "## PromptTemplates\n", |
| 127 | + "\n", |
| 128 | +    "LangChain provides a built-in PromptTemplate module that simplifies constructing prompts to get more specific answers." |
| 129 | + ] |
| 130 | + }, |
| 131 | + { |
| 132 | + "cell_type": "code", |
| 133 | + "execution_count": 40, |
| 134 | + "id": "927d4bac", |
| 135 | + "metadata": {}, |
| 136 | + "outputs": [ |
| 137 | + { |
| 138 | + "name": "stdout", |
| 139 | + "output_type": "stream", |
| 140 | + "text": [ |
| 141 | + "\n", |
| 142 | + "There are a number of good face washes that can help with acne prone skin. A few popular options include the Neutrogena Oil-Free Acne Wash, the Cetaphil Dermatological Gentle Skin Cleanser, and the La Roche-Posay Effaclar Medicated Gel Cleanser.\n" |
| 143 | + ] |
| 144 | + } |
| 145 | + ], |
| 146 | + "source": [ |
| 147 | + "from langchain import PromptTemplate\n", |
| 148 | + "\n", |
| 149 | + "template = \"\"\"\n", |
| 150 | +    "You are a skin care consultant that recommends products based on customer\n", |
| 151 | + "needs and preferences.\n", |
| 152 | + "\n", |
| 153 | + "What is a good {product_type} to help with {customer_request}?\n", |
| 154 | + "\"\"\"\n", |
| 155 | + "\n", |
| 156 | + "prompt = PromptTemplate(\n", |
| 157 | + "input_variables=[\"product_type\", \"customer_request\"],\n", |
| 158 | + "template=template,\n", |
| 159 | + ")\n", |
| 160 | + "\n", |
| 161 | + "print(llm(\n", |
| 162 | + " prompt.format(\n", |
| 163 | + " product_type=\"face wash\",\n", |
| 164 | + " customer_request = \"acne prone skin\"\n", |
| 165 | + " )\n", |
| 166 | + "))" |
| 167 | + ] |
| 168 | + }, |
| 169 | + { |
| 170 | + "cell_type": "markdown", |
| 171 | + "id": "7b6723eb", |
| 172 | + "metadata": {}, |
| 173 | + "source": [ |
| 174 | + "## Chains\n", |
| 175 | +    "Chains let you combine multiple LLM calls and actions into a single workflow. <br>\n", |
| 176 | +    "\n", |
| 177 | +    "### Simple Sequential Chains\n", |
| 178 | +    "A simple sequential chain feeds the output of one LLM chain in as the input to the next." |
| 179 | + ] |
| 180 | + }, |
| 181 | + { |
| 182 | + "cell_type": "code", |
| 183 | + "execution_count": 46, |
| 184 | + "id": "af7c236f", |
| 185 | + "metadata": {}, |
| 186 | + "outputs": [], |
| 187 | + "source": [ |
| 188 | + "#from langchain.llms import AzureOpenAI\n", |
| 189 | + "from langchain.chains import LLMChain\n", |
| 190 | + "# from langchain.prompts import PromptTemplate\n", |
| 191 | + "from langchain.chains import SimpleSequentialChain" |
| 192 | + ] |
| 193 | + }, |
| 194 | + { |
| 195 | + "cell_type": "code", |
| 196 | + "execution_count": 51, |
| 197 | + "id": "2a4a32f0", |
| 198 | + "metadata": {}, |
| 199 | + "outputs": [], |
| 200 | + "source": [ |
| 201 | + "template = \"\"\"Your job is to come up with a fun DIY project for the specified gender, age, and description of a kid.\n", |
| 202 | + "% CHILD_DESCRIPTION\n", |
| 203 | + "{child_description}\n", |
| 204 | + "\n", |
| 205 | + "YOUR RESPONSE:\n", |
| 206 | + "\"\"\"\n", |
| 207 | + "prompt_template = PromptTemplate(input_variables=[\"child_description\"], template=template)\n", |
| 208 | + "\n", |
| 209 | +    "# Holds the child-description chain\n", |
| 210 | + "description_chain = LLMChain(llm=llm, prompt=prompt_template)" |
| 211 | + ] |
| 212 | + }, |
| 213 | + { |
| 214 | + "cell_type": "code", |
| 215 | + "execution_count": 55, |
| 216 | + "id": "6eec47ff", |
| 217 | + "metadata": {}, |
| 218 | + "outputs": [], |
| 219 | + "source": [ |
| 220 | +    "template = \"\"\"Given a DIY project, give a short and simple step-by-step guide on how to complete the project, along with a materials list.\n", |
| 221 | + "% DIY_PROJECT\n", |
| 222 | + "{diy_project}\n", |
| 223 | + "\n", |
| 224 | + "YOUR RESPONSE:\n", |
| 225 | + "\"\"\"\n", |
| 226 | + "prompt_template = PromptTemplate(input_variables=[\"diy_project\"], template=template)\n", |
| 227 | + "\n", |
| 228 | +    "# Holds the DIY-project chain\n", |
| 229 | + "diy_chain = LLMChain(llm=llm, prompt=prompt_template)" |
| 230 | + ] |
| 231 | + }, |
| 232 | + { |
| 233 | + "cell_type": "code", |
| 234 | + "execution_count": 56, |
| 235 | + "id": "84a15aea", |
| 236 | + "metadata": {}, |
| 237 | + "outputs": [], |
| 238 | + "source": [ |
| 239 | + "overall_chain = SimpleSequentialChain(chains=[description_chain, diy_chain], verbose=True)" |
| 240 | + ] |
| 241 | + }, |
| 242 | + { |
| 243 | + "cell_type": "code", |
| 244 | + "execution_count": 57, |
| 245 | + "id": "15928f72", |
| 246 | + "metadata": {}, |
| 247 | + "outputs": [ |
| 248 | + { |
| 249 | + "name": "stdout", |
| 250 | + "output_type": "stream", |
| 251 | + "text": [ |
| 252 | + "\n", |
| 253 | + "\n", |
| 254 | + "\u001b[1m> Entering new SimpleSequentialChain chain...\u001b[0m\n" |
| 255 | + ] |
| 256 | + }, |
| 257 | + { |
| 258 | + "name": "stdout", |
| 259 | + "output_type": "stream", |
| 260 | + "text": [ |
| 261 | + "\u001b[36;1m\u001b[1;3m\n", |
| 262 | + "A simple and fun DIY project for a 5-year-old girl is to make a paper doll. All you need is some paper, scissors, and crayons or markers. First, cut out a paper doll shape from the paper. Then, decorate the paper doll with clothes, hair, and facial features. You can also cut out paper furniture and accessories to create a paper doll scene.\u001b[0m\n", |
| 263 | + "\u001b[33;1m\u001b[1;3m\n", |
| 264 | + "1. Cut out a paper doll shape from the paper.\n", |
| 265 | + "2. Decorate the paper doll with clothes, hair, and facial features.\n", |
| 266 | + "3. Cut out paper furniture and accessories to create a paper doll scene.\n", |
| 267 | + "\n", |
| 268 | + "Materials needed:\n", |
| 269 | + "\n", |
| 270 | + "-Paper\n", |
| 271 | + "-Scissors\n", |
| 272 | + "-Crayons or markers\u001b[0m\n", |
| 273 | + "\n", |
| 274 | + "\u001b[1m> Finished chain.\u001b[0m\n" |
| 275 | + ] |
| 276 | + } |
| 277 | + ], |
| 278 | + "source": [ |
| 279 | + "review = overall_chain.run(\"5-year-old girl\")" |
| 280 | + ] |
| 281 | + } |
| 282 | + ], |
| 283 | + "metadata": { |
| 284 | + "kernelspec": { |
| 285 | + "display_name": "Python 3 (ipykernel)", |
| 286 | + "language": "python", |
| 287 | + "name": "python3" |
| 288 | + }, |
| 289 | + "language_info": { |
| 290 | + "codemirror_mode": { |
| 291 | + "name": "ipython", |
| 292 | + "version": 3 |
| 293 | + }, |
| 294 | + "file_extension": ".py", |
| 295 | + "mimetype": "text/x-python", |
| 296 | + "name": "python", |
| 297 | + "nbconvert_exporter": "python", |
| 298 | + "pygments_lexer": "ipython3", |
| 299 | + "version": "3.11.4" |
| 300 | + } |
| 301 | + }, |
| 302 | + "nbformat": 4, |
| 303 | + "nbformat_minor": 5 |
| 304 | +} |