
Commit 2ce9b33

WIP
1 parent fc80eb0 commit 2ce9b33

9 files changed (+54, -54 lines changed)

asciidoc/courses/llm-chatbot-typescript/modules/1-project-setup/lessons/2-setup/lesson.adoc

Lines changed: 18 additions & 5 deletions
@@ -84,17 +84,30 @@ NEO4J_USERNAME="{sandbox_username}"
 NEO4J_PASSWORD="{sandbox_password}"
 ----
 
-You must also set your OpenAI API key.
-You can find link:https://help.openai.com/en/articles/4936850-where-do-i-find-my-openai-api-key[instructions on generating an API key on openai.com^].
+You will also need an OpenAI API key to generate LLM responses.
+
+We have generated an OpenAI API key for you to use through our OpenAI proxy for the duration of this course.
+The API key will be limited to 5 requests every two minutes.
+
 
 Replace the `OPENAI_API_KEY` value with your OpenAI API key.
 
 .env.local
-[source,env]
+[source,env,subs="attributes+"]
 ----
-OPENAI_API_KEY="sk-..."
+OPENAI_API_KEY="{llm-api-key}"
+OPENAI_API_BASE="{llm-api-base}"
 ----
 
+If you find the rate limits restrictive, you can always use your own OpenAI API key.
+In this case, omit the base URL.
+You can find link:https://help.openai.com/en/articles/4936850-where-do-i-find-my-openai-api-key[instructions on generating an API key on openai.com^].
+
+This course's challenges do not require you to use OpenAI LLMs, so feel free to use an alternative LLM provider.
+We also like link:https://ollama.com/[Ollama^].
+
+
+
 [WARNING]
 .Keep your secrets safe
 ====
@@ -151,6 +164,6 @@ include::questions/1-server.adoc[leveloffset=+1]
 [.summary]
 == Summary
 
-In this lesson, you obtained a copy of the course code, installed the dependency and used the `npm run dev` command to start the app.
+In this lesson, you obtained a copy of the course code, installed the dependency, and used the `npm run dev` command to start the app.
 
 In the next lesson, you will set the scope for the project.
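
For context on the change above: the new `OPENAI_API_BASE` variable only has an effect if the application passes it through to the OpenAI client. Below is a minimal sketch (not the course repository's actual code) of how a LangChain.js `ChatOpenAI` instance could pick up both values from `.env.local`; the exact constructor option names vary slightly between LangChain.js releases.

[source,typescript]
----
// Minimal sketch: read the .env.local values and pass them to ChatOpenAI.
// The `configuration.baseURL` option routes requests through the proxy when
// OPENAI_API_BASE is set; leaving it unset falls back to the official OpenAI endpoint.
import { ChatOpenAI } from "@langchain/openai";

const llm = new ChatOpenAI({
  openAIApiKey: process.env.OPENAI_API_KEY,
  configuration: {
    baseURL: process.env.OPENAI_API_BASE, // undefined -> api.openai.com
  },
});
----

Leaving `OPENAI_API_BASE` unset makes the client use the default OpenAI endpoint, which matches the lesson's advice to omit the base URL when using your own key.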
Binary file added (52.1 KB)

asciidoc/courses/llm-chatbot-typescript/modules/2-chains/lessons/1-introduction-to-lcel/lesson.adoc

Lines changed: 12 additions & 0 deletions
@@ -26,6 +26,8 @@ You can link:https://js.langchain.com/docs/expression_language/[read more about
 In the link:https://graphacademy.neo4j.com/courses/llm-fundamentals/3-intro-to-langchain/2.5-chains/[Chains lesson in Neo4j & LLM Fundamentals^], you learned about the `LLMChain`.
 The link:https://api.python.langchain.com/en/latest/_modules/langchain/chains/llm.html#LLMChain[`LLMChain`^] is an example of a simple chain that, when invoked, takes a user input, replaces the value inside the prompt and passes the prompt to an LLM and specifies the result.
 
+image::./images/chain.png[Prompt → LLM → Response]
+
 // [source]
 // ----
 // // TODO: Diagram
@@ -60,6 +62,16 @@ The prompt will be passed to an LLM, in this case, the `ChatOpenAI` model.
 include::{repository-raw}/main/examples/chain.mjs[tag=llm]
 ----
 
+=== The Response Parser
+
+You will then pass the LLM's response to an output parser.
+The most straightforward parser is a `StringOutputParser`.
+
+[source,typescript]
+----
+include::{repository-raw}/main/examples/chain.mjs[tag=parser]
+----
+
 
 === Creating the Chain

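For context on the new section above: the `StringOutputParser` sits between the LLM and the caller and converts the model's chat message into a plain string. Below is a minimal LCEL sketch of the full prompt → LLM → parser pipeline; the course's actual `chain.mjs` is pulled in via the include tags, so treat this as illustrative only.

[source,typescript]
----
// Sketch of an LCEL chain: prompt -> LLM -> string output parser.
import { PromptTemplate } from "@langchain/core/prompts";
import { StringOutputParser } from "@langchain/core/output_parsers";
import { ChatOpenAI } from "@langchain/openai";

const prompt = PromptTemplate.fromTemplate(
  "Answer the following question: {question}"
);
const llm = new ChatOpenAI();
const parser = new StringOutputParser();

// .pipe() composes runnables left to right into a single chain.
const chain = prompt.pipe(llm).pipe(parser);

// The parser turns the LLM's chat message into a plain string.
const answer = await chain.invoke({ question: "What is LCEL?" });
console.log(answer);
----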
asciidoc/courses/llm-chatbot-typescript/modules/6-agent/lessons/6-limiting-scope/lesson.adoc

Lines changed: 3 additions & 3 deletions
@@ -19,7 +19,7 @@ The agent currently uses a pre-written prompt from LangChain Hub.
 You can link:https://smith.langchain.com/hub/hwchase17/openai-functions-agent[view this prompt in LangChain Hub^].
 
 The prompt is relatively basic.
-It consists of an array of messages consisting of a role definition, human input, and placeholders.
+It consists of an array of messages, including a role definition, human input, and placeholders.
 
 [source]
 .The prompt
@@ -47,7 +47,7 @@ For example, the following prompt instructs the LLM to refuse to answer question
 .Modified Prompt
 [source]
 ----
-include::{repository-raw}/main/src/{lab-solution}[tag="scoped", indent=0]
+include::{repository-raw}/main/{lab-solution}[tag="scoped", indent=0]
 ----
 
 [NOTE]
@@ -65,7 +65,7 @@ The link:{repository-blob}/main/{test-file}[unit test^] will verify whether the
 == Experiment
 
 You can extend the prompt to instruct the agent to act in specific ways.
-Experiment with different prompts to see how the agent responds to various types of questions.
+Experiment with different prompts to see how the agent responds to various questions.
 
 When you are ready, click Continue.

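For context on the scope change above: the actual scoped prompt lives in the lab solution pulled in via `{lab-solution}`. Below is a hypothetical sketch of the same pattern, with a system message that restricts the topic plus the human input and placeholders the agent expects; the domain wording is invented for illustration.

[source,typescript]
----
// Hypothetical sketch of a scoped agent prompt: a system message restricts the
// topic, while the placeholders keep the message structure the agent expects.
import { ChatPromptTemplate, MessagesPlaceholder } from "@langchain/core/prompts";

const prompt = ChatPromptTemplate.fromMessages([
  [
    "system",
    "You are a helpful assistant. Only answer questions about the course's " +
      "subject area. If asked about anything else, politely refuse.",
  ],
  new MessagesPlaceholder("chat_history"),
  ["human", "{input}"],
  new MessagesPlaceholder("agent_scratchpad"),
]);
----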
asciidoc/courses/llm-fundamentals/modules/3-intro-to-langchain/lessons/2-initialising-the-llm/lesson.adoc

Lines changed: 2 additions & 32 deletions
@@ -20,37 +20,7 @@ This course includes instructions for using link:https://openai.com/[OpenAI^], b
 
 [NOTE]
 .Using OpenAI
-====
-We have generated an OpenAI API key for you to use through our OpenAI Proxy for the duration of this course using a proxy.
-The API key will be limited to 5 requests every two minutes.
-
-
-.Environment Variable
-[source,env,subs="attributes+"]
-----
-OPENAI_API_KEY={llm-api-key}
-----
-
-You must also set the `base_url` parameter to use our proxy server.
-
-.Setting the Proxy
-[source,python,subs="attributes+"]
-----
-from openai import OpenAI
-
-model = OpenAI(
-    api_key="{llm-api-key}",
-    base_url="https://graphacademy.neo4j.com/api/llm/v1/"
-)
-----
-
-You can always use an existing OpenAI API key and omit the `base_url` argument.
-
-
-====
-
-
-
+If you wish to use OpenAI and follow this course's practical activities, you must create an account and set up billing.
 
 == Setup
 
@@ -84,7 +54,7 @@ pip install openai langchain-openai
 
 == Create a Langchain application
 
-Create a new Python program and copy this code into a new Python file.
+Create a new Python program and copy this code into it.
 
 [source,python]
 ----

asciidoc/courses/llm-fundamentals/modules/3-intro-to-langchain/lessons/2.5-chains/code/llm_chain.py

Lines changed: 5 additions & 3 deletions
@@ -2,7 +2,9 @@
 from langchain.prompts import PromptTemplate
 from langchain.chains import LLMChain
 
-llm = OpenAI(openai_api_key="sk-...")
+llm = OpenAI(
+    openai_api_key="sk-..."
+)
 
 template = PromptTemplate.from_template("""
 You are a cockney fruit and vegetable seller.
@@ -13,8 +15,8 @@
 """)
 
 llm_chain = LLMChain(
-llm=llm,
-prompt=template
+    llm=llm,
+    prompt=template
 )
 
 response = llm_chain.invoke({"fruit": "apple"})

asciidoc/courses/llm-fundamentals/modules/3-intro-to-langchain/lessons/2.5-chains/code/llm_chain_output.py

Lines changed: 5 additions & 4 deletions
@@ -4,7 +4,8 @@
 from langchain.schema import StrOutputParser
 
 llm = OpenAI(
-    openai_api_key="sk-...")
+    openai_api_key="sk-..."
+)
 
 template = PromptTemplate.from_template("""
 You are a cockney fruit and vegetable seller.
@@ -15,9 +16,9 @@
 """)
 
 llm_chain = LLMChain(
-llm=llm,
-prompt=template,
-output_parser=StrOutputParser()
+    llm=llm,
+    prompt=template,
+    output_parser=StrOutputParser()
 )
 
 response = llm_chain.invoke({"fruit": "apple"})

asciidoc/courses/llm-fundamentals/modules/3-intro-to-langchain/lessons/2.5-chains/code/llm_chain_output_json.py

Lines changed: 5 additions & 4 deletions
@@ -4,7 +4,8 @@
 from langchain.output_parsers.json import SimpleJsonOutputParser
 
 llm = OpenAI(
-    openai_api_key="sk-...")
+    openai_api_key="sk-..."
+)
 
 template = PromptTemplate.from_template("""
 You are a cockney fruit and vegetable seller.
@@ -17,9 +18,9 @@
 """)
 
 llm_chain = LLMChain(
-llm=llm,
-prompt=template,
-output_parser=SimpleJsonOutputParser()
+    llm=llm,
+    prompt=template,
+    output_parser=SimpleJsonOutputParser()
 )
 
 response = llm_chain.invoke({"fruit": "apple"})

asciidoc/courses/llm-fundamentals/modules/3-intro-to-langchain/lessons/2.5-chains/lesson.adoc

Lines changed: 4 additions & 3 deletions
@@ -8,22 +8,23 @@ Chains allows you to combine language models with different data sources and thi
 
 == Using LLMChain
 
-The simplest chain is an `LLMChain`. An `LLMChain` combines a prompt template with an LLM and returns a response.
+The most straightforward chain is an `LLMChain`. An `LLMChain` combines a prompt template with an LLM and returns a response.
 
 Previously, you created a program that used a prompt template and an LLM to generate a response about fruit.
 
+
 [%collapsible]
 .Click to reveal the code for the program.
 ====
-[source,python]
+[source,python,subs="attributes+"]
 ----
 include::../2-initialising-the-llm/code/llm_prompt.py[]
 ----
 ====
 
 You can combine this program into a chain and create a reusable component.
 
-[source,python]
+[source,python,subs="attributes+"]
 ----
 include::code/llm_chain.py[]
 ----
