Commit c3722c4: Replace branch attribute

1 parent 2391f71

18 files changed: +117 -117 lines

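The commit swaps the `{branch}` AsciiDoc attribute for a hardcoded `main` in every `include::` directive and `link:` macro it touches, so the course pages always pull code, prompts, and Cypher from the repository's main branch regardless of how `{branch}` is set at build time. As a rough sketch of the before/after behaviour, with hypothetical attribute values (the real `{repository-raw}` value is defined in the course's shared configuration, not shown in this diff):

[source,asciidoc]
----
// Hypothetical attribute definitions; the real values live in the course config
:repository-raw: https://raw.githubusercontent.com/ORG/REPO
:branch: main

// Before: the include resolves against whatever {branch} is set to
include::{repository-raw}/{branch}/examples/chain.mjs[tag=prompt]

// After this commit: the include always pulls from the main branch
include::{repository-raw}/main/examples/chain.mjs[tag=prompt]
----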

asciidoc/courses/llm-chatbot-typescript/includes/test.adoc

Lines changed: 1 addition & 1 deletion
@@ -16,6 +16,6 @@ npm run test {test-filename}
 [source,typescript]
 .{test-filename}
 ----
-include::{repository-raw}/{branch}/{test-file}[]
+include::{repository-raw}/main/{test-file}[]
 ----
 ====

asciidoc/courses/llm-chatbot-typescript/modules/2-chains/lessons/1-introduction-to-lcel/lesson.adoc

Lines changed: 5 additions & 5 deletions
@@ -48,7 +48,7 @@ You can use the static `fromTemplate()` method to construct a new `PromptTemplat

 [source,typescript]
 ----
-include::{repository-raw}/{branch}/examples/chain.mjs[tag=prompt]
+include::{repository-raw}/main/examples/chain.mjs[tag=prompt]
 ----

 === The LLM
@@ -57,7 +57,7 @@ The prompt will be passed to an LLM, in this case, the `ChatOpenAI` model.

 [source,typescript]
 ----
-include::{repository-raw}/{branch}/examples/chain.mjs[tag=llm]
+include::{repository-raw}/main/examples/chain.mjs[tag=llm]
 ----


@@ -68,7 +68,7 @@ To create a new chain, call the `RunnableSequence.from` method, passing through

 [source,typescript]
 ----
-include::{repository-raw}/{branch}/examples/chain.mjs[tag=chain]
+include::{repository-raw}/main/examples/chain.mjs[tag=chain]
 ----


@@ -82,7 +82,7 @@ Because the prompt expects `{fruit}` as an input, you call the `.invoke()` metho

 [source,typescript]
 ----
-include::{repository-raw}/{branch}/examples/chain.mjs[tag=invoke, indent=0]
+include::{repository-raw}/main/examples/chain.mjs[tag=invoke, indent=0]
 ----


@@ -94,7 +94,7 @@ You can ensure type safety in your chains by defining input and output types on

 [source,typescript]
 ----
-include::{repository-raw}/{branch}/examples/chain.mjs[tag=types]
+include::{repository-raw}/main/examples/chain.mjs[tag=types]
 ----
 ====


asciidoc/courses/llm-chatbot-typescript/modules/2-chains/lessons/2-answer-generation/lesson.adoc

Lines changed: 6 additions & 6 deletions
@@ -20,7 +20,7 @@ The chain will accept the following input:
 .Chain Input
 [source,typescript]
 ----
-include::{repository-raw}/{branch}/{lab-solution}[tag=interface]
+include::{repository-raw}/main/{lab-solution}[tag=interface]
 ----

 The chain will need to:
@@ -51,7 +51,7 @@ Use the following prompt as the first parameter.
 .Prompt
 [source]
 ----
-include::{repository-raw}/{branch}/prompts/{prompt-filename}[]
+include::{repository-raw}/main/prompts/{prompt-filename}[]
 ----

 Due to the nature of semantic search, the documents passed by the application to this prompt may not fully answer the question.
@@ -64,7 +64,7 @@ Your code should resemble the following:
 .Prompt Template
 [source,typescript,indent=0]
 ----
-include::{repository-raw}/{branch}/{lab-solution}[tag=prompt, indent=0]
+include::{repository-raw}/main/{lab-solution}[tag=prompt, indent=0]
 ----

 == Create the Runnable Sequence
@@ -74,7 +74,7 @@ The chain should pass the prompt to the LLM passed as a parameter, then format t

 [source,typescript]
 ----
-include::{repository-raw}/{branch}/{lab-solution}[tag=sequence, indent=0]
+include::{repository-raw}/main/{lab-solution}[tag=sequence, indent=0]
 ----

 Use the `return` keyword to return the chain from the function.
@@ -86,7 +86,7 @@ Use the `return` keyword to return the chain from the function.
 ====
 [source,js,indent=0]
 ----
-include::{repository-raw}/{branch}/{lab-solution}[tag=function]
+include::{repository-raw}/main/{lab-solution}[tag=function]
 ----
 ====

@@ -96,7 +96,7 @@ You will be able to initialize and run the chain in your application with the fo

 [source,typescript]
 ----
-include::{repository-raw}/{branch}/{lab-solution}[tag=usage]
+include::{repository-raw}/main/{lab-solution}[tag=usage]
 ----


asciidoc/courses/llm-chatbot-typescript/modules/2-chains/lessons/3-evaluating-responses/lesson.adoc

Lines changed: 3 additions & 3 deletions
@@ -5,7 +5,7 @@
 :order: 3


-You may have noticed in the previous lesson that a you ran link:{repository-raw}/{branch}/{lab-filename}[a unit test used that validated the interaction with the LLM^].
+You may have noticed in the previous lesson that a you ran link:{repository-raw}/main/{lab-filename}[a unit test used that validated the interaction with the LLM^].

 Unit tests are not only a convenient way to test individual elements of an application.
 They also provide an automated way to test the response.
@@ -22,15 +22,15 @@ Before running any tests, the `beforeAll()` function creates an instance of the
 .Answer Evaluation Chain
 [source,typescript]
 ----
-include::{repository-raw}/{branch}/{lab-filename}[tag=evalchain,indent=0]
+include::{repository-raw}/main/{lab-filename}[tag=evalchain,indent=0]
 ----

 Each test in the suite uses this chain to evaluate the answer provided in the test.

 .Evaluating an answer
 [source,typescript]
 ----
-include::{repository-raw}/{branch}/{lab-filename}[tag=eval,indent=0]
+include::{repository-raw}/main/{lab-filename}[tag=eval,indent=0]
 ----

 The test uses a concatenation of the output of the evaluation chain and the original response, so if the evaluation does not return a _yes_, the response is appended to the test output to help debug the issue.

asciidoc/courses/llm-chatbot-typescript/modules/3-conversation-history/lessons/2-conversation-model/lesson.adoc

Lines changed: 3 additions & 3 deletions
@@ -45,7 +45,7 @@ image::images/retrieval-model.png[The Retrieval Conversation History Data Model]
 // [source,cypher]
 // .Save Conversation History
 // ----
-// include::{repository-raw}/{branch}/cypher/save-response.cypher[]
+// include::{repository-raw}/main/cypher/save-response.cypher[]
 // ----
 // ====

@@ -79,7 +79,7 @@ Note also the addition of the `.cypher` property on the `(: Response)`, which wi
 // // TODO: Combine these - save-response.cypher
 // [source,cypher]
 // ----
-// include::{repository-raw}/{branch}/cypher/save-response.cypher[]
+// include::{repository-raw}/main/cypher/save-response.cypher[]
 // ----
 // ====

@@ -131,7 +131,7 @@ You will use the following Cypher statement to save conversation history to the

 [source,cypher]
 ----
-include::{repository-raw}/{branch}/cypher/save-response.cypher[]
+include::{repository-raw}/main/cypher/save-response.cypher[]
 ----

 The statement performs the following actions:

asciidoc/courses/llm-chatbot-typescript/modules/3-conversation-history/lessons/2-neo4j-graph/lesson.adoc

Lines changed: 4 additions & 4 deletions
@@ -27,7 +27,7 @@ In `{lab-filename}`, find the `initGraph()` function.

 [source,typescript]
 ----
-include::{repository-raw}/{branch}/{lab-folder}/{lab-filename}[tag=graph]
+include::{repository-raw}/main/{lab-folder}/{lab-filename}[tag=graph]
 ----

 The code (1) defines a variable for storing the `Neo4jGraph` and (2) implements a function to either return an existing `Neo4jGraph` object or create a new one if it doesn't exist, employing the singleton pattern.
@@ -37,7 +37,7 @@ Inside the `if` statement, call the `Neo4jGraph.initialize()` method, passing th
 .Create the Neo4j Graph instance
 [source,typescript]
 ----
-include::{repository-raw}/{branch}/{lab-solution}[tag=create, indent=0]
+include::{repository-raw}/main/{lab-solution}[tag=create, indent=0]
 ----

 The `Neo4jGraph.initialize()` method will create a new `Neo4jGraph` instance and verify connectivity to the database.
@@ -47,7 +47,7 @@ Verifying connectivity ensures that the Neo4j credentials are correct and throws
 .Return the singleton
 [source,typescript]
 ----
-include::{repository-raw}/{branch}/{lab-solution}[tag=return, indent=0]
+include::{repository-raw}/main/{lab-solution}[tag=return, indent=0]
 ----


@@ -56,7 +56,7 @@ If you have followed the instructions correctly, your `initGraph` function shoul
 .initGraph
 [source,typescript]
 ----
-include::{repository-raw}/{branch}/{lab-solution}[tag=graph, indent=0]
+include::{repository-raw}/main/{lab-solution}[tag=graph, indent=0]
 ----


asciidoc/courses/llm-chatbot-typescript/modules/3-conversation-history/lessons/3-persisting-responses/lesson.adoc

Lines changed: 11 additions & 11 deletions
@@ -43,7 +43,7 @@ To learn how to integrate Neo4j into a TypeScript project, link:/courses/app-typ

 [source,typescript]
 ----
-include::{repository-raw}/{branch}/src/solutions/modules/graph.ts[]
+include::{repository-raw}/main/src/solutions/modules/graph.ts[]
 ----
 =====

@@ -57,7 +57,7 @@ To save the history, modify the `saveHistory()` function.
 .saveHistory() Signature
 [source,typescript]
 ----
-include::{repository-raw}/{branch}/src/modules/agent/history.ts[tag=save]
+include::{repository-raw}/main/src/modules/agent/history.ts[tag=save]
 ----

 // TODO: switch to Neo4jGraph when the extra parameter is added
@@ -70,15 +70,15 @@ The statement should run in a write transaction.
 .Save Conversation History
 [source,cypher]
 ----
-include::{repository-raw}/{branch}/cypher/save-response.cypher[]
+include::{repository-raw}/main/cypher/save-response.cypher[]
 ----

 Your code should resemble the following:

 .Save History
 [source,typescript,indent=0]
 ----
-include::{repository-raw}/{branch}/src/solutions/modules/agent/history.ts[tag=savetx]
+include::{repository-raw}/main/src/solutions/modules/agent/history.ts[tag=savetx]
 ----


@@ -87,7 +87,7 @@ Finally, use the `id` key from the first object in the `res` array to return the
 .Return the Response ID
 [source,typescript,indent=0]
 ----
-include::{repository-raw}/{branch}/src/solutions/modules/agent/history.ts[tag=savereturn]
+include::{repository-raw}/main/src/solutions/modules/agent/history.ts[tag=savereturn]
 ----


@@ -98,7 +98,7 @@ include::{repository-raw}/{branch}/src/solutions/modules/agent/history.ts[tag=sa
 .The implemented saveHistory() Function
 [source,typescript]
 ----
-include::{repository-raw}/{branch}/src/solutions/modules/agent/history.ts[tag=save]
+include::{repository-raw}/main/src/solutions/modules/agent/history.ts[tag=save]
 ----

 ====
@@ -112,7 +112,7 @@ To retrieve the history saved in the previous function, you must modify the `get
 .getHistory() Signature
 [source,typescript]
 ----
-include::{repository-raw}/{branch}/src/modules/agent/history.ts[tag=get]
+include::{repository-raw}/main/src/modules/agent/history.ts[tag=get]
 ----

 // TODO: switch to Neo4jGraph when the extra parameter is added
@@ -124,7 +124,7 @@ Use the following Cypher statement as the first parameter to the `read()` functi
 .Get Conversation History
 [source,cypher]
 ----
-include::{repository-raw}/{branch}/cypher/get-history.cypher[]
+include::{repository-raw}/main/cypher/get-history.cypher[]
 ----

 Your code should resemble the following:
@@ -133,7 +133,7 @@ Your code should resemble the following:
 .Return the messages
 [source,typescript,indent=0]
 ----
-include::{repository-raw}/{branch}/src/solutions/modules/agent/history.ts[tag=gettx]
+include::{repository-raw}/main/src/solutions/modules/agent/history.ts[tag=gettx]
 ----


@@ -142,7 +142,7 @@ Finally, you can return the `res` variable.
 .Return the messages
 [source,typescript,indent=0]
 ----
-include::{repository-raw}/{branch}/src/solutions/modules/agent/history.ts[tag=getreturn]
+include::{repository-raw}/main/src/solutions/modules/agent/history.ts[tag=getreturn]
 ----


@@ -153,7 +153,7 @@ include::{repository-raw}/{branch}/src/solutions/modules/agent/history.ts[tag=ge
 .The Implemented getHistory() Function
 [source,typescript]
 ----
-include::{repository-raw}/{branch}/src/solutions/modules/agent/history.ts[tag=get]
+include::{repository-raw}/main/src/solutions/modules/agent/history.ts[tag=get]
 ----

 ====

asciidoc/courses/llm-chatbot-typescript/modules/3-conversation-history/lessons/4-rephrase-chain/lesson.adoc

Lines changed: 6 additions & 6 deletions
@@ -16,7 +16,7 @@ The chain will accept the following input:
 .Chain Input
 [source,typescript]
 ----
-include::{repository-raw}/{branch}/{lab-solution}[tag=interface]
+include::{repository-raw}/main/{lab-solution}[tag=interface]
 ----


@@ -44,7 +44,7 @@ Use the `PromptTemplate.fromTemplate()` static method to create a new prompt tem
 [source]
 .Rephrase Question Prompt
 ----
-include::{repository-raw}/{branch}/prompts/{prompt-filename}[]
+include::{repository-raw}/main/prompts/{prompt-filename}[]
 ----

 Your code should resemble the following:
@@ -53,7 +53,7 @@ Your code should resemble the following:
 .Prompt Template
 [source,typescript,indent=0]
 ----
-include::{repository-raw}/{branch}/{lab-solution}[tag=prompt]
+include::{repository-raw}/main/{lab-solution}[tag=prompt]
 ----


@@ -73,7 +73,7 @@ Use the `return` keyword to return the sequence from the function.
 .Full Sequence
 [source,typescript]
 ----
-include::{repository-raw}/{branch}/{lab-solution}[tag=sequence,indent=0]
+include::{repository-raw}/main/{lab-solution}[tag=sequence,indent=0]
 ----

 [NOTE]
@@ -88,7 +88,7 @@ In the following code, the `.map()` method uses the `input` and `output` propert
 .Reformatting Messages
 [source,typescript]
 ----
-include::{repository-raw}/{branch}/{lab-solution}[tag=assign,indent=0]
+include::{repository-raw}/main/{lab-solution}[tag=assign,indent=0]
 ----
 ====

@@ -99,7 +99,7 @@ You could initialize and run the chain with the following code:

 [source,typescript, role=nocopy]
 ----
-include::{repository-raw}/{branch}/{lab-solution}[tag=usage]
+include::{repository-raw}/main/{lab-solution}[tag=usage]
 ----

 include::../../../../includes/test.adoc[leveloffset=+1]

asciidoc/courses/llm-chatbot-typescript/modules/4-vector-retrieval/lessons/1-vector-stores/lesson.adoc

Lines changed: 4 additions & 4 deletions
@@ -52,15 +52,15 @@ Inside `{lab-file}`, you will find an `initVectorStore()` function.
 .initVectorStore
 [source,typescript]
 ----
-include::{repository-raw}/{branch}/src/{lab-file}[tag=function]
+include::{repository-raw}/main/src/{lab-file}[tag=function]
 ----

 Inside this function, use the `Neo4jVectorStore.fromExistingIndex()` method to create a new vector store instance.

 .Using an existing index
 [source,typescript]
 ----
-include::{repository-raw}/{branch}/{lab-solution}[tag=store, indent=0]
+include::{repository-raw}/main/{lab-solution}[tag=store, indent=0]
 ----

 [TIP]
@@ -80,15 +80,15 @@ Finally, return the `vectorStore` from the function.
 .Returning the vector store
 [source,typescript]
 ----
-include::{repository-raw}/{branch}/{lab-solution}[tag=return,indent=0]
+include::{repository-raw}/main/{lab-solution}[tag=return,indent=0]
 ----

 If you have followed the steps correctly, your code should resemble the following:

 .Returning the vector store
 [source,typescript]
 ----
-include::{repository-raw}/{branch}/{lab-solution}[tag=function,indent=0]
+include::{repository-raw}/main/{lab-solution}[tag=function,indent=0]
 ----

