
Commit e80ab50

kgilpin and Daniel-Warner-X authored and committed

feat: Sync docs from applandinc.appmap.io (#2024)

Co-authored-by: Daniel-Warner-X <1229326+Daniel-Warner-X@users.noreply.github.com>

1 parent: 3af155a

42 files changed: +1062 −568 lines. (Some content is hidden by default for large commits; not all changed files are shown below.)

docs/appmap-docs.md

Lines changed: 1 addition & 1 deletion

@@ -18,7 +18,7 @@ By using AppMap data, Navie is the first AI code architect with the context to u

 Over 90,000 software developers are using the [AppMap extension for VSCode](https://marketplace.visualstudio.com/items?itemName=appland.appmap) and the [AppMap plugin for JetBrains](https://plugins.jetbrains.com/plugin/16701-appmap).

-<a class="btn btn-primary btn-lg" href="/docs/get-started-with-appmap/">Get Started with AppMap</a>
+<a class="btn btn-primary btn-lg" href="/docs/get-started-with-appmap/">Get Started</a>

 ![AppMap Navie with Sequence diagram in Visual Studio Code](/assets/img/docs/vscode-with-navie-prompt.webp)
 _AppMap Navie with Sequence diagram in Visual Studio Code_

docs/community.md

Lines changed: 1 addition & 1 deletion

@@ -3,7 +3,7 @@ layout: docs
 toc: true
 title: Docs - Community
 description: "Join AppMap's vibrant community on Slack for discussions, issue reporting, and become a contributor."
-redirect_from: [/docs/troubleshooting]
+redirect_from: [/docs/troubleshooting, /community]
 ---

 # Community

docs/get-started-with-appmap/index.md

Lines changed: 1 addition & 1 deletion

@@ -6,7 +6,7 @@ toc: true
 redirect_from: [/docs/your-first-15-minutes-with-appmap/, /docs/code-editor-extensions/,/docs/code-editor-extensions/appmap-for-vs-code, /docs/code-editor-extensions/appmap-for-jetbrains,/docs/setup-appmap-in-your-code-editor/index.html]
 ---

-# Get Started with AppMap
+# Get Started

 <p class="alert alert-info">
 If at any point you would like some help, <a href="/slack">join us in Slack</a>!

docs/get-started-with-appmap/navie-ai-quickstart.md

Lines changed: 6 additions & 5 deletions

@@ -20,7 +20,7 @@ redirect_from: [/docs/setup-appmap-in-your-code-editor/navie-ai-quickstart]

 By default, Navie uses an AppMap proxy of the latest supported OpenAI models. If you would like to customize your own model, you can leverage a variety of other AI model providers such as [Azure OpenAI](https://appmap.io/docs/navie-reference#azure-openai), [Fireworks.ai](https://appmap.io/docs/navie-reference#fireworks-ai), [LM Studio](https://appmap.io/docs/navie-reference#lm-studio), and more.

-If you have an active GitHub Copilot subscription, you can use Navie with the [Copilot Lanauage Model](/docs/navie-reference#github-copilot-language-model) as a supported backend. Refer to the Navie Copilot documentation for instructions on how to enable.
+If you have an active GitHub Copilot subscription, you can use Navie with the [Copilot Language Model](/docs/navie-reference/navie-bring-your-own-model-examples.html#github-copilot-language-model) as a supported backend. Refer to the [Navie Copilot documentation](/docs/navie-reference/navie-bring-your-own-model-examples.html#github-copilot-language-model) for instructions on how to enable it.

 ## Open AppMap Navie AI

@@ -32,20 +32,21 @@ To open the Navie Chat, open the AppMap plugin in the sidebar menu for your code

 ## Ask Navie about your App

-You can ask questions about your application with Navie immediately after installing the plugin. AppMap Data is not required but Navie only has partial information about your project and the answers will not include any runtime specific information.
+You can ask questions about your application with Navie immediately after installing the plugin. Navie will answer questions based on analysis of your project code. For increased accuracy on more complex projects, you can record AppMap data and Navie will utilize this information as well.

 By default, Navie will utilize an OpenAI service hosted by AppMap. If, for data privacy or other reasons, you do not wish to use the AppMap OpenAI proxy, you can [bring your own OpenAI API key](/docs/using-navie-ai/bring-your-own-model.html#bring-your-own-openai-api-key-byok), or use an [entirely different AI model](/docs/using-navie-ai/bring-your-own-model.html#ollama), hosted in your environment or hosted locally.

-When you ask a question to Navie, it will search through all your AppMap Diagrams (if they exist) for your project to pull in relevant traces, sequence diagrams, and code snippets for analysis. It will then send these code snippets and runtime code sequence diagrams to the Generative AI service along with your question.
+When you ask a question to Navie, it will search through all the available AppMap data for your project to pull in relevant traces, sequence diagrams, and code snippets for analysis. It will then send the selected context to your preferred LLM provider.

-Refer to the [Using Navie docs](/docs/using-navie-ai/using-navie) to learn more about the advanced Navie chat commands you can use with your question.
+To achieve the highest quality results, we suggest using the available command modes when prompting Navie. Simply type `@` into the chat input to access the list of available command modes.

-After asking Navie a question, Navie will search through your application source code, finding any relevant code snippets. It will include relevant AppMap Data like sequence diagrams and data flows if they exist for your project. You will see on the right hand side of the Navie window the relevant context from your code included with the question.
+By default, Navie chat uses the `@explain` mode. Other specialized modes are available for generating diagrams, planning work, generating code and tests, and more. Consult the [Using Navie docs](/docs/navie-reference/navie-commands.html) for more details on Navie commands.

 The Navie UI includes a standard chat window and a context panel showing all the context that is included in the query to the AI provider. This context can include things such as:

 **Always available:**
 - Code Snippets
+- Pinned Content

 **If AppMap Data exists:**
 - Sequence Diagrams

docs/navie-reference.md

Lines changed: 1 addition & 0 deletions

@@ -3,6 +3,7 @@ layout: docs
 title: Docs - Reference
 toc: true
 description: "Reference Guide to AppMap Navie, including advanced usage and configuration"
+redirect_from: [/docs/reference/navie]
 ---

 # Navie Reference

docs/navie-reference/index.md

Lines changed: 18 additions & 0 deletions

@@ -0,0 +1,18 @@
+---
+layout: docs
+title: Docs - Reference
+description: "A reference for AppMap Navie AI"
+toc: true
+step: 1
+---
+
+# Navie Reference
+- [Navie Commands](/docs/navie-reference/navie-commands.html)
+- [Navie Options](/docs/navie-reference/navie-options.html)
+- [Bring Your Own Model Examples](/docs/navie-reference/navie-bring-your-own-model-examples.html)
+- [OpenAI Key Management](/docs/navie-reference/navie-openai-key-management.html)
+- [Accessing Navie Logs](/docs/navie-reference/navie-accessing-logs.html)
+- [GitHub Repository](/docs/navie-reference/navie-github-repository.html)
+- [How Navie Works](/docs/navie-reference/navie-how-it-works.html)
+- [Navie User Interface](/docs/navie-reference/navie-user-interface.html)
+- [Pre-built Libraries for Recording AppMap Data](/docs/navie-reference/navie-pre-built-libraries-for-appmap-data.html)
docs/navie-reference/navie-accessing-logs.md

Lines changed: 42 additions & 0 deletions

@@ -0,0 +1,42 @@
+---
+layout: docs
+title: Docs - Reference
+name: Accessing Navie Logs
+toc: true
+step: 7
+navie-reference: true
+description: "Reference Guide to AppMap Navie AI, how-to guide for accessing logs."
+---
+
+# Accessing Navie Logs
+
+## Visual Studio Code
+
+You can access the Navie logs in VS Code by opening the `Output` tab and selecting `AppMap: Services` from the list of available output logs.
+
+To open the Output window, on the menu bar choose View > Output, or press `Ctrl+Shift+U` on Windows or `Shift+Command+U` on Mac.
+
+![Open View in VS Code](/assets/img/docs/vscode-output-1.webp)
+
+Click the output log dropdown in the right corner to view a list of all the available output logs.
+
+![Open Output logs list](/assets/img/docs/vscode-output-2.webp)
+
+Select the `AppMap: Services` log to view the logs from Navie.
+
+![Select AppMap Services](/assets/img/docs/vscode-output-3.webp)
+
+## JetBrains
+
+You can enable debug logging of Navie in your JetBrains code editor by first opening `Help` > `Diagnostic Tools` > `Debug Log Settings`.
+
+![JetBrains Debug Log menu](/assets/img/jetbrains-debug-logs.webp)
+
+In the `Custom Debug Log Configuration`, enter `appland` to enable DEBUG-level logging for the AppMap plugin.
+
+![JetBrains Debug Log Configuration](/assets/img/jetbrains-logging-configuration.webp)
+
+Next, open `Help` > `Show Log...` to open the IDE log file.
+
+![JetBrains Debug Show Log](/assets/img/jetbrains-show-log.webp)
docs/navie-reference/navie-bring-your-own-model-examples.md

Lines changed: 202 additions & 0 deletions

@@ -0,0 +1,202 @@
+---
+layout: docs
+title: Docs - AppMap Navie
+description: "Reference Guide to AppMap Navie AI, examples of bring-your-own-llm configurations."
+name: Bring Your Own Model Examples
+navie-reference: true
+toc: true
+step: 5
+---
+
+# Bring Your Own Model Examples
+
+## GitHub Copilot Language Model
+
+Starting with VS Code `1.91` and greater, and with an active GitHub Copilot subscription, you can use Navie with the Copilot Language Model as a supported backend model. This allows you to leverage the powerful runtime-powered Navie AI Architect with your existing Copilot subscription. This is the recommended option for users in corporate environments where Copilot is the only approved and supported language model.
+
+#### Requirements <!-- omit in toc -->
+
+The following items are required to use the GitHub Copilot Language Model with Navie:
+
+- VS Code version `1.91` or greater
+- AppMap extension version `v0.123.0` or greater
+- GitHub Copilot VS Code extension must be installed
+- Signed in to an active paid or trial GitHub Copilot subscription
+
+#### Setup <!-- omit in toc -->
+
+Open the VS Code settings and search for `navie vscode`.
+
+<img class="video-screenshot" src="/assets/img/product/navie-copilot-1.webp"/>
+
+Check the box to use the `VS Code language model...`
+
+After checking the box to enable the VS Code LM, you'll be instructed to reload VS Code to apply the changes.
+
+<img class="video-screenshot" src="/assets/img/product/navie-copilot-2.webp"/>
+
+After VS Code finishes reloading, open the AppMap extension.
+
+Select `New Navie Chat`, and confirm the model listed is `(via copilot)`.
+
+<img class="video-screenshot" src="/assets/img/product/navie-copilot-3.webp"/>
+
+You'll need to allow the AppMap extension access to the Copilot Language Models. After asking your first question to Navie, click `Allow` in the popup to grant the necessary access.
+
+<img class="video-screenshot" src="/assets/img/product/navie-copilot-4.webp"/>
+
+#### Troubleshooting <!-- omit in toc -->
+
+If you attempt to enable the Copilot language models without the Copilot extension installed, you'll see the following error in your code editor.
+
+<img class="video-screenshot" src="/assets/img/product/navie-copilot-5.webp"/>
+
+Click `Install Copilot` to complete the installation for language model support.
+
+If you have the Copilot extension installed but have not signed in, you'll see the following notice.
+
+<img class="video-screenshot" src="/assets/img/product/navie-copilot-6.webp"/>
+
+Click `Sign in to GitHub` and log in with an account that has a valid paid or trial GitHub Copilot subscription.
+
+#### Video Demo <!-- omit in toc -->
+
+{% include vimeo.html id='992238965' %}
+
+## OpenAI
+
+**Note:** We recommend configuring your OpenAI key using the code editor extension. Follow the [Bring Your Own Key](/docs/using-navie-ai/bring-your-own-model.html#configuring-your-openai-key) docs for instructions.
+
+Only `OPENAI_API_KEY` needs to be set; other settings can stay at their defaults:
+
+| `OPENAI_API_KEY` | `sk-9spQsnE3X7myFHnjgNKKgIcGAdaIG78I3HZB4DFDWQGM` |
+
+When using your own OpenAI API key, you can also change which OpenAI model Navie uses. For example, you may want to use `gpt-3.5` or a preview model like `gpt-4-vision-preview`:
+
+| `APPMAP_NAVIE_MODEL` | `gpt-4-vision-preview` |
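The settings above are plain environment variables, so one way to supply them is from the shell that launches your code editor. A minimal sketch, with placeholder values only (not working credentials), assuming the editor is started from the same shell so the extension inherits the variables:

```shell
# Placeholder key for illustration -- substitute your real OpenAI API key.
export OPENAI_API_KEY="sk-your-real-key-here"

# Optional: override the default model Navie uses.
export APPMAP_NAVIE_MODEL="gpt-4-vision-preview"
```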
+
+## Anthropic (Claude)
+
+AppMap supports the Anthropic suite of large language models such as Claude Sonnet or Claude Opus.
+
+To use AppMap Navie with Anthropic LLMs, you need to generate an API key for your account.
+
+Log in to your [Anthropic dashboard](https://console.anthropic.com/dashboard), and choose the option to "Get API Keys".
+
+Click the box to "Create Key".
+
+![Anthropic Create Key](/assets/img/product/create-anthropic-key.webp)
+
+In the next box, give your key an easy-to-recognize name.
+
+![Anthropic Key Name](/assets/img/product/give-anthropic-key-name.webp)
+
+In your VS Code or JetBrains editor, configure the following environment variables. For more details on configuring these environment variables in your VS Code or JetBrains editor, refer to the [AppMap BYOK documentation](/docs/using-navie-ai/bring-your-own-model.html#configuration).
+
+| `ANTHROPIC_API_KEY` | `sk-ant-api03-8SgtgQrGB0vTSsB_DeeIZHvDrfmrg` |
+| `APPMAP_NAVIE_MODEL` | `claude-3-5-sonnet-20240620` |
+
+When setting the `APPMAP_NAVIE_MODEL`, refer to the [Anthropic documentation](https://docs.anthropic.com/en/docs/intro-to-claude#model-options) for the latest available models to choose from.
+
+#### Video Demo <!-- omit in toc -->
+
+{% include vimeo.html id='1003330117' %}
+
+## Azure OpenAI
+
+Assuming you [created](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/create-resource) a `navie` GPT-4 deployment on the `contoso.openai.azure.com` OpenAI instance:
+
+| `AZURE_OPENAI_API_KEY` | `e50edc22e83f01802893d654c4268c4f` |
+| `AZURE_OPENAI_API_VERSION` | `2024-02-01` |
+| `AZURE_OPENAI_API_INSTANCE_NAME` | `contoso` |
+| `AZURE_OPENAI_API_DEPLOYMENT_NAME` | `navie` |
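The same four settings could be sketched as shell exports before launching the editor; the values below are the illustrative ones from the table (the key is not a working credential), and your own resource and deployment names would replace `contoso` and `navie`:

```shell
# Illustrative Azure OpenAI values -- substitute your own resource details.
export AZURE_OPENAI_API_KEY="e50edc22e83f01802893d654c4268c4f"
export AZURE_OPENAI_API_VERSION="2024-02-01"
export AZURE_OPENAI_API_INSTANCE_NAME="contoso"   # resource: contoso.openai.azure.com
export AZURE_OPENAI_API_DEPLOYMENT_NAME="navie"   # your GPT-4 deployment name
```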
+
+## AnyScale Endpoints
+
+[AnyScale Endpoints](https://www.anyscale.com/endpoints) allows querying a selection of open-source LLMs. After you create an account, you can use it by setting:
+
+| `OPENAI_API_KEY` | `esecret_myxfwgl1iinbz9q5hkexemk8f4xhcou8` |
+| `OPENAI_BASE_URL` | `https://api.endpoints.anyscale.com/v1` |
+| `APPMAP_NAVIE_MODEL` | `mistralai/Mixtral-8x7B-Instruct-v0.1` |
+
+Consult the [AnyScale documentation](https://docs.endpoints.anyscale.com/) for model names. Note that we recommend using Mixtral models with Navie.
+
+#### Anyscale Demo with VS Code <!-- omit in toc -->
+
+{% include vimeo.html id='970914908' %}
+
+#### Anyscale Demo with JetBrains <!-- omit in toc -->
+
+{% include vimeo.html id='970914884' %}
+
+## Fireworks AI
+
+You can use [Fireworks AI](https://fireworks.ai/) and their serverless or on-demand models as a compatible backend for AppMap Navie AI.
+
+After creating an account on Fireworks AI, you can configure your Navie environment settings:
+
+| `OPENAI_API_KEY` | `WBYq2mKlK8I16ha21k233k2EwzGAJy3e0CLmtNZadJ6byfpu7c` |
+| `OPENAI_BASE_URL` | `https://api.fireworks.ai/inference/v1` |
+| `APPMAP_NAVIE_MODEL` | `accounts/fireworks/models/mixtral-8x22b-instruct` |
+
+Consult the [Fireworks AI documentation](https://fireworks.ai/models) for a full list of the available models they currently support.
+
+#### Video Demo <!-- omit in toc -->
+
+{% include vimeo.html id='992941358' %}
+
+## Ollama
+
+You can use [Ollama](https://ollama.com/) to run Navie with local models. After you've successfully run a model with the `ollama run` command, you can configure Navie to use it:
+
+| `OPENAI_API_KEY` | `dummy` |
+| `OPENAI_BASE_URL` | `http://127.0.0.1:11434/v1` |
+| `APPMAP_NAVIE_MODEL` | `mixtral` |
+
+**Note:** Even though the model is running locally, a dummy placeholder API key is still required.
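End to end, the Ollama table above might look like the following sketch; `mixtral` is just the example model from the table, and the commented command assumes Ollama is already installed locally:

```shell
# Pull and start a local model first (assumes Ollama is installed):
#   ollama run mixtral
# Then point Navie's OpenAI-compatible settings at the local server.
export OPENAI_API_KEY="dummy"                      # placeholder; Ollama ignores it
export OPENAI_BASE_URL="http://127.0.0.1:11434/v1"
export APPMAP_NAVIE_MODEL="mixtral"
```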
+
+## LM Studio
+
+You can use [LM Studio](https://lmstudio.ai/) to run Navie with local models.
+
+After downloading a model to run, select the option to run a local server.
+
+<img class="video-screenshot" src="/assets/img/product/lmstudio-run-local-server.webp"/>
+
+In the next window, select which model you want to load into the local inference server.
+
+<img class="video-screenshot" src="/assets/img/product/lmstudio-load-model.webp"/>
+
+After loading your model, you can confirm it's successfully running in the logs.
+
+**Note:** Save the URL it's running under to use for the `OPENAI_BASE_URL` environment variable.
+
+For example: `http://localhost:1234/v1`
+
+<img class="video-screenshot" src="/assets/img/product/lmstudio-confirm-running.webp"/>
+
+In the `Model Inspector`, copy the name of the model and use this for the `APPMAP_NAVIE_MODEL` environment variable.
+
+For example: `Meta-Llama-3-8B-Instruct-imatrix`
+
+<img class="video-screenshot" src="/assets/img/product/lmstudio-model-inspector.webp"/>
+
+Continue to configure your local environment with the following environment variables based on your LM Studio configuration. Refer to the [documentation above](#bring-your-own-model-byom) for steps specific to your code editor.
+
+| `OPENAI_API_KEY` | `dummy` |
+| `OPENAI_BASE_URL` | `http://localhost:1234/v1` |
+| `APPMAP_NAVIE_MODEL` | `Meta-Llama-3-8B-Instruct-imatrix` |
+
+**Note:** Even though the model is running locally, a dummy placeholder API key is still required.
+
+{% include vimeo.html id='969002308' %}
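The LM Studio settings could likewise be sketched as shell exports. The values here are illustrative (port `1234` and the model name must match your own LM Studio configuration), and the commented `curl` check assumes the local server is already running:

```shell
# Illustrative values -- match them to your LM Studio configuration.
export OPENAI_API_KEY="dummy"                      # placeholder; required but unused locally
export OPENAI_BASE_URL="http://localhost:1234/v1"  # the URL LM Studio reports in its logs
export APPMAP_NAVIE_MODEL="Meta-Llama-3-8B-Instruct-imatrix"

# Optional sanity check once the server is running (left commented out here):
#   curl -s "$OPENAI_BASE_URL/models"
```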
