3 files changed: +66 -7 lines changed

@@ -12,7 +12,7 @@ If you want to make your own profile, start with the [Template Profile](https://

 To apply a Profile to an Open Interpreter session, you can run `interpreter --profile <name>`

-# Example Profile
+# Example Python Profile

 ```Python
 from interpreter import interpreter
@@ -29,6 +29,31 @@ interpreter.auto_run = True
 interpreter.loop = True
 ```

+# Example YAML Profile
+
+<Info>Make sure the YAML profile's `version` is set to 0.2.5</Info>
+
+```YAML
+llm:
+  model: "gpt-4o"
+  temperature: 0
+  # api_key: ...  # Your API key, if the API requires it
+  # api_base: ...  # The URL where an OpenAI-compatible server is running to handle LLM API requests
+
+# Computer Settings
+computer:
+  import_computer_api: True # Gives OI a helpful Computer API designed for code interpreting language models
+
+# Custom Instructions
+custom_instructions: "" # This will be appended to the system message
+
+# General Configuration
+auto_run: False # If True, code will run without asking for confirmation
+offline: False # If True, will disable some online features like checking for updates
+
+version: 0.2.5 # Configuration file version (do not modify)
+```
+
 <Tip>
   There are many settings that can be configured. [See them all
   here](/settings/all-settings)
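
The YAML profile and the Python profile configure the same settings. As a point of reference, here is a minimal Python sketch of the YAML example above, assuming each YAML key maps onto the attribute of the same name on the `interpreter` object (e.g. `llm.model` → `interpreter.llm.model`), the same pattern the Python example uses for `auto_run`; treat the exact attribute names as unverified.

```Python
from interpreter import interpreter

# LLM settings (mirrors the `llm:` block of the YAML profile)
interpreter.llm.model = "gpt-4o"
interpreter.llm.temperature = 0
# interpreter.llm.api_key = "..."   # Your API key, if the API requires it
# interpreter.llm.api_base = "..."  # URL of an OpenAI-compatible server

# Computer settings
interpreter.computer.import_computer_api = True  # Helpful Computer API for code-interpreting models

# Custom instructions, appended to the system message
interpreter.custom_instructions = ""

# General configuration
interpreter.auto_run = False  # If True, code runs without asking for confirmation
interpreter.offline = False   # If True, disables some online features like update checks
```

Either profile is applied the same way, with `interpreter --profile <name>`. The rest of this diff registers a new Troubleshooting group in the docs navigation and adds the FAQ page it points to.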
   "navigation": [
     {
       "group": "Getting Started",
-      "pages": ["getting-started/introduction", "getting-started/setup"]
+      "pages": [
+        "getting-started/introduction",
+        "getting-started/setup"
+      ]
     },
     {
       "group": "Guides",
@@ ... @@
     },
     {
       "group": "Settings",
-      "pages": ["settings/all-settings"]
+      "pages": [
+        "settings/all-settings"
+      ]
     },
     {
       "group": "Language Models",
@@ ... @@
     },
     {
       "group": "Protocols",
-      "pages": ["protocols/lmc-messages"]
+      "pages": [
+        "protocols/lmc-messages"
+      ]
     },
     {
       "group": "Integrations",
-      "pages": ["integrations/e2b", "integrations/docker"]
+      "pages": [
+        "integrations/e2b",
+        "integrations/docker"
+      ]
     },
     {
       "group": "Safety",
@@ ... @@
         "safety/best-practices"
       ]
     },
+    {
+      "group": "Troubleshooting",
+      "pages": [
+        "troubleshooting/faq"
+      ]
+    },
     {
       "group": "Telemetry",
-      "pages": ["telemetry/telemetry"]
+      "pages": [
+        "telemetry/telemetry"
+      ]
     }
   ],
   "feedback": {
@@ ... @@
     "youtube": "https://www.youtube.com/@OpenInterpreter",
     "linkedin": "https://www.linkedin.com/company/openinterpreter"
   }
-}
+}
+---
+title: "FAQ"
+description: "Frequently Asked Questions"
+---
+
+<Accordion title="Does Open Interpreter ensure that my data doesn't leave my computer?">
+  As long as you're using a local language model, your messages / personal info
+  won't leave your computer. If you use a cloud model, we send your messages +
+  custom instructions to the model. We also have a basic telemetry
+  [function](https://github.com/OpenInterpreter/open-interpreter/blob/main/interpreter/core/core.py#L167)
+  (copied over from ChromaDB's telemetry) that anonymously tracks usage. It
+  only lets us know that a message was sent and includes no PII. OI errors are
+  also reported through it, including the exception string. Detailed docs on
+  all of this are [here](/telemetry/telemetry), and you can opt out by running
+  with `--local`, `--offline`, or `--disable_telemetry`.
+</Accordion>
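
A note on the opt-outs mentioned in the FAQ: `--local`, `--offline`, and `--disable_telemetry` are CLI flags, but the same behavior can be reached from a Python profile. The sketch below is hedged: `offline` appears in the YAML profile earlier in this diff, while `disable_telemetry` as a Python attribute is only an assumption inferred from the CLI flag name, so verify it against `interpreter/core/core.py` before relying on it.

```Python
from interpreter import interpreter

# Disable some online features such as update checks (see the YAML profile above);
# pair this with a local model if you want messages to stay on your machine.
interpreter.offline = True

# Assumed attribute mirroring the --disable_telemetry CLI flag (unverified).
interpreter.disable_telemetry = True
```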