Prompt is essentially a string, but it should behave somewhat differently from a plain string:

👨 **Role & Concatenation**: Prompt strings should have designated roles (e.g., `system`, `user`, `assistant`) and should be concatenated in a specific manner.

🦆 **Binding Functions**: A prompt string contains logic and instructions, so having some binding functions for AI-related tasks is beneficial and necessary (e.g., converting to the OpenAI message format).

## Features

`prompt-string` provides two types:

- `P` for prompt, inherits from `string`. Length, slicing, and concatenation are modified, and new attributes like `.role` are supported.
  - `p = P("You're a helpful assistant")`
- `PC` for prompt chain, acts like a `list[P]`. Links a series of prompts and supports `.messages(...)`.
  - `pc = p1 / p2 / p3`
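The two types above can be sketched as a `str` subclass that carries a role, plus a list-like chain built by `/`. This is a rough illustration of the idea, not the library's actual implementation; the `PromptSketch` class and its simplified `__truediv__` are hypothetical stand-ins.

```python
# Illustrative sketch of the P / PC idea; NOT prompt-string's real code.
class PromptSketch(str):
    """A str subclass that carries a role, like prompt-string's P."""

    def __new__(cls, text, role=None):
        obj = super().__new__(cls, text)
        obj.role = role  # extra attribute a plain str cannot hold
        return obj

    def __truediv__(self, other):
        # `/` links prompts into a chain; prompt-string's PC acts like list[P].
        # (Simplified: real chaining of three or more prompts needs more logic.)
        return [self, other]


p = PromptSketch("You're a helpful assistant", role="system")
q = PromptSketch("How are you?", role="user")
chain = p / q

print(chain[0].role, chain[1].role)  # system user
```

Because the sketch inherits from `str`, every ordinary string operation still works on it; only the extra attributes are new.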
```python
print("Decoded result of the second token", prompt[2])
print("The decoded result of first five tokens", prompt[:5])
```
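The token-based length and indexing shown above can be imitated with a toy class. This sketch splits on whitespace purely to show the indexing semantics; `prompt-string` itself works on decoded model tokens, and the `TokenText` class here is hypothetical.

```python
# Toy illustration of token-based indexing; whitespace splitting stands in
# for a real tokenizer, purely to show the indexing semantics.
class TokenText:
    def __init__(self, text):
        self._tokens = text.split()

    def __len__(self):
        return len(self._tokens)  # length counts tokens, not characters

    def __getitem__(self, key):
        picked = self._tokens[key]
        # A slice decodes back to a string; a single index returns one token.
        return " ".join(picked) if isinstance(key, slice) else picked


t = TokenText("you're a helpful assistant for coding")
print(len(t))   # 6
print(t[2])     # helpful
print(t[:5])    # you're a helpful assistant for
```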

`P` supports some `str` native methods that still return a `P` object:

- `.format`
- `.replace`

```python
prompt = P("you're a helpful assistant. {temp}")

print(len(prompt.format(temp="End of instructions")))
print(len(prompt.replace("{temp}", "")))
```

> 🧐 Raise an issue if you think other methods should be supported

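Overriding these methods matters because the native `str` versions return a plain `str`, silently dropping the prompt's extra attributes. A minimal sketch of the re-wrapping idea, using a hypothetical `RolledStr` class rather than the library's code:

```python
class RolledStr(str):
    """Sketch: a str subclass whose format/replace keep the subclass type."""

    def __new__(cls, text, role=None):
        obj = super().__new__(cls, text)
        obj.role = role
        return obj

    def format(self, *args, **kwargs):
        # str.format returns a plain str, so re-wrap and carry the role over.
        return RolledStr(str.format(self, *args, **kwargs), role=self.role)

    def replace(self, old, new, count=-1):
        return RolledStr(str.replace(self, old, new, count), role=self.role)


s = RolledStr("you're a helpful assistant. {temp}", role="system")
out = s.format(temp="End of instructions")
out2 = s.replace("{temp}", "")

print(type(out).__name__, out.role)   # RolledStr system
print(type(out2).__name__)            # RolledStr
```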
#### Role

```python
from prompt_string import P

sp = P("you're a helpful assistant.", role="system")
up = P("How are you?", role="user")

print(sp.role, up.role, (sp+up).role)
print(sp + up)

print(sp.message())
```

- role can be `None` or `str` for `P`
  - For a single prompt, like `sp`, the role is `str` (*e.g.* `system`) or `None`
- `sp+up` concatenates the two prompt strings and generates a new `P`, whose role is updated if the latter one has a role.
  - For example, `sp+up`'s role is `user`, and `sp+P('Hi')`'s role is `system`
- `.message(...)` returns a JSON object of this prompt.
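The role-update rule for `+` can be stated as a one-liner. This is an illustrative helper mirroring the examples above, not a function the library exposes:

```python
def merged_role(left_role, right_role):
    """Sketch of P concatenation's role rule: the latter prompt's role
    wins when it has one; otherwise the former's role is kept."""
    return right_role if right_role is not None else left_role


# Mirrors the documented examples: sp+up -> 'user', sp+P('Hi') -> 'system'
print(merged_role("system", "user"))  # user
print(merged_role("system", None))    # system
```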

#### Concatenation

```python
pc = sp / up

print(pc.roles)
print(pc.messages())
```

For concatenated prompts, like `sp / up`, the type is converted to `PC` (prompt chain). `PC` provides the following:

- `.roles`, a list of roles. For example, `(sp / up).roles` is `['system', 'user']`.
- `.messages(...)` packs the prompts into OpenAI-compatible messages JSON, which you can pass directly to `client.chat.completions.create(messages=...)`.
  - `messages` assumes the first role is `user`, then proceeds in user-assistant order. When a prompt has an explicit role, that role is used. Check `pc.infer_role` for the final roles in the messages.
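One plausible reading of that inference rule can be sketched as follows. This is an assumption about the behavior, not the library's implementation; `pc.infer_role` is the authoritative result.

```python
def infer_roles(roles):
    """Sketch of the user/assistant alternation described above: keep any
    explicitly assigned role; otherwise continue the alternation, starting
    from 'user'. One plausible reading only; check pc.infer_role for the
    library's actual result."""
    inferred, prev = [], None
    for role in roles:
        if role is None:
            # Alternate: after a 'user' turn comes 'assistant', else 'user'.
            role = "assistant" if prev == "user" else "user"
        inferred.append(role)
        prev = role
    return inferred


print(infer_roles([None, None, None]))      # ['user', 'assistant', 'user']
print(infer_roles(["system", None, None]))  # ['system', 'user', 'assistant']
```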

## Few promises in `prompt-string`

- `P` inherits from `string`. Therefore, aside from the mentioned features, its other behaviors are just like those of a `string` in Python.
- `prompt-string` won't add OpenAI and other AI SDKs as dependencies; it is simply a toolkit for prompts.
- `prompt-string` will be super light and fast, with no heavy processes running behind the scenes.