Then, build and run the app from Xcode by opening `VoiceAgent.xcodeproj`. You may need to adjust your app signing settings to run the app on your device.
> [!NOTE]
> To set up without the LiveKit CLI, clone the repository, then either create a `VoiceAgent/.env.xcconfig` with a `LIVEKIT_SANDBOX_ID` (if using a [Sandbox Token Server](https://cloud.livekit.io/projects/p_/sandbox/templates/token-server)), or modify `VoiceAgent/VoiceAgentApp.swift` to replace the `SandboxTokenSource` with a custom token source implementation.
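If you take the sandbox route, the `.env.xcconfig` needs only the one key. A minimal example (the value is a placeholder for your own sandbox ID):

```
// VoiceAgent/.env.xcconfig
LIVEKIT_SANDBOX_ID = your-sandbox-id
```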
## Feature overview
This starter app supports a number of features of the agents framework, and lets you easily enable or disable them in code as you adapt this template to your own use case.
### Text, video, and voice input
This app supports text, video, and/or voice input according to the needs of your agent. To update the features enabled in the app, edit `VoiceAgent/VoiceAgentApp.swift` and update `Features` to include or exclude the features you need.
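As an illustration only (the real `Features` declaration lives in `VoiceAgent/VoiceAgentApp.swift` and may differ), a Swift `OptionSet` makes this kind of toggle composable — the stand-in type below shows the shape:

```swift
// Illustrative stand-in for the app's `Features` type; check
// VoiceAgent/VoiceAgentApp.swift for the actual declaration.
struct Features: OptionSet {
    let rawValue: Int
    static let voice = Features(rawValue: 1 << 0)
    static let video = Features(rawValue: 1 << 1)
    static let text  = Features(rawValue: 1 << 2)
    static let all: Features = [.voice, .video, .text]
}

// Enable only voice and text input, leaving video disabled:
let enabled: Features = [.voice, .text]
assert(enabled.contains(.voice) && !enabled.contains(.video))
```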
By default, all features (voice, video, and text input) are enabled.
Available input types:
- `.voice`: Allows the user to speak to the agent using their microphone. **Requires microphone permissions.**
If you have trouble with screen sharing, refer to [the docs](https://docs.livekit.io/home/client/tracks/screenshare/) for more setup instructions.
### Session
The app is built on top of two main observable components from the [LiveKit Swift SDK](https://github.com/livekit/client-sdk-swift):
- `Session`: an object to connect to the LiveKit infrastructure, interact with the `Agent` and its local state, and send/receive text messages.
- `LocalMedia`: an object to manage the local media tracks (audio, video, screen sharing) and their lifecycle.
This app enables `preConnectAudio` by default to capture and buffer audio before the room connection completes. This allows the connection to appear "instant" from the user's perspective and makes your app more responsive. To disable this feature, set `preConnectAudio` to `false` in `SessionOptions` when creating the `Session`.
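The real `SessionOptions` and `Session` types come from the LiveKit Swift SDK and their initializers may differ; the stand-in types below only illustrate the shape of the call:

```swift
// Stand-in types to illustrate the shape of the call; the actual
// `SessionOptions`/`Session` come from the LiveKit Swift SDK and
// may take additional parameters.
struct SessionOptions {
    // Buffers microphone audio before the room connection completes.
    var preConnectAudio: Bool = true
}

struct Session {
    let options: SessionOptions
}

// Disable the pre-connect audio buffer:
let session = Session(options: SessionOptions(preConnectAudio: false))
assert(session.options.preConnectAudio == false)
```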
### Virtual avatar support
If your agent publishes a [virtual avatar](https://docs.livekit.io/agents/integrations/avatar/), this app will automatically render the avatar's camera feed in `AgentView` when available.
## Token generation in production
In a production environment, you are responsible for developing a solution to [generate tokens for your users](https://docs.livekit.io/home/server/generating-tokens/) that is integrated with your authentication system. Replace the `SandboxTokenSource` with an `EndpointTokenSource` or your own `TokenSourceConfigurable` implementation. Additionally, you can use the `.cached()` extension to cache valid tokens and avoid unnecessary token requests.
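As a sketch only — the endpoint URL is a placeholder, and the exact `EndpointTokenSource` initializer should be checked against the SDK — swapping in your own token server might look like:

```swift
// Hypothetical sketch: the URL is a placeholder for your backend, and
// the EndpointTokenSource initializer may differ in the actual SDK.
let tokenSource = EndpointTokenSource(
    url: URL(string: "https://your-backend.example.com/token")!
).cached() // cache valid tokens to avoid unnecessary requests
```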