Problem Description
The Copilot system is experiencing context-overflow issues because an extremely large SYSTEM_PROMPT is added to every chat conversation. This causes:
- Context limit exhaustion - The prompt fills up available context quickly
- Performance degradation - Slower response times
- Higher API costs - More tokens per request
- Poor user experience - Copilot becomes unresponsive
Root Cause Analysis
The SYSTEM_PROMPT in /apps/rowboat/src/application/lib/copilot/copilot.ts includes:
- COPILOT_INSTRUCTIONS_MULTI_AGENT (263 lines) - ✅ Acceptable
- COPILOT_MULTI_AGENT_EXAMPLE_1 (1,329 lines) - ❌ MAJOR ISSUE
- CURRENT_WORKFLOW_PROMPT (13 lines) - ✅ Acceptable
- Auto-loaded docs from copilot_multi_agent.ts - ❌ POTENTIAL ISSUE
Total context size: ~1,600+ lines added to every chat!
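To illustrate the problem, here is a minimal sketch of how a prompt assembled this way grows: every constant is concatenated unconditionally, so the 1,329-line example dominates the result. The constant names match those listed above, but the assembly code itself is an assumption, not the actual contents of copilot.ts.

```typescript
// Hypothetical sketch of SYSTEM_PROMPT assembly (assumed structure;
// the real copilot.ts may compose these differently).
const COPILOT_INSTRUCTIONS_MULTI_AGENT = "instructions\n".repeat(263);
const COPILOT_MULTI_AGENT_EXAMPLE_1 = "example\n".repeat(1329); // dominates the prompt
const CURRENT_WORKFLOW_PROMPT = "workflow\n".repeat(13);

// Unconditional concatenation: the full example ships with EVERY request.
const SYSTEM_PROMPT = [
  COPILOT_INSTRUCTIONS_MULTI_AGENT,
  COPILOT_MULTI_AGENT_EXAMPLE_1,
  CURRENT_WORKFLOW_PROMPT,
].join("\n");

const totalLines = SYSTEM_PROMPT.split("\n").filter((l) => l.length > 0).length;
console.log(totalLines); // 1605 lines of prompt before the user says anything
```

Because the example alone is ~83% of those lines, moving it behind an on-demand lookup (rather than inlining it) would shrink the per-request prompt dramatically.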
Critical Issue: Token Consumption
Current Problem: Even a simple chat hits the 30,000-token limit immediately because of the massive SYSTEM_PROMPT. This means:
- Users can't have meaningful conversations
- Copilot becomes unresponsive after just a few messages
- The system is essentially unusable for any real workflow
Token Breakdown:
- SYSTEM_PROMPT: ~25,000+ tokens (1,600+ lines)
- User message: ~100-500 tokens
- Total: 30,000+ tokens (hitting the limit!)
bhupesh-sf