Large SYSTEM_PROMPT causing context overflow #281

@iaminnasr

Description

Problem Description

The copilot system is experiencing context overflow issues due to an extremely large SYSTEM_PROMPT that gets added to every chat conversation. This is causing:

  • Context limit exhaustion - The prompt fills up available context quickly
  • Performance degradation - Slower response times
  • Higher API costs - More tokens per request
  • Poor user experience - Copilot becomes unresponsive

Root Cause Analysis

The SYSTEM_PROMPT in /apps/rowboat/src/application/lib/copilot/copilot.ts includes:

  1. COPILOT_INSTRUCTIONS_MULTI_AGENT (263 lines) - ✅ Acceptable
  2. COPILOT_MULTI_AGENT_EXAMPLE_1 (1,329 lines) - ❌ MAJOR ISSUE
  3. CURRENT_WORKFLOW_PROMPT (13 lines) - ✅ Acceptable
  4. Auto-loaded docs from copilot_multi_agent.ts - ❌ POTENTIAL ISSUE

Total context size: ~1,600+ lines added to every chat!
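The arithmetic behind that total can be sketched as follows. This is not the actual `copilot.ts` source; the constant names are taken from the list above, and the line counts are the approximate figures reported there.

```typescript
// Approximate line counts of each static prompt part, per the breakdown
// above (names from the issue; counts are reported estimates).
const partLineCounts: Record<string, number> = {
  COPILOT_INSTRUCTIONS_MULTI_AGENT: 263,
  COPILOT_MULTI_AGENT_EXAMPLE_1: 1329,
  CURRENT_WORKFLOW_PROMPT: 13,
};

// Sum the parts to see how much static text is prepended to every chat.
const totalLines = Object.values(partLineCounts).reduce((a, b) => a + b, 0);
console.log(totalLines); // 1605 lines, before any auto-loaded docs
```

Note that `COPILOT_MULTI_AGENT_EXAMPLE_1` alone accounts for over 80% of the total, which is why it is flagged as the major issue.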

Critical Issue: Token Consumption

Current Problem: Even a simple chat hits the 30,000-token limit almost immediately because of the massive SYSTEM_PROMPT. This means:

  • Users can't have meaningful conversations
  • Copilot becomes unresponsive after just a few messages
  • The system is essentially unusable for any real workflow

Token Breakdown:

  • SYSTEM_PROMPT: ~25,000+ tokens (1,600+ lines)
  • User message: ~100-500 tokens
  • Total: ~25,500+ tokens per request, which reaches the 30,000-token limit within a few messages as conversation history accumulates
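A back-of-the-envelope check shows the ~25,000-token estimate is plausible. The characters-per-line and characters-per-token averages below are rough heuristics for English prose, not measured values from the real prompt.

```typescript
// Rough token estimate for the static prompt. Both averages are
// assumptions: ~64 characters per line and the common heuristic of
// ~4 characters per token for English text.
const promptLines = 1600;     // approximate size from the breakdown above
const avgCharsPerLine = 64;   // assumed average
const charsPerToken = 4;      // rough heuristic, not a measured value
const systemPromptTokens = (promptLines * avgCharsPerLine) / charsPerToken;
console.log(systemPromptTokens); // 25600 — consistent with "~25,000+"
```

With the system prompt alone consuming ~85% of a 30,000-token budget, only a handful of user/assistant turns fit before the context overflows.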
