Conversation

@micli micli commented Sep 19, 2025

For environments where outbound traffic must pass through a proxy server, a --proxy parameter has been added to configure the web proxy during benchmarking. You can add the following content to launch.json to test the proxy feature.

{
    "version": "0.2.0",
    "configurations": [

        {
            "name": "Python Debugging: Module",
            "type": "debugpy",
            "request": "launch",
            "module": "benchmark.bench",
            "env": {
                "OPENAI_API_KEY": "<Azure OpenAI Key>"
            },
            "args": [
                "load",
                "--requests", "1000",
                "--deployment", "gpt-4.1",
                "--rate", "60",
                "--retry", "exponential",
                "--proxy", "http://<Proxy server name/IP>:<port>",
                "https://<AOAI resource name>.openai.azure.com/"
            ]
        }
    ]
}
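For reference, the same configuration can be run directly from the command line, assuming the module is invoked the same way as in the debug configuration above. The resource name, deployment, proxy host, and port are placeholders to fill in:

```shell
# Run the benchmark through a web proxy; replace the placeholders
# with your Azure OpenAI key, resource, and proxy details.
export OPENAI_API_KEY="<Azure OpenAI Key>"

python -m benchmark.bench load \
    --requests 1000 \
    --deployment gpt-4.1 \
    --rate 60 \
    --retry exponential \
    --proxy "http://<Proxy server name/IP>:<port>" \
    "https://<AOAI resource name>.openai.azure.com/"
```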
