74 changes: 72 additions & 2 deletions load-tests/README.MD
@@ -78,6 +78,7 @@ usage: stability_test.py [-h] [--url BASE_URL] [--csv CSV_FILE]
[--concurrency CONCURRENCIES] [--duration TEST_DURATION]
[--sla SLA_THRESHOLD] [--error-threshold ERROR_THRESHOLD]
[--skip-header] [-v] [--cooldown COOLDOWN]
[--network NETWORK]
[--endpoints SELECTED_ENDPOINTS] [--list-endpoints]

Cardano Rosetta API Stability Testing Tool
@@ -102,6 +103,7 @@ options:
--skip-header Skip the header row in the CSV file (default: False)
-v, --verbose Enable verbose output (default: False)
--cooldown COOLDOWN Cooldown period in seconds between endpoint tests (default: 60)
--network NETWORK Network identifier for API requests: mainnet or preprod (default: mainnet)
--endpoints SELECTED_ENDPOINTS
Comma-separated list of endpoint names or paths to test (e.g. "Network Status,Block"
or "/account/balance,/block"). If not specified, all endpoints will be tested.
@@ -140,10 +142,22 @@ Test only specific endpoints by path:
./load-tests/stability_test.py --endpoints "/network/status,/block,/account/balance"
```

Test only search/transactions endpoint with stake address data:
Test search/transactions by hash lookup:

```bash
./load-tests/stability_test.py --endpoints "/search/transactions" --csv load-tests/data/mainnet-data-stake-address.csv
./load-tests/stability_test.py --endpoints "Search Transactions by Hash"
```

Test search/transactions by address (more resource-intensive):

```bash
./load-tests/stability_test.py --endpoints "Search Transactions by Address" --csv load-tests/data/mainnet-data.csv
```

Test on preprod network:

```bash
./load-tests/stability_test.py --network preprod --url http://127.0.0.1:8082 --csv load-tests/data/preprod-data.csv
```

List all available endpoints without running tests:
@@ -164,6 +178,62 @@ Test with custom SLA and error thresholds:
./load-tests/stability_test.py --sla 500 --error-threshold 0.5
```

## Test Data

### CSV Format Requirements

Each CSV row must have 6 fields with specific associations:

```
address,block_index,block_hash,transaction_size,relative_ttl,transaction_hash
```
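
For example, the preprod row added in this PR (see `load-tests/data/preprod-data.csv` below) fills all six fields:

```
addr_test1wzn5ee2qaqvly3hx7e0nk3vhm240n5muq3plhjcnvx9ppjgf62u6a,4070700,6b1b29d0533a86443140a88d3758f26fa9d4a8954363e78818b3235126ba933b,683,1000,bf540a825d5d40af7435801ce6adcac010f3f9f29ae102aee8cff8007f68c3d4
```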

**Critical Data Associations (Implicit Rules):**

1. **Block Consistency**: The `block_hash` MUST be the hash of the block at `block_index`
2. **Transaction in Block**: The `transaction_hash` MUST exist in the specified block (`block_hash`)
3. **Address in Transaction**: The `address` MUST be involved in the transaction (appear in operations as input/output)
4. **Transaction Size**: The `transaction_size` MUST match the actual size of the transaction in bytes
5. **Valid Address**: The `address` MUST have a balance and UTXO history (for account endpoints)
6. **TTL Value**: The `relative_ttl` is used by the construction/metadata endpoint (1000 is the standard value)

These associations ensure all 8 endpoints can successfully use the same data row:
- Network Status: No specific data needed
- Account Balance/Coins: Requires valid address with balance
- Block: Requires valid block_index and block_hash
- Block Transaction: Requires transaction in specified block
- Search by Hash: Requires valid transaction_hash
- Search by Address: Requires address involved in transactions
- Construction Metadata: Requires transaction_size and relative_ttl
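
These rules can be spot-checked against a running Rosetta instance before a test run. The following is a minimal, illustrative sketch, not part of `stability_test.py`; it verifies rules 1-3 for each row of a CSV file, assumes the `requests` package is installed, and uses placeholder `BASE_URL` and `NETWORK` values that you should adjust for your deployment:

```python
#!/usr/bin/env python3
"""Spot-check CSV rows against a running Rosetta instance (illustrative only)."""
import csv
import sys

import requests

BASE_URL = "http://127.0.0.1:8082"  # assumption: a locally reachable instance
NETWORK = "preprod"                 # must match the network of the CSV file
NET_ID = {"blockchain": "cardano", "network": NETWORK}


def check_row(row):
    address, block_index, block_hash, _tx_size, _ttl, tx_hash = row

    # Rules 1-2: the block at block_index has block_hash and contains tx_hash.
    block = requests.post(f"{BASE_URL}/block", json={
        "network_identifier": NET_ID,
        "block_identifier": {"index": int(block_index), "hash": block_hash},
    }).json()["block"]
    assert block["block_identifier"]["hash"] == block_hash, "block hash mismatch"
    tx_hashes = {t["transaction_identifier"]["hash"] for t in block["transactions"]}
    assert tx_hash in tx_hashes, f"{tx_hash} not in block {block_index}"

    # Rule 3: the address appears in the transaction's operations.
    tx = requests.post(f"{BASE_URL}/block/transaction", json={
        "network_identifier": NET_ID,
        "block_identifier": {"index": int(block_index), "hash": block_hash},
        "transaction_identifier": {"hash": tx_hash},
    }).json()["transaction"]
    addresses = {op["account"]["address"]
                 for op in tx["operations"] if op.get("account")}
    assert address in addresses, "address not found in transaction operations"


if __name__ == "__main__":
    with open(sys.argv[1], newline="") as f:
        for row in csv.reader(f):
            if row and row[0] != "address":  # skip the header row if present
                check_row(row)
                print("OK:", row[5])
```

Rules 4-5 (transaction size and address balance) are easiest to confirm by simply running the Construction Metadata and Account Balance endpoints against the row.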

### Available Data Files

The `data/` directory contains pre-validated CSV files for different networks:

#### Mainnet Data (`mainnet-data.csv`)
- **Block**: 11573705
- **Transaction**: 3a954835b69ca01ff9cf3b30ce385d5d9ef0cea502bd0f2ad156684dfbaf325a
- **Address**: addr1qxw5ly68dml8ceg7eawa7we8pjw8j8hn74n2djt2upmnq9th42p6lrke4yj3e0xqg3sdqm6lzksa53wd2550vrpkedks4fttnm

#### Preprod Data (`preprod-data.csv`)
- **Block**: 4070700
- **Transaction**: bf540a825d5d40af7435801ce6adcac010f3f9f29ae102aee8cff8007f68c3d4
- **Address**: addr_test1wzn5ee2qaqvly3hx7e0nk3vhm240n5muq3plhjcnvx9ppjgf62u6a

All data has been validated to work with all 8 stability test endpoints, with proper associations between blocks, transactions, and addresses.

## Endpoint Details

### Search Transactions Endpoints

The stability test includes two variants of the `/search/transactions` endpoint:

1. **Search Transactions by Hash**: Queries transactions using `transaction_identifier`. This is a fast, direct lookup by transaction hash.

2. **Search Transactions by Address**: Queries transactions using `account_identifier` with an address. This is more resource-intensive as it requires scanning transaction operations to find all transactions involving the specified address.

Both endpoints use the same API path (`/search/transactions`) but with different query parameters, allowing independent performance testing of each query pattern.
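
As a minimal illustration of the difference, the hedged sketch below sends both query shapes with Python's `requests` library, mirroring the `payload_search_transactions_by_hash` and `payload_search_transactions_by_address` generators in `stability_test.py`. The base URL is an assumption; the hash and address are the mainnet examples listed above:

```python
import requests

BASE_URL = "http://127.0.0.1:8082"  # assumption: adjust for your deployment
NET_ID = {"blockchain": "cardano", "network": "mainnet"}

# Variant 1: direct lookup by transaction hash (fast)
by_hash = requests.post(f"{BASE_URL}/search/transactions", json={
    "network_identifier": NET_ID,
    "transaction_identifier": {
        "hash": "3a954835b69ca01ff9cf3b30ce385d5d9ef0cea502bd0f2ad156684dfbaf325a"
    },
})

# Variant 2: scan for every transaction involving an address (resource-intensive)
by_address = requests.post(f"{BASE_URL}/search/transactions", json={
    "network_identifier": NET_ID,
    "account_identifier": {
        "address": "addr1qxw5ly68dml8ceg7eawa7we8pjw8j8hn74n2djt2upmnq9th42p6lrke4yj3e0xqg3sdqm6lzksa53wd2550vrpkedks4fttnm"
    },
})

print(by_hash.status_code, by_address.status_code)
```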

## Output

The script creates a timestamped directory containing:
2 changes: 2 additions & 0 deletions load-tests/data/preprod-data.csv
@@ -0,0 +1,2 @@
address,block_index,block_hash,transaction_size,relative_ttl,transaction_hash
addr_test1wzn5ee2qaqvly3hx7e0nk3vhm240n5muq3plhjcnvx9ppjgf62u6a,4070700,6b1b29d0533a86443140a88d3758f26fa9d4a8954363e78818b3235126ba933b,683,1000,bf540a825d5d40af7435801ce6adcac010f3f9f29ae102aee8cff8007f68c3d4
102 changes: 64 additions & 38 deletions load-tests/stability_test.py
@@ -73,7 +73,10 @@ def parse_args():
help='Cooldown period in seconds between endpoint tests')
parser.add_argument('--max-retries', dest='max_retries', type=int, default=2,
help='Maximum number of retries when an ab command fails')

parser.add_argument('--network', dest='network', default="mainnet",
choices=['mainnet', 'preprod'],
help='Network identifier for API requests')

# Endpoint selection
parser.add_argument('--endpoints', dest='selected_endpoints', type=str,
help='Comma-separated list of endpoint names or paths to test (e.g. "Network Status,Block" or "/account/balance,/block"). If not specified, all endpoints will be tested.')
@@ -104,6 +107,7 @@ def parse_args():
VERBOSE = args.verbose
COOLDOWN_PERIOD = args.cooldown
MAX_RETRIES = args.max_retries
NETWORK_ID = args.network

# Global logger variable
logger = None
@@ -200,7 +204,6 @@ def parse_ab_output(ab_stdout: str):
requests_per_sec = 0.0
mean_time = 0.0
non_2xx_responses = 0
failed_requests = 0

# Parse each metric
for line in ab_stdout.splitlines():
@@ -234,13 +237,8 @@
parts = line.split()
if len(parts) >= 3:
non_2xx_responses = int(parts[2])
# Parse Failed requests
elif "Failed requests:" in line:
parts = line.split()
if len(parts) >= 3:
failed_requests = int(parts[2])

return p95, p99, complete_requests, requests_per_sec, mean_time, non_2xx_responses, failed_requests
return p95, p99, complete_requests, requests_per_sec, mean_time, non_2xx_responses

###############################################################################
# PAYLOAD GENERATORS
@@ -252,14 +250,14 @@ def payload_network_status(*_):
"""
/network/status does not really need CSV data.
"""
return dedent("""\
{
"network_identifier": {
return dedent(f"""\
{{
"network_identifier": {{
"blockchain": "cardano",
"network": "mainnet"
},
"metadata": {}
}
"network": "{NETWORK_ID}"
}},
"metadata": {{}}
}}
""")

def payload_account_balance(address, *_):
@@ -270,7 +268,7 @@ def payload_account_balance(address, *_):
{{
"network_identifier": {{
"blockchain": "cardano",
"network": "mainnet"
"network": "{NETWORK_ID}"
}},
"account_identifier": {{
"address": "{address}"
@@ -286,7 +284,7 @@ def payload_account_coins(address, *_):
{{
"network_identifier": {{
"blockchain": "cardano",
"network": "mainnet"
"network": "{NETWORK_ID}"
}},
"account_identifier": {{
"address": "{address}"
@@ -303,7 +301,7 @@ def payload_block(_addr, block_index, block_hash, *_):
{{
"network_identifier": {{
"blockchain": "cardano",
"network": "mainnet"
"network": "{NETWORK_ID}"
}},
"block_identifier": {{
"index": {block_index},
@@ -320,7 +318,7 @@ def payload_block_transaction(_addr, block_index, block_hash, _tx_size, _ttl, transaction_hash):
{{
"network_identifier": {{
"blockchain": "cardano",
"network": "mainnet"
"network": "{NETWORK_ID}"
}},
"block_identifier": {{
"index": {block_index},
@@ -332,22 +330,38 @@
}}
""")

def payload_search_transactions(_addr, _block_index, _block_hash, _tx_size, _ttl, transaction_hash):
def payload_search_transactions_by_hash(_addr, _block_index, _block_hash, _tx_size, _ttl, transaction_hash):
"""
/search/transactions requires transaction_hash.
"""
return dedent(f"""\
{{
"network_identifier": {{
"blockchain": "cardano",
"network": "mainnet"
"network": "{NETWORK_ID}"
}},
"transaction_identifier": {{
"hash": "{transaction_hash}"
}}
}}
""")

def payload_search_transactions_by_address(address, *_):
"""
/search/transactions with account_identifier (address-based query).
"""
return dedent(f"""\
{{
"network_identifier": {{
"blockchain": "cardano",
"network": "{NETWORK_ID}"
}},
"account_identifier": {{
"address": "{address}"
}}
}}
""")

def payload_construction_metadata(_addr, _block_index, _block_hash, transaction_size, relative_ttl, _tx_hash):
"""
/construction/metadata requires transaction_size, relative_ttl
@@ -356,7 +370,7 @@ def payload_construction_metadata(_addr, _block_index, _block_hash, transaction_size, relative_ttl, _tx_hash):
{{
"network_identifier": {{
"blockchain": "cardano",
"network": "mainnet"
"network": "{NETWORK_ID}"
}},
"options": {{
"transaction_size": {transaction_size},
@@ -368,15 +382,16 @@ def payload_construction_metadata(_addr, _block_index, _block_hash, transaction_size, relative_ttl, _tx_hash):
###############################################################################
# ENDPOINT DEFINITION
###############################################################################
# We'll define 7 endpoints with: (Name, Path, Payload Generator Function)
# We'll define 8 endpoints with: (Name, Path, Payload Generator Function)
ENDPOINTS = [
("Network Status", "/network/status", payload_network_status),
("Account Balance", "/account/balance", payload_account_balance),
("Account Coins", "/account/coins", payload_account_coins),
("Block", "/block", payload_block),
("Block Transaction", "/block/transaction", payload_block_transaction),
("Search Transactions", "/search/transactions", payload_search_transactions),
("Construction Metadata","/construction/metadata", payload_construction_metadata),
("Network Status", "/network/status", payload_network_status),
("Account Balance", "/account/balance", payload_account_balance),
("Account Coins", "/account/coins", payload_account_coins),
("Block", "/block", payload_block),
("Block Transaction", "/block/transaction", payload_block_transaction),
("Search Transactions by Hash", "/search/transactions", payload_search_transactions_by_hash),
("Search Transactions by Address", "/search/transactions", payload_search_transactions_by_address),
("Construction Metadata", "/construction/metadata", payload_construction_metadata),
]

###############################################################################
@@ -493,7 +508,13 @@ def test_endpoint(endpoint_name, endpoint_path, payload_func, csv_row):
# Example CSV columns:
# address, block_index, block_hash, transaction_size, relative_ttl, transaction_hash
#
# Adjust if your CSV has different columns or order.
# Validate CSV structure
if len(csv_row) != 6:
logger.error(f"Invalid CSV format for endpoint {endpoint_name}.")
logger.error(f"Expected 6 columns (address, block_index, block_hash, transaction_size, relative_ttl, transaction_hash)")
logger.error(f"Got {len(csv_row)} columns: {csv_row}")
sys.exit(1)

address, block_index, block_hash, transaction_size, relative_ttl, transaction_hash = csv_row

# Generate JSON payload
@@ -546,9 +567,13 @@ def test_endpoint(endpoint_name, endpoint_path, payload_func, csv_row):
if VERBOSE:
# Format each line with box borders
if line_stripped:
# Fixed width approach
# Truncate long lines to fit box width
max_content_width = box_width - 4 # 2 for borders, 2 for padding
if len(line_stripped) > max_content_width:
line_stripped = line_stripped[:max_content_width - 3] + "..."
content = "│ " + line_stripped
logger.debug(content + " " * (box_width - len(content) - 1) + "│")
padding = " " * (box_width - len(content) - 1)
logger.debug(content + padding + "│")
else:
logger.debug("│" + " " * (box_width - 2) + "│")
proc.stdout.close()
@@ -615,7 +640,7 @@ def test_endpoint(endpoint_name, endpoint_path, payload_func, csv_row):
break

# Parse p95, p99 and additional metrics from the captured stdout
p95, p99, complete_requests, requests_per_sec, mean_time, non_2xx_responses, failed_requests = parse_ab_output(ab_output)
p95, p99, complete_requests, requests_per_sec, mean_time, non_2xx_responses = parse_ab_output(ab_output)

# Calculate error rate as a percentage
error_rate = 0.0
@@ -748,8 +773,9 @@ def main():
logger.debug(f"Data row: {', '.join(rows[0]) if rows else 'No data available'}")
logger.debug(f"{'-' * 80}")

# For demonstration, pick the *first* row only.
# If you want to test multiple rows, you can loop here or adapt logic.
# Use only the first CSV row for consistency across all endpoints and concurrency levels.
# This ensures that performance comparisons are based on the same data characteristics.
# All endpoints will be tested with identical input data to measure their relative performance.
if not rows:
# Use logger.error and exit
logger.error("No CSV data after skipping header.")
@@ -911,8 +937,8 @@
logger.warning("=" * 80)

current_endpoint = None
if 'ep_name' in locals() and 'c' in locals():
current_endpoint = f"{ep_name} at concurrency level {c}"
if 'ep_name' in locals():
current_endpoint = ep_name

if current_endpoint:
logger.warning(f"Test was interrupted while testing: {current_endpoint}")