⚡️ Speed up method JiraDataSource.get_issue_type_screen_scheme_project_associations by 27%
#554
+80
−32
📄 27% (0.27x) speedup for `JiraDataSource.get_issue_type_screen_scheme_project_associations` in `backend/python/app/sources/external/jira/jira.py`
⏱️ Runtime: 1.39 milliseconds → 1.09 milliseconds (best of 250 runs)
📝 Explanation and details
The optimized code achieves a 26% runtime improvement (1.39ms → 1.09ms) through several targeted micro-optimizations focused on the hottest execution paths:
Key Optimizations
1. Eliminated Unnecessary URL Formatting
- The `_safe_format_url()` call is skipped, since `_path` is always empty in this endpoint
- Direct string concatenation (`self.base_url + rel_path`) instead of template formatting saves ~17.6% of the original execution time
- A fast path in `_safe_format_url()` for empty params optimizes other callers
2. Optimized Dictionary Conversions
- Fast path for `projectId`: since this endpoint always receives `list[int]` for `projectId`, the comma-joining logic is inlined, avoiding generic `_serialize_value()` calls
3. Conditional Query Building
- Query parameters are only built when present; `startAt` and `maxResults` are None in 304/306 calls in the profiler run (a sketch of the resulting request-building pattern follows this list)
4. Reduced Function Call Overhead
- Cached `self._client` in a local variable to avoid repeated attribute access
- Avoided redundant `dict()` constructor calls and `_as_str_dict()` calls for known-empty dictionaries
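The sketch below is an illustrative reconstruction of this request-building pattern, not the actual code in `jira.py`; the endpoint path, the helper name `build_request_parts`, and the exact parameter handling are assumptions based on the description above.

# Hypothetical sketch of the optimized request-building path (assumed names and path).
from typing import Dict, Optional, Tuple

def build_request_parts(
    base_url: str,
    projectId: list,
    startAt: Optional[int] = None,
    maxResults: Optional[int] = None,
) -> Tuple[str, Dict[str, str]]:
    rel_path = "/rest/api/3/issuetypescreenscheme/project"  # assumed endpoint path
    # Direct concatenation: _path is always empty for this endpoint, so the
    # generic template-formatting step can be skipped entirely.
    url = base_url + rel_path
    # Inlined comma-join for projectId, avoiding a generic per-value serializer call.
    query = {"projectId": ",".join(str(pid) for pid in projectId)}
    # Conditional query building: only add paging params when they are actually set.
    if startAt is not None:
        query["startAt"] = str(startAt)
    if maxResults is not None:
        query["maxResults"] = str(maxResults)
    return url, query

For example, `build_request_parts("http://example.com", [1, 2, 3])` returns `("http://example.com/rest/api/3/issuetypescreenscheme/project", {"projectId": "1,2,3"})`, with no query entries for the unset paging parameters.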
Performance Impact
The line profiler shows the biggest gains in:
- `_as_str_dict()` calls, optimized with fast paths (an illustrative sketch follows)
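As a rough illustration of this kind of fast path (a sketch, not the actual `_as_str_dict()` helper from `jira.py`), a string-conversion routine can short-circuit the common cases before falling back to generic conversion:

# Hypothetical fast-path string-dict conversion (assumed behavior, illustrative only).
def as_str_dict_fast(params: dict) -> dict:
    if not params:
        # Fast path: known-empty dict, skip looping and copying entirely.
        return {}
    out = {}
    for key, value in params.items():
        if isinstance(value, str):
            # Fast path: strings pass through without conversion.
            out[key] = value
        else:
            # Generic fallback; the real helper may handle lists, bools,
            # and None differently.
            out[key] = str(value)
    return out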
Test Case Benefits
The optimizations are particularly effective in the concurrent and throughput-oriented test cases below. While throughput remains constant at 76,500 ops/sec, the 26% runtime reduction means lower CPU utilization and better resource efficiency for applications making frequent Jira API calls.
✅ Correctness verification report:
🌀 Generated Regression Tests and Runtime
import asyncio # used to run async functions
import pytest # used for our unit tests
from app.sources.external.jira.jira import JiraDataSource
# --- Minimal stubs for required classes (so tests are self-contained) ---
# These are minimal implementations to allow the tests to run.
# They simulate the behavior of the actual HTTPRequest, HTTPResponse, and JiraClient.
class HTTPRequest:
    def __init__(self, method, url, headers, path_params, query_params, body):
self.method = method
self.url = url
self.headers = headers
self.path_params = path_params
self.query_params = query_params
self.body = body
class HTTPResponse:
    def __init__(self, content, status_code=200):
self.content = content
self.status_code = status_code
class DummyHTTPClient:
    def __init__(self, base_url="http://example.com"):
        self._base_url = base_url
        self._executed_requests = []
    def get_base_url(self):
        # Required by JiraDataSource during initialization.
        return self._base_url
class JiraClient:
    def __init__(self, client):
        self.client = client
    def get_client(self):
        # JiraDataSource obtains the underlying HTTP client via get_client().
        return self.client
from app.sources.external.jira.jira import JiraDataSource
# ---- Basic Test Cases ----
@pytest.mark.asyncio
async def test_basic_single_project_id():
"""Test basic functionality with a single projectId."""
client = JiraClient(DummyHTTPClient())
ds = JiraDataSource(client)
response = await ds.get_issue_type_screen_scheme_project_associations([123])
@pytest.mark.asyncio
async def test_basic_multiple_project_ids():
"""Test with multiple projectIds."""
client = JiraClient(DummyHTTPClient())
ds = JiraDataSource(client)
response = await ds.get_issue_type_screen_scheme_project_associations([1, 2, 3])
@pytest.mark.asyncio
async def test_basic_with_startAt_and_maxResults():
"""Test with startAt and maxResults provided."""
client = JiraClient(DummyHTTPClient())
ds = JiraDataSource(client)
response = await ds.get_issue_type_screen_scheme_project_associations([42], startAt=5, maxResults=10)
@pytest.mark.asyncio
async def test_basic_with_headers():
"""Test with custom headers provided."""
client = JiraClient(DummyHTTPClient())
ds = JiraDataSource(client)
response = await ds.get_issue_type_screen_scheme_project_associations([99], headers={"Authorization": "Bearer TOKEN"})
# ---- Edge Test Cases ----
@pytest.mark.asyncio
async def test_edge_empty_project_id_list():
"""Test with empty projectId list (should serialize to empty string)."""
client = JiraClient(DummyHTTPClient())
ds = JiraDataSource(client)
response = await ds.get_issue_type_screen_scheme_project_associations([])
@pytest.mark.asyncio
async def test_edge_project_id_zero_and_negative():
"""Test with zero and negative project IDs."""
client = JiraClient(DummyHTTPClient())
ds = JiraDataSource(client)
response = await ds.get_issue_type_screen_scheme_project_associations([0, -1])
@pytest.mark.asyncio
async def test_edge_headers_none_and_empty_dict():
"""Test with headers=None and headers={} (should result in empty headers)."""
client = JiraClient(DummyHTTPClient())
ds = JiraDataSource(client)
response_none = await ds.get_issue_type_screen_scheme_project_associations([1], headers=None)
response_empty = await ds.get_issue_type_screen_scheme_project_associations([1], headers={})
@pytest.mark.asyncio
async def test_edge_invalid_client_raises():
"""Test that ValueError is raised if client is None."""
class BadClient:
def get_client(self):
return None
with pytest.raises(ValueError, match="HTTP client is not initialized"):
JiraDataSource(BadClient())
@pytest.mark.asyncio
async def test_edge_missing_get_base_url_raises():
"""Test that ValueError is raised if client lacks get_base_url."""
class BadHTTPClient:
pass
class BadClient:
def get_client(self):
return BadHTTPClient()
with pytest.raises(ValueError, match="HTTP client does not have get_base_url method"):
JiraDataSource(BadClient())
@pytest.mark.asyncio
async def test_edge_concurrent_execution():
"""Test concurrent execution of multiple requests."""
client = JiraClient(DummyHTTPClient())
ds = JiraDataSource(client)
coros = [
ds.get_issue_type_screen_scheme_project_associations([i])
for i in range(5)
]
responses = await asyncio.gather(*coros)
for idx, resp in enumerate(responses):
pass
# ---- Large Scale Test Cases ----
@pytest.mark.asyncio
async def test_large_scale_many_project_ids():
"""Test with a large list of project IDs (100 elements)."""
client = JiraClient(DummyHTTPClient())
ds = JiraDataSource(client)
project_ids = list(range(100))
response = await ds.get_issue_type_screen_scheme_project_associations(project_ids)
expected = ",".join(str(i) for i in project_ids)
@pytest.mark.asyncio
async def test_large_scale_many_concurrent_requests():
"""Test many concurrent requests (50)."""
client = JiraClient(DummyHTTPClient())
ds = JiraDataSource(client)
coros = [
ds.get_issue_type_screen_scheme_project_associations([i, i+1])
for i in range(0, 100, 2)
]
    responses = await asyncio.gather(*coros)
for i, resp in enumerate(responses):
expected = f"{i2},{i*2+1}"
# ---- Throughput Test Cases ----
@pytest.mark.asyncio
async def test_JiraDataSource_get_issue_type_screen_scheme_project_associations_throughput_small_load():
"""Throughput test: small load (10 requests)."""
client = JiraClient(DummyHTTPClient())
ds = JiraDataSource(client)
coros = [
ds.get_issue_type_screen_scheme_project_associations([i])
for i in range(10)
]
responses = await asyncio.gather(*coros)
for i, resp in enumerate(responses):
pass
@pytest.mark.asyncio
async def test_JiraDataSource_get_issue_type_screen_scheme_project_associations_throughput_medium_load():
"""Throughput test: medium load (50 requests)."""
client = JiraClient(DummyHTTPClient())
ds = JiraDataSource(client)
coros = [
ds.get_issue_type_screen_scheme_project_associations([i, i+1, i+2])
for i in range(0, 50, 3)
]
responses = await asyncio.gather(*coros)
for idx, resp in enumerate(responses):
start = idx * 3
expected = f"{start},{start+1},{start+2}"
@pytest.mark.asyncio
async def test_JiraDataSource_get_issue_type_screen_scheme_project_associations_throughput_large_load():
"""Throughput test: large load (100 requests)."""
client = JiraClient(DummyHTTPClient())
ds = JiraDataSource(client)
coros = [
ds.get_issue_type_screen_scheme_project_associations([i, i+1])
for i in range(0, 100, 2)
]
    responses = await asyncio.gather(*coros)
for idx, resp in enumerate(responses):
expected = f"{idx2},{idx*2+1}"
# ---- Async Context Manager Edge Case ----
@pytest.mark.asyncio
async def test_async_context_manager_usage():
    """Test that HTTPClient can be used as an async context manager."""
    class ContextHTTPClient(DummyHTTPClient):
        async def __aenter__(self):
            return self
        async def __aexit__(self, exc_type, exc_val, exc_tb):
            return None
    async with ContextHTTPClient() as http_client:
        client = JiraClient(http_client)
        ds = JiraDataSource(client)
        response = await ds.get_issue_type_screen_scheme_project_associations([1])
# codeflash_output is used to check that the output of the original code is the same as that of the optimized code.
#------------------------------------------------
import asyncio # Used to run async functions
import pytest # Used for our unit tests
from app.sources.external.jira.jira import JiraDataSource
# Minimal mock classes to simulate HTTPRequest/HTTPResponse
class HTTPRequest:
    def __init__(self, method, url, headers, path_params, query_params, body):
self.method = method
self.url = url
self.headers = headers
self.path_params = path_params
self.query_params = query_params
self.body = body
class HTTPResponse:
    def __init__(self, data):
self.data = data
# Minimal mock JiraRESTClientViaToken
class JiraRESTClientViaToken:
    def __init__(self, base_url, token, token_type="Bearer"):
        self.base_url = base_url
        self.token = token
        self.token_type = token_type
    def get_base_url(self):
        # Required by JiraDataSource during initialization.
        return self.base_url
class JiraClient:
    def __init__(self, client):
        self.client = client
    def get_client(self):
        # JiraDataSource obtains the underlying HTTP client via get_client().
        return self.client
from app.sources.external.jira.jira import JiraDataSource
# ------------------ UNIT TESTS ------------------
# 1. Basic Test Cases
@pytest.mark.asyncio
async def test_basic_single_project_id():
"""Test with a single project ID and no optional parameters."""
client = JiraClient(JiraRESTClientViaToken("http://test.com/", "token"))
ds = JiraDataSource(client)
response = await ds.get_issue_type_screen_scheme_project_associations([123])
@pytest.mark.asyncio
async def test_basic_multiple_project_ids():
"""Test with multiple project IDs."""
client = JiraClient(JiraRESTClientViaToken("http://test.com/", "token"))
ds = JiraDataSource(client)
response = await ds.get_issue_type_screen_scheme_project_associations([1, 2, 3])
@pytest.mark.asyncio
async def test_basic_with_startAt_and_maxResults():
"""Test with startAt and maxResults provided."""
client = JiraClient(JiraRESTClientViaToken("http://test.com/", "token"))
ds = JiraDataSource(client)
response = await ds.get_issue_type_screen_scheme_project_associations([5], startAt=10, maxResults=50)
@pytest.mark.asyncio
async def test_basic_with_headers():
"""Test with custom headers."""
client = JiraClient(JiraRESTClientViaToken("http://test.com/", "token"))
ds = JiraDataSource(client)
response = await ds.get_issue_type_screen_scheme_project_associations([7], headers={"X-Test": "true"})
# 2. Edge Test Cases
@pytest.mark.asyncio
async def test_edge_empty_project_id_list():
"""Test with empty projectId list."""
client = JiraClient(JiraRESTClientViaToken("http://test.com/", "token"))
ds = JiraDataSource(client)
response = await ds.get_issue_type_screen_scheme_project_associations([])
@pytest.mark.asyncio
async def test_edge_none_client_raises():
"""Test that ValueError is raised if client is None."""
class DummyClient:
def get_client(self):
return None
with pytest.raises(ValueError, match="HTTP client is not initialized"):
JiraDataSource(DummyClient())
@pytest.mark.asyncio
async def test_edge_missing_get_base_url_raises():
"""Test that ValueError is raised if client lacks get_base_url."""
class DummyClient:
def get_client(self):
class NoBaseUrl:
pass
return NoBaseUrl()
with pytest.raises(ValueError, match="HTTP client does not have get_base_url method"):
JiraDataSource(DummyClient())
@pytest.mark.asyncio
async def test_edge_concurrent_execution():
"""Test concurrent execution with different project IDs."""
client = JiraClient(JiraRESTClientViaToken("http://test.com/", "token"))
ds = JiraDataSource(client)
coros = [
ds.get_issue_type_screen_scheme_project_associations([i])
for i in range(5)
]
responses = await asyncio.gather(*coros)
for idx, resp in enumerate(responses):
pass
@pytest.mark.asyncio
async def test_edge_project_id_with_bool_and_str():
"""Test projectId list with bools and strings (should serialize correctly)."""
client = JiraClient(JiraRESTClientViaToken("http://test.com/", "token"))
ds = JiraDataSource(client)
response = await ds.get_issue_type_screen_scheme_project_associations([1, True, "3"])
# 3. Large Scale Test Cases
@pytest.mark.asyncio
async def test_large_scale_many_project_ids():
"""Test with a large number of project IDs."""
client = JiraClient(JiraRESTClientViaToken("http://test.com/", "token"))
ds = JiraDataSource(client)
project_ids = list(range(100))
response = await ds.get_issue_type_screen_scheme_project_associations(project_ids)
# Should serialize all project IDs as comma-separated string
expected = ",".join(str(i) for i in range(100))
@pytest.mark.asyncio
async def test_large_scale_concurrent_calls():
"""Test many concurrent calls with different project IDs."""
client = JiraClient(JiraRESTClientViaToken("http://test.com/", "token"))
ds = JiraDataSource(client)
coros = [
ds.get_issue_type_screen_scheme_project_associations([i, i+1])
for i in range(0, 50, 2)
]
    responses = await asyncio.gather(*coros)
for idx, resp in enumerate(responses):
expected = f"{idx2},{idx*2+1}"
# 4. Throughput Test Cases
@pytest.mark.asyncio
async def test_JiraDataSource_get_issue_type_screen_scheme_project_associations_throughput_small_load():
"""Throughput test with small load (10 concurrent requests)."""
client = JiraClient(JiraRESTClientViaToken("http://test.com/", "token"))
ds = JiraDataSource(client)
coros = [
ds.get_issue_type_screen_scheme_project_associations([i])
for i in range(10)
]
responses = await asyncio.gather(*coros)
for idx, resp in enumerate(responses):
pass
@pytest.mark.asyncio
async def test_JiraDataSource_get_issue_type_screen_scheme_project_associations_throughput_medium_load():
"""Throughput test with medium load (50 concurrent requests)."""
client = JiraClient(JiraRESTClientViaToken("http://test.com/", "token"))
ds = JiraDataSource(client)
coros = [
ds.get_issue_type_screen_scheme_project_associations([i, i+1, i+2])
for i in range(0, 50, 3)
]
    responses = await asyncio.gather(*coros)
for idx, resp in enumerate(responses):
        ids = [idx*3, idx*3+1, idx*3+2]
expected = ",".join(str(i) for i in ids)
@pytest.mark.asyncio
async def test_JiraDataSource_get_issue_type_screen_scheme_project_associations_throughput_large_load():
"""Throughput test with large load (100 concurrent requests)."""
client = JiraClient(JiraRESTClientViaToken("http://test.com/", "token"))
ds = JiraDataSource(client)
coros = [
ds.get_issue_type_screen_scheme_project_associations([i])
for i in range(100)
]
responses = await asyncio.gather(*coros)
for idx, resp in enumerate(responses):
pass
# codeflash_output is used to check that the output of the original code is the same as that of the optimized code.
To edit these changes, run `git checkout codeflash/optimize-JiraDataSource.get_issue_type_screen_scheme_project_associations-mhsb8btm` and push.