
Commit 84d7f5a

[UT] Fix ut test (#4472)
### What this PR does / why we need it?

### Does this PR introduce _any_ user-facing change?

### How was this patch tested?
- vLLM version: v0.11.2
- vLLM main: https://github.com/vllm-project/vllm/commit/v0.11.2

Signed-off-by: hfadzxy <starmoon_zhang@163.com>
1 parent d252e36 commit 84d7f5a

1 file changed: 4 additions, 0 deletions


tests/ut/attention/test_mla_v1.py

Lines changed: 4 additions & 0 deletions
@@ -456,6 +456,8 @@ def setUp(self):
     @patch("vllm_ascend.attention.mla_v1.get_ascend_config")
     def test_build_prefix_no_cache_metadata(self, mock_get_ascend_config,
                                             mock_dcp_world_size):
+        if not torch.npu.is_available():
+            self.skipTest("NPU not available, skipping NPU-dependent tests")
         mock_dcp_world_size.return_value = 1

         common_attn_metadata = AscendCommonAttentionMetadata(
@@ -506,6 +508,8 @@ def test_build_prefix_no_cache_metadata(self, mock_get_ascend_config,
     @patch("vllm_ascend.attention.mla_v1.get_ascend_config")
     def test_build_chunked_prefix_metadata(self, mock_get_ascend_config,
                                            mock_dcp_world_size):
+        if not torch.npu.is_available():
+            self.skipTest("NPU not available, skipping NPU-dependent tests")
         mock_dcp_world_size.return_value = 1

         common_attn_metadata = AscendCommonAttentionMetadata(
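
For context, the two added guards follow the standard unittest pattern for hardware-dependent tests: probe for the accelerator at the start of the test body and call self.skipTest so the case is reported as skipped rather than failed on hosts without the device. The sketch below is an illustrative, self-contained version of that pattern and is not part of this commit; the helper name npu_available and the getattr fallback are assumptions for machines where the torch_npu plugin (which provides torch.npu) is not installed.

import unittest

import torch


def npu_available() -> bool:
    # torch.npu exists only when the torch_npu plugin is installed
    # (Ascend hosts); treat a missing attribute the same as "no NPU".
    npu = getattr(torch, "npu", None)
    return npu is not None and npu.is_available()


class TestNpuDependentCase(unittest.TestCase):

    def test_requires_npu(self):
        # Mirror of the guard added in test_mla_v1.py: skip, do not fail,
        # when no NPU device is available in the test environment.
        if not npu_available():
            self.skipTest("NPU not available, skipping NPU-dependent tests")
        # NPU-dependent setup and assertions would follow here.
        self.assertTrue(torch.npu.is_available())


if __name__ == "__main__":
    unittest.main()

On a CI runner without an Ascend NPU the case above is reported as skipped, which is exactly the behavior this commit restores for the two metadata-builder tests.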
