
Commit b7a2574

[https://nvbugs/5568991][test] Remove Phi-3 models (#9066)

Signed-off-by: yufeiwu-nv <230315618+yufeiwu-nv@users.noreply.github.com>

1 parent 96132b4 · commit b7a2574

File tree

3 files changed (+3, −14 lines)

tests/integration/defs/perf/test_perf.py

Lines changed: 0 additions & 2 deletions

@@ -125,8 +125,6 @@
     "mamba_2.8b": "mamba/mamba-2.8b-hf",
     "gpt_20b": "gpt-neox-20b",
     "gpt_350m_moe": "gpt2-medium",
-    "phi_3_mini_4k_instruct": "Phi-3/Phi-3-mini-4k-instruct",
-    "phi_3_mini_128k_instruct": "Phi-3/Phi-3-mini-128k-instruct",
     "phi_4_mini_instruct": "Phi-4-mini-instruct",
     "phi_4_multimodal_instruct": "multimodals/Phi-4-multimodal-instruct",
     "phi_4_multimodal_instruct_image": "multimodals/Phi-4-multimodal-instruct",
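For context, the hunk above trims entries from what appears to be a mapping of perf-test model labels to checkpoint paths. A minimal sketch of that lookup, assuming hypothetical names (`MODEL_PATH_DICT` and `resolve_model_path` are illustrative, not the actual identifiers in test_perf.py):

```python
# Illustrative sketch only: names are assumptions, not the real code.
MODEL_PATH_DICT = {
    "mamba_2.8b": "mamba/mamba-2.8b-hf",
    "gpt_20b": "gpt-neox-20b",
    "gpt_350m_moe": "gpt2-medium",
    # The two phi_3_mini_* entries deleted by this commit are absent,
    # so those labels no longer resolve.
    "phi_4_mini_instruct": "Phi-4-mini-instruct",
}

def resolve_model_path(label: str) -> str:
    """Map a perf-test model label to its checkpoint directory."""
    try:
        return MODEL_PATH_DICT[label]
    except KeyError:
        raise KeyError(f"unknown perf model label: {label!r}")
```

After this change, any remaining test ID that still references a `phi_3_mini_*` label would fail fast at lookup time, which is why the YAML test lists below are cleaned up in the same commit.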

tests/integration/test_lists/qa/llm_perf_core.yml

Lines changed: 1 addition & 4 deletions

@@ -185,10 +185,7 @@ llm_perf_core:
   #mixtral_8x7b_v0.1_fp8 pytorch backend
   - perf/test_perf.py::test_perf[mixtral_8x7b_v0.1_instruct_fp8-bench-pytorch-float8-input_output_len:128,128-gpus:2]
   - perf/test_perf.py::test_perf[mixtral_8x7b_v0.1_instruct_fp8-bench-pytorch-float8-input_output_len:512,32-gpus:2]
-  #phi_3_mini_128k_instruct
-  #pytorch backend
-  - perf/test_perf.py::test_perf[phi_3_mini_128k_instruct-bench-pytorch-float16-maxbs:128-input_output_len:1000,1000-tp:2]
-  - perf/test_perf.py::test_perf[phi_3_mini_128k_instruct-bench-pytorch-float16-maxbs:128-input_output_len:500,2000-tp:2]
+

 - condition:
     terms:
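The bracketed test IDs in these lists encode a model label followed by hyphen-separated configuration fields (bare flags like `bench`/`pytorch`, and `key:value` options like `maxbs:128` or `tp:2`). A speculative parser sketch for that convention, not taken from the repo; a plain split on `-` works here because the model labels use underscores:

```python
def parse_perf_test_id(test_id: str) -> dict:
    """Split a perf-test ID into model label, bare flags, and key:value options.

    Illustrative only: assumes the model label itself contains no hyphens,
    which holds for the labels shown in this commit's diffs.
    """
    fields = test_id.split("-")
    spec = {"model": fields[0], "flags": [], "options": {}}
    for field in fields[1:]:
        if ":" in field:
            key, value = field.split(":", 1)
            spec["options"][key] = value
        else:
            spec["flags"].append(field)
    return spec

spec = parse_perf_test_id(
    "phi_3_mini_128k_instruct-bench-pytorch-float16"
    "-maxbs:128-input_output_len:1000,1000-tp:2")
```

Under this reading, the removed `llm_perf_core.yml` entries are the pytorch-backend, float16, tp=2 runs of `phi_3_mini_128k_instruct`.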

tests/integration/test_lists/qa/llm_perf_nim.yml

Lines changed: 2 additions & 8 deletions

@@ -123,10 +123,7 @@ llm_perf_nim:
   #trt backend
   - perf/test_perf.py::test_perf[mistral_7b_v0.1-bench-float16-maxbs:256-input_output_len:1000,1000-quant:fp8]
   - perf/test_perf.py::test_perf[mistral_7b_v0.1-bench-float16-maxbs:256-input_output_len:500,2000-quant:fp8]
-  #phi_3_mini_4k_instruct
-  #trt backend
-  - perf/test_perf.py::test_perf[phi_3_mini_4k_instruct-bench-float16-maxbs:128-input_output_len:1000,1000-quant:fp8]
-  - perf/test_perf.py::test_perf[phi_3_mini_4k_instruct-bench-float16-maxbs:64-input_output_len:500,2000-quant:fp8]
+

 - condition:
     terms:
@@ -214,10 +211,7 @@ llm_perf_nim:
   # torch backend
   - perf/test_perf.py::test_perf[mistral_7b_v0.1-bench-pytorch-float16-input_output_len:128,128]
   - perf/test_perf.py::test_perf[llama_v3.2_1b-bench-pytorch-bfloat16-input_output_len:128,128-gpus:2]
-  #phi_3_mini_128k_instruct
-  #trt backend
-  - perf/test_perf.py::test_perf[phi_3_mini_128k_instruct-bench-float16-maxbs:128-input_output_len:1000,1000-quant:fp8-tp:2]
-  - perf/test_perf.py::test_perf[phi_3_mini_128k_instruct-bench-float16-maxbs:128-input_output_len:500,2000-quant:fp8-tp:2]
+

 - condition:
     terms:
