
Commit 237ae00

fix autotuning warmup length (#1028)
Co-authored-by: none <none>
1 parent e13e2a1 commit 237ae00

File tree

1 file changed (+1, -1)

lightllm/common/basemodel/basemodel.py

Lines changed: 1 addition & 1 deletion
@@ -736,7 +736,7 @@ def _autotune_warmup(self):
 
         torch.distributed.barrier()
 
-        warmup_lengths = [1, 8, 16, 64, 128, 256, 1024, 2048, 4096]
+        warmup_lengths = [1, 8, 16, 32, 64, 100, 128, 256, 1024, 2048, 4096]
 
         if self.batch_max_tokens not in warmup_lengths:
             warmup_lengths.append(self.batch_max_tokens)
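
For context, a minimal self-contained sketch of what the changed hunk does. This is not lightllm's actual code: build_warmup_lengths is a hypothetical helper standing in for the method-local logic, and batch_max_tokens stands in for self.batch_max_tokens. The commit adds 32 and 100 to the warmup length list, and the configured maximum batch token count is still appended when it is not already present.

# Minimal sketch, assuming a hypothetical helper build_warmup_lengths;
# batch_max_tokens stands in for self.batch_max_tokens in _autotune_warmup.
def build_warmup_lengths(batch_max_tokens: int) -> list:
    # Commit 237ae00 adds 32 and 100 so intermediate token counts are also
    # covered during autotuning warmup.
    warmup_lengths = [1, 8, 16, 32, 64, 100, 128, 256, 1024, 2048, 4096]
    # Always warm up the configured maximum batch token count as well.
    if batch_max_tokens not in warmup_lengths:
        warmup_lengths.append(batch_max_tokens)
    return warmup_lengths

print(build_warmup_lengths(8192))  # ends with ..., 4096, 8192
print(build_warmup_lengths(256))   # 256 already present, no duplicate added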

0 commit comments
