Merged
9 changes: 9 additions & 0 deletions paddlenlp/trainer/trainer.py
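The change handles evaluation when the dataloader is built without drop_last: the final batch can be smaller than micro_batch_size * accumulate_steps, so prediction_step temporarily reconfigures the pipeline schedule to one sample per micro-batch and restores the original values after the forward pass.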
@@ -3267,6 +3267,14 @@
         else:
             labels = None
             inputs = inputs.pop("input_ids")
+        # consider no drop_last case
+        model_config_backup = model.micro_batch_size, model.accumulate_steps
+        if isinstance(inputs, tuple):
+            actual_batch_size = inputs[0].shape[0]
+        else:
+            actual_batch_size = inputs.shape[0]
+        model.micro_batch_size = 1
+        model.accumulate_steps = actual_batch_size
 
         with paddle.no_grad():
             if has_labels:
@@ -3276,6 +3284,7 @@
                 loss = loss.mean().detach()
             else:
                 raise ValueError("pipeline mode eval need label!")
+        model.micro_batch_size, model.accumulate_steps = model_config_backup
 
         return (loss, None, labels)
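
For context, here is a minimal sketch of the same backup/adjust/restore idea, written as a context manager instead of inline code in prediction_step: the ragged final batch is split into per-sample micro-batches (micro_batch_size=1, accumulate_steps=actual_batch_size) so it always divides evenly, and the training-time configuration is put back afterwards. PipelineModelStub and ragged_batch are hypothetical names used only for illustration, not PaddleNLP or Paddle APIs; only the two attributes the patch touches are modelled.

from contextlib import contextmanager


class PipelineModelStub:
    """Stand-in for a pipeline-parallel model; only the two attributes the patch touches."""

    def __init__(self, micro_batch_size, accumulate_steps):
        self.micro_batch_size = micro_batch_size
        self.accumulate_steps = accumulate_steps


@contextmanager
def ragged_batch(model, actual_batch_size):
    # Back up the configured pipeline schedule, then run the incoming batch
    # as `actual_batch_size` micro-batches of size 1 so a smaller final batch
    # (drop_last=False) still divides evenly.
    backup = model.micro_batch_size, model.accumulate_steps
    model.micro_batch_size = 1
    model.accumulate_steps = actual_batch_size
    try:
        yield model
    finally:
        # Restore the training-time configuration.
        model.micro_batch_size, model.accumulate_steps = backup


if __name__ == "__main__":
    model = PipelineModelStub(micro_batch_size=4, accumulate_steps=8)
    last_batch_size = 13  # e.g. dataset size not divisible by the global batch size
    with ragged_batch(model, last_batch_size):
        assert (model.micro_batch_size, model.accumulate_steps) == (1, 13)
    assert (model.micro_batch_size, model.accumulate_steps) == (4, 8)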
