support GPTQ tuning #729 #1481 #1545, fix chatglm template #1453 #1480 #1569

Former-commit-id: 9ea9380145
This commit is contained in:
hiyouga
2023-11-20 22:52:11 +08:00
parent f06c4c8f7a
commit 4966bd7911
5 changed files with 43 additions and 4 deletions

@@ -324,7 +324,7 @@ CUDA_VISIBLE_DEVICES=0 python src/train_bash.py \
```
> [!WARNING]
-> Use `--per_device_train_batch_size=1` for LLaMA-2 models in fp16 training.
+> Use `--per_device_train_batch_size=1` for LLaMA-2 models in fp16 PPO training.
#### DPO Training
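
For readers viewing this hunk without the surrounding README, the warning applies to the PPO stage of `src/train_bash.py`. A minimal sketch of such a run is shown below; the flag names and placeholder paths are assumptions based on the README of this period and may differ between LLaMA-Factory versions.

```bash
# Sketch of an fp16 PPO run for a LLaMA-2 model; flag names and paths are
# illustrative and may differ between versions of the project.
CUDA_VISIBLE_DEVICES=0 python src/train_bash.py \
    --stage ppo \
    --model_name_or_path path_to_llama2_model \
    --do_train \
    --dataset alpaca_gpt4_en \
    --template default \
    --finetuning_type lora \
    --reward_model path_to_rm_checkpoint \
    --output_dir path_to_ppo_checkpoint \
    --per_device_train_batch_size 1 \
    --gradient_accumulation_steps 4 \
    --learning_rate 1e-5 \
    --num_train_epochs 1.0 \
    --fp16
```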