Commit Graph

1305 Commits

Each entry below lists the author, followed by the commit SHA1, message, and date.
hoshi-hiyouga
a5b809516e Update loader.py 2024-07-15 00:50:06 +08:00
hoshi-hiyouga
3d39d74003 Update parser.py 2024-07-14 23:04:34 +08:00
codingma
76f3bbcfc0 1. add custom eval dataset support
2. merge load dataset and split dataset function
2024-07-05 15:52:10 +08:00
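The commit above merges dataset loading and splitting into a single step. A minimal sketch of that pattern using the Hugging Face `datasets` API; the helper name and arguments here are hypothetical and not LLaMA-Factory's actual implementation:

```python
from datasets import load_dataset


def load_and_split(path: str, eval_ratio: float = 0.1, seed: int = 42):
    """Load a JSON dataset and carve out an eval split in one call (hypothetical helper)."""
    dataset = load_dataset("json", data_files=path, split="train")
    if eval_ratio > 0:
        # train_test_split returns a DatasetDict with "train" and "test" keys
        splits = dataset.train_test_split(test_size=eval_ratio, seed=seed)
        return splits["train"], splits["test"]
    return dataset, None
```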
hiyouga
9f33f1edf5 fix processors 2024-07-05 08:33:22 +08:00
hiyouga
e43809bced fix #4683 2024-07-05 00:58:05 +08:00
hiyouga
ed232311e8 fix #4674 2024-07-05 00:41:03 +08:00
hiyouga
226a9e563f Merge branch 'main' of https://github.com/hiyouga/LLaMA-Factory 2024-07-04 14:23:37 +08:00
hiyouga
1e27e8c776 fix #4677 2024-07-04 14:22:07 +08:00
hzhaoy
738df47748 tiny fix 2024-07-04 10:20:28 +08:00
hiyouga
0c699de39d tiny fix 2024-07-04 03:47:05 +08:00
hiyouga
44747cebd2 tiny fix 2024-07-04 03:02:23 +08:00
hiyouga
b5d101e1bf fix data map for packing 2024-07-04 03:01:31 +08:00
hiyouga
6fd6aa4530 fix packing for eager/sdpa attn 2024-07-04 01:52:43 +08:00
hoshi-hiyouga
87d9b2d005 Merge pull request #4224 from chuan298/main
Implement efficient packing without cross-contamination attention
2024-07-04 01:18:54 +08:00
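Pull request #4224 packs several short examples into one sequence while preventing them from attending to one another. A minimal sketch of the underlying idea, a block-diagonal and causal attention mask built from per-example lengths; this illustrates the technique rather than reproducing the PR's code:

```python
import torch


def block_diagonal_mask(seq_lens: list[int]) -> torch.Tensor:
    """Boolean attention mask where tokens only attend within their own packed example."""
    total = sum(seq_lens)
    mask = torch.zeros(total, total, dtype=torch.bool)
    offset = 0
    for length in seq_lens:
        # Allow attention only inside this example's block.
        mask[offset : offset + length, offset : offset + length] = True
        offset += length
    # Combine with a causal mask so each token also only attends to earlier positions.
    causal = torch.tril(torch.ones(total, total)).bool()
    return mask & causal


# Example: three examples of lengths 3, 2, 4 packed into one sequence of length 9.
print(block_diagonal_mask([3, 2, 4]).int())
```

Eager and SDPA attention need such a mask materialized explicitly, which is presumably what the "fix packing for eager/sdpa attn" commit above addresses.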
hiyouga
cce7083024 update packing 2024-07-04 01:10:55 +08:00
hoshi-hiyouga
a36e8f2dd5 Update packing.py 2024-07-03 23:36:01 +08:00
hiyouga
c346f79f99 update func name 2024-07-03 23:29:33 +08:00
hiyouga
8a6a7b9c8a update arg name 2024-07-03 23:23:24 +08:00
hiyouga
575a02a23d update hparams 2024-07-03 23:18:58 +08:00
hiyouga
7f770f6895 update ui 2024-07-03 23:13:49 +08:00
hiyouga
8845e94f91 fix #4609
unwrap_model_for_generation(reward_model) is necessary for zero3 training
2024-07-03 19:45:51 +08:00
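The commit body notes that the reward model must be unwrapped for inference under DeepSpeed ZeRO-3, where parameters are sharded across ranks. A minimal sketch of the pattern with TRL's `unwrap_model_for_generation` context manager; the tiny classifier below is only a stand-in for a real reward model, and the full PPO setup is assumed:

```python
from accelerate import Accelerator
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from trl.models.utils import unwrap_model_for_generation

accelerator = Accelerator()
model_id = "distilbert-base-uncased"  # placeholder for an actual reward model
tokenizer = AutoTokenizer.from_pretrained(model_id)
reward_model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=1)
reward_model = accelerator.prepare(reward_model)

inputs = tokenizer(["a sample response to score"], return_tensors="pt").to(accelerator.device)

# Under ZeRO-3 the context manager gathers the sharded parameters so a full
# forward pass is possible; without it, each rank only holds a parameter shard.
with unwrap_model_for_generation(reward_model, accelerator) as unwrapped_reward_model:
    rewards = unwrapped_reward_model(**inputs).logits.squeeze(-1)
```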
hiyouga
8b1172b910 tiny fix 2024-07-03 02:31:50 +08:00
hiyouga
71cdf8956e tiny fix 2024-07-02 23:06:13 +08:00
hiyouga
821bb6660e remove rlhf support for chatglm2&3 2024-07-02 23:03:17 +08:00
hiyouga
c13ae2df19 upcast logits 2024-07-02 22:32:05 +08:00
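"upcast logits" refers to a common stability fix: compute the loss in float32 even when the model runs in bf16/fp16, because low-precision softmax and cross-entropy can lose signal. A minimal illustration of the pattern, not the repository's exact code:

```python
import torch
import torch.nn.functional as F

# Pretend logits produced by a bf16 model: (batch, seq_len, vocab_size).
logits = torch.randn(2, 8, 32000, dtype=torch.bfloat16)
labels = torch.randint(0, 32000, (2, 8))

# Upcast to float32 before the softmax/cross-entropy for numerical stability.
loss = F.cross_entropy(
    logits.float().view(-1, logits.size(-1)),
    labels.view(-1),
)
print(loss)
```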
hiyouga
c47ab6c072 improve rlhf 2024-07-02 22:23:08 +08:00
ancv
e8e13b0942 move efficient_packing from data_args to model_args 2024-07-02 18:37:55 +07:00
hoshi-hiyouga
4e4b3cc905 Merge pull request #4651 from hzhaoy/add-telechat-1b
Add TeleChat-1B
2024-07-02 17:56:43 +08:00
hzhaoy
57b7c00430 add TeleChat-1B 2024-07-02 17:49:04 +08:00
hiyouga
4c296001c4 fix ppo callbacks 2024-07-02 17:34:56 +08:00
hoshi-hiyouga
e8e6af2651 Merge branch 'main' into main 2024-07-01 21:01:09 +08:00
hiyouga
73280b7dc7 tiny fix 2024-07-01 05:43:17 +08:00
hiyouga
8c41a0aa6d tiny fix 2024-07-01 03:55:20 +08:00
hiyouga
1856a08e87 add eval acc 2024-07-01 03:51:20 +08:00
hiyouga
1771251ce3 fix #4402 #4617
Deprecate reserved_label_len arg
2024-07-01 01:19:27 +08:00
hiyouga
d74244d568 fix #4398 #4592 2024-06-30 21:28:51 +08:00
hiyouga
2f4b89ace1 loose gemma2 attention 2024-06-29 01:42:14 +08:00
hiyouga
4d35e218b1 bf16 by default, gemma2 attns
Gemma2 finetuning cannot work until merging https://github.com/huggingface/transformers/pull/31674
2024-06-28 06:00:26 +08:00
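When Gemma2 was released, its soft-capped attention was only reliable with the eager implementation in Transformers, which is what the "gemma2 attns" adjustment is about. A minimal sketch of loading Gemma2 in bf16 with eager attention; the model id is an example and the checkpoint is gated:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/gemma-2-9b"  # example id; assumes access to the gated checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,    # bf16 by default
    attn_implementation="eager",   # Gemma2's soft-capped attention initially required eager
)
```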
hiyouga
64f4337dac increase pissa_iter for stability 2024-06-28 03:18:54 +08:00
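PiSSA initializes LoRA adapters from an SVD of the base weights, and `pissa_iter` controls the number of iterations used by the fast, randomized SVD variant; more iterations give a more stable initialization at a small startup cost. A minimal sketch of how this maps onto PEFT's `init_lora_weights` option; the exact string format is my assumption about the PEFT API, not LLaMA-Factory's own wiring:

```python
from peft import LoraConfig

# "pissa_niter_16" requests fast PiSSA initialization with 16 subspace iterations.
lora_config = LoraConfig(
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    init_lora_weights="pissa_niter_16",
)
```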
hiyouga
6f63050e1b add Gemma2 models 2024-06-28 01:26:50 +08:00
hiyouga
8baf3b22b0 refactor pissa, improve llamaboard 2024-06-28 01:04:24 +08:00
hoshi-hiyouga
ef38daa0a4 Merge pull request #4580 from hzhaoy/bugfix-deepspeed-pissa
Fix bug when using pissa method with deepspeed
2024-06-28 00:46:51 +08:00
hiyouga
8ed6b367e2 fix #4549 2024-06-28 00:41:58 +08:00
hiyouga
e44a4f07f0 tiny fix 2024-06-27 20:14:48 +08:00
faddddeout
f6b62f0070 Exit the process with the subprocess's return code when utilizing the CLI 2024-06-27 09:58:00 +00:00
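This commit makes the CLI wrapper exit with the return code of the subprocess it launches, so failures propagate to shells and CI. A minimal sketch of the pattern; the command below is a placeholder, not the CLI's actual invocation:

```python
import subprocess
import sys

# Launch the real work in a subprocess (placeholder command that exits with code 3),
# then exit with the same return code so callers and CI can detect the failure.
process = subprocess.run([sys.executable, "-c", "raise SystemExit(3)"])
sys.exit(process.returncode)
```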
hzhaoy
677c86594e fix #4579 2024-06-27 13:49:57 +08:00
hiyouga
96a5044394 add quant checks 2024-06-27 01:12:25 +08:00
hiyouga
f17c9dfd84 tiny fix 2024-06-27 00:46:41 +08:00
hiyouga
29c710da3a tiny fix 2024-06-27 00:36:04 +08:00
hiyouga
ad144c2265 support HQQ/EETQ #4113 2024-06-27 00:29:42 +08:00
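HQQ and EETQ are on-the-fly quantization backends that Transformers exposes through `quantization_config`. A minimal sketch using `HqqConfig` (requires the `hqq` package); the argument names and the model id are assumptions based on my reading of the Transformers API, not this repository's quant checks:

```python
import torch
from transformers import AutoModelForCausalLM, HqqConfig

# 4-bit HQQ quantization applied while loading the weights; no pre-quantized checkpoint needed.
quant_config = HqqConfig(nbits=4, group_size=64)

model = AutoModelForCausalLM.from_pretrained(
    "facebook/opt-125m",  # small example model; any causal LM should work similarly
    quantization_config=quant_config,
    torch_dtype=torch.float16,
    device_map="auto",
)
```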