Commit Graph

60 Commits

Author         SHA1        Message  Date
hiyouga        b7ca6c8dc1  fix #5048  2024-08-05 23:48:19 +08:00
hiyouga        5665062ca0  tiny fix  2024-07-22 21:10:15 +08:00
hoshi-hiyouga  26082fc6c9  fix #4917  2024-07-22 11:28:31 +08:00
hiyouga        29ebcd75d5  fix up  2024-07-15 01:04:56 +08:00
hiyouga        d3c01552e0  tiny fix  2024-07-14 10:56:45 +08:00
hiyouga        2f6af73da2  fix gemma2 attention  2024-07-13 23:33:45 +08:00
hiyouga        0c699de39d  tiny fix  2024-07-04 03:47:05 +08:00
hiyouga        6fd6aa4530  fix packing for eager/sdpa attn  2024-07-04 01:52:43 +08:00
hoshi-hiyouga  87d9b2d005  Merge pull request #4224 from chuan298/main  2024-07-04 01:18:54 +08:00
                           Implement efficient packing without cross-contamination attention
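The merge above (#4224) introduces packing without cross-contamination: when several short sequences are packed into one training row, tokens from different sequences must not attend to each other. A minimal sketch of the idea, using a hypothetical helper (not the repository's actual code) that builds a block-diagonal attention mask with NumPy:

```python
import numpy as np

def block_diagonal_mask(seq_lens):
    """Build a boolean attention mask for a packed row.

    Each packed sequence gets its own True block on the diagonal, so
    tokens can only attend within their own sequence -- no
    cross-contamination between packed neighbors.
    """
    total = sum(seq_lens)
    mask = np.zeros((total, total), dtype=bool)
    start = 0
    for n in seq_lens:
        mask[start:start + n, start:start + n] = True
        start += n
    return mask

# Two sequences of lengths 2 and 3 packed into one row of 5 tokens:
m = block_diagonal_mask([2, 3])
```

Here `m[0, 1]` is True (both tokens belong to the first sequence) while `m[0, 2]` is False (tokens from different packed sequences). A causal mask would additionally zero out the upper triangle of each block.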
hiyouga        cce7083024  update packing  2024-07-04 01:10:55 +08:00
hoshi-hiyouga  a36e8f2dd5  Update packing.py  2024-07-03 23:36:01 +08:00
hiyouga        c346f79f99  update func name  2024-07-03 23:29:33 +08:00
hiyouga        8a6a7b9c8a  update arg name  2024-07-03 23:23:24 +08:00
hiyouga        8b1172b910  tiny fix  2024-07-03 02:31:50 +08:00
ancv           e8e13b0942  move efficient_packing from data_args to model_args  2024-07-02 18:37:55 +07:00
hoshi-hiyouga  e8e6af2651  Merge branch 'main' into main  2024-07-01 21:01:09 +08:00
hiyouga        8c41a0aa6d  tiny fix  2024-07-01 03:55:20 +08:00
hiyouga        d74244d568  fix #4398 #4592  2024-06-30 21:28:51 +08:00
hiyouga        2f4b89ace1  loose gemma2 attention  2024-06-29 01:42:14 +08:00
hiyouga        4d35e218b1  bf16 by default, gemma2 attns  2024-06-28 06:00:26 +08:00
                           Gemma2 finetuning cannot work until https://github.com/huggingface/transformers/pull/31674 is merged
hiyouga        96a5044394  add quant checks  2024-06-27 01:12:25 +08:00
hiyouga        ad144c2265  support HQQ/EETQ #4113  2024-06-27 00:29:42 +08:00
hiyouga        addca926de  improve autogptq integration  2024-06-26 22:11:44 +08:00
hiyouga        1e9d0aa1e4  fix #4432  2024-06-25 02:34:04 +08:00
hiyouga        fca893d73c  fix #4410  2024-06-24 22:34:31 +08:00
stceum         3ed063f281  Bug fix: `off` is parsed as False in YAML files; changed to `disabled` to avoid this  2024-06-24 20:39:31 +08:00
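The `off`→`disabled` fix above works around a YAML 1.1 quirk: parsers such as PyYAML resolve the bare scalars off/on/no/yes to booleans, so a config value written as `off` reaches the program as `False` rather than the string "off". A simplified, self-contained mimic of that resolution (the function name is illustrative, not the project's code, and real YAML 1.1 matching is limited to specific capitalizations):

```python
# YAML 1.1 treats these bare scalars as booleans, which is why a config
# value of `off` arrives as False instead of the string "off".
YAML_11_BOOLS = {
    "y": True, "yes": True, "on": True, "true": True,
    "n": False, "no": False, "off": False, "false": False,
}

def resolve_scalar(raw):
    """Simplified, case-insensitive mimic of YAML 1.1 implicit typing."""
    return YAML_11_BOOLS.get(raw.lower(), raw)

print(resolve_scalar("off"))       # False
print(resolve_scalar("disabled"))  # 'disabled' -- stays a plain string
```

Renaming the sentinel value to `disabled` sidesteps implicit typing entirely, since `disabled` has no special meaning in YAML.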
ancv           770f75dc83  move configure_packing to llamafactory.model.patcher and fix constants  2024-06-21 00:45:06 +07:00
hiyouga        8d4f5093cf  tiny fix  2024-06-20 22:56:05 +08:00
hiyouga        3b040e8e0f  update patcher  2024-06-19 21:27:00 +08:00
hiyouga        4bd77d8563  fix #4357  2024-06-18 22:42:45 +08:00
hiyouga        e2665e71c7  fix #4326  2024-06-17 18:17:48 +08:00
ancv           238f5c3d99  update packing with sdpa and eager attention mode  2024-06-16 02:25:47 +07:00
hiyouga        8c1046d78a  support pissa  2024-06-16 01:08:12 +08:00
hiyouga        38b6b0f52e  tiny fix  2024-06-16 01:06:41 +08:00
ancv           04315c3d92  remove some unused params  2024-06-15 23:00:55 +07:00
hiyouga        d87108daa6  add license  2024-06-15 17:54:33 +08:00
hiyouga        b27269bd2b  add test cases  2024-06-15 04:05:54 +08:00
hiyouga        2ed8270112  clean code  2024-06-13 01:58:16 +08:00
ancv           b2c367bc61  implement efficient packing without cross-contamination attention  2024-06-12 11:56:01 +07:00
hiyouga        cca6f35108  fix deepspeed version  2024-06-11 16:52:36 +08:00
hiyouga        89f2bd8c8c  fix #4198  2024-06-11 15:38:38 +08:00
hiyouga        3f24337a8a  tiny fix  2024-06-11 01:04:16 +08:00
hiyouga        a793e8456b  fix #4160  2024-06-11 00:37:17 +08:00
                           The split heads should be concatenated in dim=2
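The note on #4160 concerns how split attention heads are recombined: with activations laid out as (batch, seq_len, num_heads, head_dim), per-head slices must be concatenated back along dim=2, the head axis. A NumPy sketch with assumed illustrative shapes (not the repository's code, which operates on PyTorch tensors):

```python
import numpy as np

batch, seq_len, num_heads, head_dim = 2, 4, 8, 16

# One (batch, seq_len, 1, head_dim) slice per attention head.
heads = [np.zeros((batch, seq_len, 1, head_dim)) for _ in range(num_heads)]

# Concatenating along dim=2 (the head axis) restores the full
# (batch, seq_len, num_heads, head_dim) layout.
merged = np.concatenate(heads, axis=2)
print(merged.shape)  # (2, 4, 8, 16)

# Concatenating along the wrong axis would silently produce a different
# layout, e.g. axis=1 yields (2, 32, 1, 16) -- shapes still "work", so
# the error shows up only as wrong model outputs.
```

Because a wrong-axis concatenation often produces a tensor whose total size still matches, an explicit shape assertion after the merge is a cheap safeguard.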
hiyouga        c907d81667  fix #2666  2024-06-10 21:24:15 +08:00
hiyouga        54cd743ebf  reorganize adapter code  2024-06-08 00:47:23 +08:00
hoshi-hiyouga  cfd62283a9  fix #4139  2024-06-08 00:45:02 +08:00
hiyouga        06e5d136a4  add resume args in webui  2024-06-08 00:22:16 +08:00
hiyouga        74f96efef9  rename files  2024-06-07 00:09:06 +08:00
hiyouga        451b6693c0  fix torch gc  2024-06-06 20:30:25 +08:00
hiyouga        a12a506c3d  support train from scratch #4033 #4075  2024-06-06 02:43:19 +08:00