hiyouga | 5d956e2a51 | fix chat engine, update webui | 2024-03-08 03:01:53 +08:00
hiyouga | 0ac6b40a47 | update galore args | 2024-03-08 01:17:32 +08:00
hiyouga | 33a4c24a8a | fix galore | 2024-03-08 00:44:51 +08:00
hiyouga | 28f7862188 | support galore | 2024-03-07 22:41:36 +08:00
hiyouga | d07ad5cc1c | support vllm | 2024-03-07 20:26:31 +08:00
hiyouga | 3016e65657 | fix version checking | 2024-03-06 14:51:51 +08:00
hiyouga | cfefacaa37 | support DoRA, AWQ, AQLM #2512 | 2024-02-28 19:53:28 +08:00
hiyouga | 3cc10a01a7 | fix #2532 | 2024-02-21 21:55:14 +08:00
hiyouga | 9aeb404a94 | support lora for llama pro | 2024-02-21 02:17:22 +08:00
hiyouga | ba998c67ab | update webui | 2024-02-19 16:49:58 +08:00
hiyouga | 7924ffc55d | support llama pro #2338, add rslora | 2024-02-15 02:27:36 +08:00
hiyouga | 91d09a01ac | add option to disable version check | 2024-02-10 22:31:23 +08:00
hiyouga | 88a1bc9773 | lint | 2024-02-07 01:10:04 +08:00
hiyouga | 6545c02790 | add hint for freeze #2412 | 2024-02-03 23:38:56 +08:00
hiyouga | 638234ceee | format style | 2024-01-20 20:15:56 +08:00
hiyouga | b6ec112beb | add bf16 lora option | 2024-01-19 16:29:03 +08:00
hiyouga | a83fb6d3ff | fix #2195 | 2024-01-16 23:53:50 +08:00
hiyouga | 6629087e12 | update loader | 2023-12-24 19:10:23 +08:00
hiyouga | 7aad0b889d | support unsloth | 2023-12-23 00:14:33 +08:00
hiyouga | ba69378841 | fix param type | 2023-12-21 17:33:01 +08:00
hiyouga | b87c74289d | support dpo-ftx | 2023-12-16 19:21:41 +08:00
hiyouga | 3524aa1e58 | support quantization in export model | 2023-12-15 23:44:50 +08:00
hiyouga | 0716f5e470 | refactor adapter hparam | 2023-12-15 20:53:11 +08:00
hiyouga | 3a8a50d4d4 | remove loftq | 2023-12-13 01:53:46 +08:00
hiyouga | 6219dfbd93 | support loftq | 2023-12-12 22:47:06 +08:00
hiyouga | 8cace77808 | update readme | 2023-12-12 11:44:30 +08:00
hiyouga | 7df4f3ab20 | implement rm server #1543 | 2023-12-03 20:52:54 +08:00
hiyouga | 475a3fa0f4 | fix #1659 | 2023-11-28 20:52:28 +08:00
hiyouga | 859a6ea942 | support export size setting | 2023-11-26 18:34:09 +08:00
hiyouga | 5021062493 | update ppo trainer | 2023-11-20 21:39:15 +08:00
Yuchen Han | b24635d22b | Update finetuning_args.py | 2023-11-17 00:15:51 -08:00
hiyouga | 1817ffc86f | fix rlhf callback | 2023-11-16 03:26:19 +08:00
hiyouga | 856522a3df | fix bug in PPO training | 2023-11-16 02:32:54 +08:00
hiyouga | ce78303600 | support full-parameter PPO | 2023-11-16 02:08:04 +08:00
hiyouga | 4907452d95 | support multiple modules in freeze training #1514 | 2023-11-15 17:08:18 +08:00
hiyouga | 442aefb925 | refactor evaluation, upgrade trl to 074 | 2023-11-13 22:20:35 +08:00
hiyouga | 415bca900e | tiny fix | 2023-11-09 17:20:49 +08:00
Yanqing | 3684dffa14 | Update finetuning_args.py (update the lora_target names for chatglm/falcon/bloom) | 2023-11-09 17:04:40 +08:00
hiyouga | 01260d9754 | fix ppo train and dpo eval | 2023-11-07 22:48:51 +08:00
hiyouga | b2a60905f3 | upgrade peft, fix #1088 #1411 | 2023-11-07 16:13:36 +08:00
hiyouga | 7b4acf7265 | reimplement neftune | 2023-10-22 16:15:08 +08:00
anvie | 57fb40aa04 | add NEFTune optimization | 2023-10-21 13:24:10 +07:00
hiyouga | ea82f8a82a | refactor export, fix #1190 | 2023-10-15 16:01:48 +08:00
hiyouga | 11bd271364 | fix ppo args | 2023-10-11 23:40:50 +08:00
hiyouga | 620efe1d8d | refactor finetuning Args | 2023-09-27 22:28:06 +08:00
hiyouga | 4318347d3f | update template | 2023-08-22 19:46:09 +08:00
hiyouga | 53e33418d0 | support ppo score norm (trl 0.5.1.dev required) | 2023-08-18 12:02:42 +08:00
hiyouga | 9020524418 | fix PPO trainer #551, update readme | 2023-08-18 11:43:10 +08:00
hiyouga | a48cb0d474 | Release v0.1.6 | 2023-08-11 23:25:57 +08:00
hiyouga | 3ec4351cfd | support DPO training (2305.18290) | 2023-08-11 03:02:53 +08:00