838eb87a96  BUAADreamer     2024-04-25 08:20:41 +08:00  merge model part to the text stream
7ffee90799  BUAADreamer     2024-04-25 00:34:22 +08:00  remove conflicts
cfb485eddf  BUAADreamer     2024-04-25 00:22:43 +08:00  add llava and instructblip
297fb8ead3  hiyouga         2024-04-24 23:39:31 +08:00  support new special token #3420
667ce08b27  hiyouga         2024-04-24 05:02:18 +08:00  remove redundant code
b1deb0a0b9  hiyouga         2024-04-24 04:46:53 +08:00  support unsloth generate
aa2b79eb23  hiyouga         2024-04-24 03:02:23 +08:00  refactor patcher
4dcb11eab7  BUAADreamer     2024-04-23 18:45:43 +08:00  add multimodal LLM BLIP-2 and InstructBLIP
f58425ab45  hiyouga         2024-04-21 18:11:10 +08:00  fix mod stuff
620add7b9f  Marco           2024-04-18 20:31:24 +02:00  Added Mixture of Depths
750cdf2e74  hoshi-hiyouga   2024-04-16 17:28:12 +08:00  Update adapter.py
06c8908d3f  Jonery          2024-04-15 23:15:27 +08:00  Feature BAdam
efc345c4b0  hiyouga         2024-04-15 15:32:58 +08:00  fix #3273
9d4c949461  hiyouga         2024-04-11 20:08:51 +08:00  release v0.6.2
98bc97d8d2  hoshi-hiyouga   2024-04-10 00:57:51 +08:00  Update adapter.py
2111b586b6  hoshi-hiyouga   2024-04-10 00:57:30 +08:00  Update adapter.py
b5eefe5c4c  Erich Schubert  2024-04-09 17:53:40 +02:00  Pass additional_target to unsloth (fixes #3200)
72367307df  hiyouga         2024-03-13 23:32:51 +08:00  improve lora+ impl.
b9f87cdc11  hiyouga         2024-03-13 12:33:45 +08:00  fix #2802
33a4c24a8a  hiyouga         2024-03-08 00:44:51 +08:00  fix galore
cfefacaa37  hiyouga         2024-02-28 19:53:28 +08:00  support DoRA, AWQ, AQLM #2512
9aeb404a94  hiyouga         2024-02-21 02:17:22 +08:00  support lora for llama pro
22acab8aff  hiyouga         2024-02-15 19:07:47 +08:00  fix #2481
7924ffc55d  hiyouga         2024-02-15 02:27:36 +08:00  support llama pro #2338, add rslora
6545c02790  hiyouga         2024-02-03 23:38:56 +08:00  add hint for freeze #2412
638234ceee  hiyouga         2024-01-20 20:15:56 +08:00  format style
38af076a75  hiyouga         2024-01-20 19:25:22 +08:00  support longlora for main branch
b6ec112beb  hiyouga         2024-01-19 16:29:03 +08:00  add bf16 lora option
d9f1cae351  hiyouga         2024-01-18 09:54:23 +08:00  support function calling
33f2c0d4f8  hiyouga         2024-01-04 23:19:08 +08:00  fix #2081
7aad0b889d  hiyouga         2023-12-23 00:14:33 +08:00  support unsloth
0716f5e470  hiyouga         2023-12-15 20:53:11 +08:00  refactor adapter hparam
3a8a50d4d4  hiyouga         2023-12-13 01:53:46 +08:00  remove loftq
6219dfbd93  hiyouga         2023-12-12 22:47:06 +08:00  support loftq
d5b2c57a35  hiyouga         2023-12-12 18:33:06 +08:00  fix modelscope data hub
9ce1b0e2f2  hiyouga         2023-12-11 17:13:40 +08:00  use peft 0.7.0, fix #1561 #1764
f57445c7a0  hiyouga         2023-12-02 00:27:15 +08:00  fix gptq training
9ea9380145  hiyouga         2023-11-20 22:52:11 +08:00  support GPTQ tuning #729 #1481 #1545, fix chatglm template #1453 #1480 #1569
ff52b1779c  hiyouga         2023-11-16 14:25:11 +08:00  fix bug in freeze tuning
ce78303600  hiyouga         2023-11-16 02:08:04 +08:00  support full-parameter PPO
4907452d95  hiyouga         2023-11-15 17:08:18 +08:00  support multiple modules in freeze training #1514
4736344eb1  hiyouga         2023-11-15 16:29:09 +08:00  disentangle model from tuner and rename modules