Commit Graph

975 Commits

Author SHA1 Message Date
hoshi-hiyouga
5da097f406 Merge pull request #3651 from BUAADreamer/main
add some mllm features and try to incorporate Chinese-LLaVA-Med project
2024-05-11 23:59:08 +08:00
hoshi-hiyouga
5bfa8e4667 Update loader.py 2024-05-11 23:58:47 +08:00
hoshi-hiyouga
bb2e6b0ea3 Update model_args.py 2024-05-11 23:57:05 +08:00
hoshi-hiyouga
708aa5e098 Update patcher.py 2024-05-11 23:56:40 +08:00
hoshi-hiyouga
5f72439a1d Update tuner.py 2024-05-11 23:55:59 +08:00
hoshi-hiyouga
13851fb045 Update tuner.py 2024-05-11 23:54:53 +08:00
BUAADreamer
7944cbc576 Merge branch 'main' of https://github.com/BUAADreamer/LLaMA-Factory 2024-05-11 13:11:10 +08:00
BUAADreamer
7be7972f28 add full parameter finetuning of mllm 2024-05-11 13:11:00 +08:00
kkkl
b5c5c315a5 Update constants.py
Fix the download issue of the Phi3 model
2024-05-11 00:22:40 +08:00
BUAADreamer
508d474754 Merge branch 'hiyouga:main' into main 2024-05-10 20:34:41 +08:00
hiyouga
75aec4cf8e resolve python 3.8 package 2024-05-09 16:52:27 +08:00
BUAADreamer
8b997e32fb add push processor to hub 2024-05-09 14:05:19 +08:00
BUAADreamer
83f2f0de1d Merge branch 'hiyouga:main' into main 2024-05-09 13:45:43 +08:00
BUAADreamer
ef33856380 add mllm export 2024-05-08 22:50:42 +08:00
hiyouga
d9cdddd19c fix #3625 2024-05-08 17:12:56 +08:00
hiyouga
48ee46dac1 add llama3 chinese chat 2024-05-08 17:10:03 +08:00
hiyouga
10ab83f4c4 add deepseek moe 236B 2024-05-08 16:37:54 +08:00
BUAADreamer
0ca1d1967d modify export model 2024-05-08 10:36:36 +08:00
hiyouga
0f8f7d3b90 fix #3560 2024-05-07 19:03:35 +08:00
hiyouga
b0888262e3 fix #3602 2024-05-07 17:50:27 +08:00
hiyouga
09f3ef1de4 fix stop param 2024-05-07 00:41:04 +08:00
hoshi-hiyouga
bcf7ec5ceb Merge pull request #3527 from zhaonx/dev
"add support for vllm api stop parameter"
2024-05-07 00:37:49 +08:00
hoshi-hiyouga
17d0005b8c Update vllm_engine.py 2024-05-07 00:37:05 +08:00
hoshi-hiyouga
f32eefae3d Update generating_args.py 2024-05-07 00:28:16 +08:00
hoshi-hiyouga
7ae7ae64f0 Update generating_args.py 2024-05-07 00:27:56 +08:00
hiyouga
a153039380 fix gradio args 2024-05-06 23:33:06 +08:00
hiyouga
34d33e2257 update docs 2024-05-06 21:47:00 +08:00
zhaonx96
80645751bc "add stop parameter in chat.py" 2024-05-06 10:10:00 +08:00
zhaonx96
1abd55dd59 Merge branch 'main' of https://github.com/zhaonx/LLaMA-Factory into dev 2024-05-06 10:09:00 +08:00
hiyouga
bd095eeb73 add version and help to cli 2024-05-05 02:44:35 +08:00
hiyouga
af596988b1 update webui 2024-05-05 00:17:54 +08:00
hiyouga
e984ba3167 remove empty stream response 2024-05-04 16:13:52 +08:00
hiyouga
941924fdbd fix async stream api response 2024-05-04 16:11:18 +08:00
hiyouga
ed8f8be752 update api and support abort eval in webui 2024-05-04 15:59:15 +08:00
hiyouga
9d2ce57345 update readme and webui launch 2024-05-04 00:43:02 +08:00
hiyouga
24cc93ab15 fix eval in webui 2024-05-04 00:19:19 +08:00
hiyouga
510e64ee70 fix webui resume 2024-05-03 23:15:19 +08:00
hiyouga
3010154adb fix slow op in dpo/orpo trainer 2024-05-03 23:06:52 +08:00
hiyouga
9585838ebe fix callback log multigpu #3559 2024-05-03 21:24:27 +08:00
hiyouga
5e6f808e3c enable tqdm in webui 2024-05-03 04:42:50 +08:00
hiyouga
17d2e5147e fix gen_args 2024-05-03 04:24:50 +08:00
hiyouga
530f6b49bb fix colab gradio 2024-05-03 03:54:46 +08:00
hiyouga
245fe47ece update webui and add CLIs 2024-05-03 02:58:23 +08:00
hiyouga
9433c8c215 fix badam configs 2024-05-02 02:47:04 +08:00
hoshi-hiyouga
dcd53cb89a Update train.py 2024-05-02 02:21:27 +08:00
zhaonx
42edc81585 "add support for vllm api stop parameter" 2024-04-30 17:17:09 +08:00
codingma
26f7170393 support BAdam in WebUI 2024-04-28 11:31:34 +08:00
hiyouga
b3e33c703e fix llava rlhf 2024-04-28 03:01:49 +08:00
hiyouga
4dbbce21d5 add models to 0.7.0 2024-04-28 01:50:30 +08:00
hiyouga
168f56683a release v0.7.0 2024-04-26 23:18:00 +08:00