c27afa296b | hiyouga | 2024-05-13 18:24:35 +08:00 | fix #3702
f7c8eddbc5 | hoshi-hiyouga | 2024-05-13 18:05:50 +08:00 | Merge pull request #3655 from Tendo33/main
    1. Change the name of the is_fastapi_available function; 2. add logging of incoming requests when deploying with vLLM
d12b8f866a | hiyouga | 2024-05-13 16:51:20 +08:00 | support Yi 1.5
b2bf7f5724 | Tendo33 | 2024-05-13 09:40:33 +08:00 | ruff check scripts src tests --fix
17cd57f914 | Sun Jinfeng | 2024-05-13 09:29:58 +08:00 | Merge branch 'hiyouga:main' into main
482d412dd9 | hiyouga | 2024-05-12 01:28:51 +08:00 | lint
4777efe517 | hiyouga | 2024-05-12 01:25:16 +08:00 | fix #3658
58c522cd5c | hiyouga | 2024-05-12 01:10:30 +08:00 | remove checksum and fix ui args
d06d56661b | hoshi-hiyouga | 2024-05-12 00:49:00 +08:00 | Merge pull request #3654 from betapeanut/main
    Remove redundant environment variable usage
56857770f8 | hiyouga | 2024-05-12 00:03:59 +08:00 | fix #3674
b033232aea | hiyouga | 2024-05-12 00:02:49 +08:00 | fix llava config
5da097f406 | hoshi-hiyouga | 2024-05-11 23:59:08 +08:00 | Merge pull request #3651 from BUAADreamer/main
    add some mllm features and try to incorporate the Chinese-LLaVA-Med project
5bfa8e4667 | hoshi-hiyouga | 2024-05-11 23:58:47 +08:00 | Update loader.py
bb2e6b0ea3 | hoshi-hiyouga | 2024-05-11 23:57:05 +08:00 | Update model_args.py
708aa5e098 | hoshi-hiyouga | 2024-05-11 23:56:40 +08:00 | Update patcher.py
5f72439a1d | hoshi-hiyouga | 2024-05-11 23:55:59 +08:00 | Update tuner.py
13851fb045 | hoshi-hiyouga | 2024-05-11 23:54:53 +08:00 | Update tuner.py
7944cbc576 | BUAADreamer | 2024-05-11 13:11:10 +08:00 | Merge branch 'main' of https://github.com/BUAADreamer/LLaMA-Factory
7be7972f28 | BUAADreamer | 2024-05-11 13:11:00 +08:00 | add full-parameter finetuning of mllm
b5c5c315a5 | kkkl | 2024-05-11 00:22:40 +08:00 | Update constants.py
    Fix the download issue of the Phi3 model
508d474754 | BUAADreamer | 2024-05-10 20:34:41 +08:00 | Merge branch 'hiyouga:main' into main
75aec4cf8e | hiyouga | 2024-05-09 16:52:27 +08:00 | resolve python 3.8 package
fd2e6dec58 | Tendo33 | 2024-05-09 14:28:01 +08:00 | 1. Change the name of the is_fastapi_available function; 2. add logging of incoming requests when deploying with vLLM
8b997e32fb | BUAADreamer | 2024-05-09 14:05:19 +08:00 | add push processor to hub
83f2f0de1d | BUAADreamer | 2024-05-09 13:45:43 +08:00 | Merge branch 'hiyouga:main' into main
3c11157a49 | cocktailpeanut | 2024-05-09 01:33:20 -04:00 | yet another removal of unnecessary environment variables
425b9d6166 | cocktailpeanut | 2024-05-09 01:32:00 -04:00 | more removal of unnecessary environment variables
b783673e0a | cocktailpeanut | 2024-05-09 01:26:15 -04:00 | remove unnecessary environment variable usage
ef33856380 | BUAADreamer | 2024-05-08 22:50:42 +08:00 | add mllm export
d9cdddd19c | hiyouga | 2024-05-08 17:12:56 +08:00 | fix #3625
48ee46dac1 | hiyouga | 2024-05-08 17:10:03 +08:00 | add llama3 chinese chat
10ab83f4c4 | hiyouga | 2024-05-08 16:37:54 +08:00 | add deepseek moe 236B
0ca1d1967d | BUAADreamer | 2024-05-08 10:36:36 +08:00 | modify export model
0f8f7d3b90 | hiyouga | 2024-05-07 19:03:35 +08:00 | fix #3560
b0888262e3 | hiyouga | 2024-05-07 17:50:27 +08:00 | fix #3602
09f3ef1de4 | hiyouga | 2024-05-07 00:41:04 +08:00 | fix stop param
bcf7ec5ceb | hoshi-hiyouga | 2024-05-07 00:37:49 +08:00 | Merge pull request #3527 from zhaonx/dev
    add support for vllm api stop parameter
17d0005b8c | hoshi-hiyouga | 2024-05-07 00:37:05 +08:00 | Update vllm_engine.py
f32eefae3d | hoshi-hiyouga | 2024-05-07 00:28:16 +08:00 | Update generating_args.py
7ae7ae64f0 | hoshi-hiyouga | 2024-05-07 00:27:56 +08:00 | Update generating_args.py
a153039380 | hiyouga | 2024-05-06 23:33:06 +08:00 | fix gradio args
34d33e2257 | hiyouga | 2024-05-06 21:47:00 +08:00 | update docs
80645751bc | zhaonx96 | 2024-05-06 10:10:00 +08:00 | add stop parameter in chat.py
1abd55dd59 | zhaonx96 | 2024-05-06 10:09:00 +08:00 | Merge branch 'main' of https://github.com/zhaonx/LLaMA-Factory into dev
bd095eeb73 | hiyouga | 2024-05-05 02:44:35 +08:00 | add version and help to cli
af596988b1 | hiyouga | 2024-05-05 00:17:54 +08:00 | update webui
e984ba3167 | hiyouga | 2024-05-04 16:13:52 +08:00 | remove empty stream response
941924fdbd | hiyouga | 2024-05-04 16:11:18 +08:00 | fix async stream api response
ed8f8be752 | hiyouga | 2024-05-04 15:59:15 +08:00 | update api and support abort eval in webui
9d2ce57345 | hiyouga | 2024-05-04 00:43:02 +08:00 | update readme and webui launch