hiyouga | 223bbd9930 | resolve python 3.8 package | 2024-05-09 16:52:27 +08:00
  Former-commit-id: 5eee4ec7016846356715a4fa1ad58e3cbb1cac6e
Tendo33 | 9dadff90bb | 1. Rename the is_fastapi_available function; 2. Add logging of incoming requests when deploying with vLLM | 2024-05-09 14:28:01 +08:00
  Former-commit-id: 530d4f5d51c13c71d99de5fe2d23805b0aa875a2
BUAADreamer | 827a929f1d | add push processor to hub | 2024-05-09 14:05:19 +08:00
  Former-commit-id: 7a05a965311edfdfafa57af8342875860d341f27
BUAADreamer | 47892418ad | Merge branch 'hiyouga:main' into main | 2024-05-09 13:45:43 +08:00
  Former-commit-id: 1f3163509ecd05902ea216a905b4ca15ddd3696f
cocktailpeanut | 2aeae4b88b | yet another removal of unnecessary environment variables | 2024-05-09 01:33:20 -04:00
  Former-commit-id: a07726028f0287de28e4751672b27efe0efc6477
cocktailpeanut | c213f2a9a9 | more removal of unnecessary environment variables | 2024-05-09 01:32:00 -04:00
  Former-commit-id: 59ef1a6e0d81585a6c010143d05fcfae26d40c00
cocktailpeanut | 333f4a69bb | remove unnecessary environment variable usage | 2024-05-09 01:26:15 -04:00
  Former-commit-id: 4be1d832cb269a07987f5cab5d5f949e269087da
BUAADreamer | 172600d432 | add mllm export | 2024-05-08 22:50:42 +08:00
  Former-commit-id: ce4770d33f6761d3b1d60661efcb0be34a036154
hiyouga | 4ce4172c87 | fix #3625 | 2024-05-08 17:12:56 +08:00
  Former-commit-id: 8c0f5d1db29862277d84aa128b424b7d0f2b187f
hiyouga | 400ae144a4 | add llama3 chinese chat | 2024-05-08 17:10:03 +08:00
  Former-commit-id: ee3e5920f2f28567259693cb106e884a90cb02a2
hiyouga | 0a1b6ca5a7 | add deepseek moe 236B | 2024-05-08 16:37:54 +08:00
  Former-commit-id: 30c10e2dc41b5d64191a91ad2d61f3b5c440b1d5
BUAADreamer | 05ef89cfcc | modify export model | 2024-05-08 10:36:36 +08:00
  Former-commit-id: c7051edae4ce23f85daf204a2aaac134b1f29c3d
hiyouga | f6ac3796ca | fix #3560 | 2024-05-07 19:03:35 +08:00
  Former-commit-id: ea69cbe903a301df1bcc4b63cdc5bd4c6e3a8255
hiyouga | ebab655683 | fix #3602 | 2024-05-07 17:50:27 +08:00
  Former-commit-id: 1518b45490606ea200482da4737113c46985e8c5
hiyouga | e3b3a722de | fix stop param | 2024-05-07 00:41:04 +08:00
  Former-commit-id: f0a850c25211b72eddbb357c81679db9b0930d44
hoshi-hiyouga | b9e167e6ca | Merge pull request #3527 from zhaonx/dev: "add support for vllm api stop parameter" | 2024-05-07 00:37:49 +08:00
  Former-commit-id: e7d436403af6ac4c6a33cf36411098a0b0fefce2
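The "stop" parameter referenced in PR #3527 and the related commits above can be exercised from a client once the API server is running. A minimal sketch, assuming an OpenAI-compatible /v1/chat/completions route served on localhost:8000; the endpoint, port, and model name here are illustrative assumptions, not taken from the commits:

    # Hypothetical client call; generation halts when a stop string would be emitted.
    import requests

    payload = {
        "model": "test",  # placeholder model name
        "messages": [{"role": "user", "content": "Count from 1 to 10."}],
        "stop": ["7"],    # the parameter added by PR #3527
    }
    resp = requests.post("http://localhost:8000/v1/chat/completions", json=payload, timeout=60)
    print(resp.json()["choices"][0]["message"]["content"])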
hoshi-hiyouga | 1ebd1e50e7 | Update vllm_engine.py | 2024-05-07 00:37:05 +08:00
  Former-commit-id: fa2410de07150a82082ab5b88baf56aa891db870
hoshi-hiyouga | 14316f6583 | Update generating_args.py | 2024-05-07 00:28:16 +08:00
  Former-commit-id: 714957ba0159919a89fc1659a7a7b4b6bd82eead
hoshi-hiyouga | 8e4ab2f7d0 | Update generating_args.py | 2024-05-07 00:27:56 +08:00
  Former-commit-id: 7a9fb56786f4c40856211009656a983be1e42cb7
hiyouga | da2295f8c8 | fix gradio args | 2024-05-06 23:33:06 +08:00
  Former-commit-id: 7767c1ad4b2b638b558f941ba1f0d05d4a049507
hiyouga | 5c9da798b5 | update docs | 2024-05-06 21:47:00 +08:00
  Former-commit-id: a4a2e94241bea6f96590f6cb8ca8b5cddee1917e
zhouwei | 3d1b0e1864 | Improve Ascend 910A training efficiency roughly tenfold by fully utilizing the NPU through torch_npu, the PyTorch library optimized for NPUs | 2024-05-06 13:29:59 +08:00
  Former-commit-id: 90980b626d3408b3e2ee32a02456c20881318be7
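For context on the commit above: torch_npu plugs Ascend NPUs into PyTorch as an extra device type. A minimal sketch of device selection only, assuming a host with an Ascend NPU and the torch_npu package installed; it does not reproduce the actual changes in that commit:

    # Device selection only; the commit's real changes are not shown here.
    import torch
    import torch_npu  # importing registers the "npu" device type with PyTorch

    device = torch.device("npu:0" if torch.npu.is_available() else "cpu")
    model = torch.nn.Linear(16, 16).to(device)
    batch = torch.randn(4, 16, device=device)
    print(model(batch).sum().item())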
zhaonx96 | 45becd2a45 | "add stop parameter in chat.py" | 2024-05-06 10:10:00 +08:00
  Former-commit-id: e529bf5bc14c72558d26f73c42076eaa9684205c
zhaonx96 | 8f1197de7e | Merge branch 'main' of https://github.com/zhaonx/LLaMA-Factory into dev | 2024-05-06 10:09:00 +08:00
  Former-commit-id: ec1f834905e241277fdd3f764c70eede97e9ff40
hiyouga | 4674f3baa7 | add version and help to cli | 2024-05-05 02:44:35 +08:00
  Former-commit-id: f762f2215169b9fe55564d5600b758ddc66f9c9c
hiyouga | 7ef3788ff4 | update webui | 2024-05-05 00:17:54 +08:00
  Former-commit-id: 17a53d25cdadd2df70a8afa0488f75bbf1918b89
hiyouga | e9fe8815be | remove empty stream response | 2024-05-04 16:13:52 +08:00
  Former-commit-id: 070d0da928b1e974a094279a2782201016d2a3ab
hiyouga | 9381fecca7 | fix async stream api response | 2024-05-04 16:11:18 +08:00
  Former-commit-id: d70bbcae6513e50aa6094f2d98c4aa5c6641ea02
hiyouga | efa9140577 | update api and support abort eval in webui | 2024-05-04 15:59:15 +08:00
  Former-commit-id: 8661bed68812e9ded9439e8a821b1d7716bc797b
hiyouga | 37bcbf72b4 | update readme and webui launch | 2024-05-04 00:43:02 +08:00
  Former-commit-id: c66ffa57323ef6ea78a9b75ec5122d9ea25fd420
hiyouga | 182b974786 | fix eval in webui | 2024-05-04 00:19:19 +08:00
  Former-commit-id: 774ef2bf5823d68b9cc254a676f5adb4af533d75
hiyouga | 7a4a6a5522 | fix webui resume | 2024-05-03 23:15:19 +08:00
  Former-commit-id: c2f6582ddd365bb64b72e8057cc4ecd7884d2480
hiyouga | 2383e5440c | fix slow op in dpo/orpo trainer | 2024-05-03 23:06:52 +08:00
  Former-commit-id: 38cad0896ea0516de6d4b2759ec9d45ee67d339b
hiyouga | 1fea91736a | fix callback log multigpu #3559 | 2024-05-03 21:24:27 +08:00
  Former-commit-id: 1f105f1551b12675ca7d339ef5f91333f0371987
hiyouga | 09d9fb28f9 | enable tqdm in webui | 2024-05-03 04:42:50 +08:00
  Former-commit-id: 1737bff64799047a5b715fd979b4c038ae213bb3
hiyouga | 57c6eabf83 | fix gen_args | 2024-05-03 04:24:50 +08:00
  Former-commit-id: c3e2f4f07b7fb3b1d7d2b44451660f082a467aed
hiyouga | 33d440b577 | fix colab gradio | 2024-05-03 03:54:46 +08:00
  Former-commit-id: 26179a29d3400d1fea155e325a79473a8bc12f04
hiyouga | ce8200ad98 | update webui and add CLIs | 2024-05-03 02:58:23 +08:00
  Former-commit-id: 1368dda22ab875914c9dd86ee5146a4f6a4736ad
hiyouga | dd0b85580e | fix badam configs | 2024-05-02 02:47:04 +08:00
  Former-commit-id: 8a4e6a4c65a9a42e6501b0d3ce81d6220c287454
hoshi-hiyouga | a11a04a24f | Update train.py | 2024-05-02 02:21:27 +08:00
  Former-commit-id: 16f0d0056967872e02969fdd842a381f9484af8a
zhaonx | 2d95127c33 | "add support for vllm api stop parameter" | 2024-04-30 17:17:09 +08:00
  Former-commit-id: b9f21fa639b66db09c79404d885661c96bdf9395
codingma | 7641a214d8 | support BAdam in WebUI | 2024-04-28 11:31:34 +08:00
  Former-commit-id: 1247154dd7d5eba5d11c4bb8504bf551ab49eb72
hiyouga | 4dcd47100d | fix llava rlhf | 2024-04-28 03:01:49 +08:00
  Former-commit-id: f6863cbbcbf960d6481296c6cae3e40fd70e4e14
hiyouga | a412b4ed4a | add models to 0.7.0 | 2024-04-28 01:50:30 +08:00
  Former-commit-id: 436d3754452f839c617839ab3bbaacc4a8908e19
hiyouga | c501f377dd | release v0.7.0 | 2024-04-26 23:18:00 +08:00
  Former-commit-id: 45bb89cb4d26a6b3fb5360bc90ab950738fe4920
hiyouga | 70bed8ad8f | support Qwen1.5 110B | 2024-04-26 19:59:22 +08:00
  Former-commit-id: d6e5ecaf4109127bab24e39a0696076bceb0b37c
hiyouga | 51f776ae2a | fix llava qlora | 2024-04-26 18:00:23 +08:00
  Former-commit-id: 01c5a669f6fe598aac1758a700a7607da37db1bc
hiyouga | 697bc20941 | add llava to llamaboard | 2024-04-26 06:41:35 +08:00
  Former-commit-id: deaaff0a9de0eef9691991c99cd797461b1165cc
hiyouga | 7773ac0ead | update readme | 2024-04-26 05:44:30 +08:00
  Former-commit-id: 41728fd74de7bec0cc6135aef9dfa3ae9fe7af73
hiyouga | 23b881bff1 | support mllm hf inference | 2024-04-26 05:34:58 +08:00
  Former-commit-id: 2c7c01282acd7ddabbb17ce3246b8dae4bc4b8cf