hoshi-hiyouga
|
8e0b2a5e6f
|
Update tuner.py
Former-commit-id: ccd1eb2c0992f75440c0e1c5cd3f02d03aacb085
|
2024-05-11 23:55:59 +08:00 |
|
hoshi-hiyouga
|
a2a23451e9
|
Update tuner.py
Former-commit-id: 22afcbdb25160583e5ece28fad0585c7bc70f41a
|
2024-05-11 23:54:53 +08:00 |
|
BUAADreamer
|
8ddf13491f
|
Merge branch 'main' of https://github.com/BUAADreamer/LLaMA-Factory
Former-commit-id: 50cc5cf93d50c42cfcf5047bcd9b5c7959d503ae
|
2024-05-11 13:11:10 +08:00 |
|
BUAADreamer
|
e0a3128460
|
add full parameter finetuning of mllm
Former-commit-id: f90c1da5636ac3cb8112c5081a3b56b09a17fcf8
|
2024-05-11 13:11:00 +08:00 |
|
kkkl
|
7fc8921111
|
Update constants.py
Fix the download issue of the Phi3 model
Former-commit-id: 8978e80914ac6db1ed1b79641b20c84087dd4341
|
2024-05-11 00:22:40 +08:00 |
|
BUAADreamer
|
8be76c503c
|
Merge branch 'hiyouga:main' into main
Former-commit-id: 0dd072703508f68fd4ee51b6648d0c7642a4cc93
|
2024-05-10 20:34:41 +08:00 |
|
hiyouga
|
0aa218d39a
|
resolve python 3.8 package
Former-commit-id: 5eee4ec7016846356715a4fa1ad58e3cbb1cac6e
|
2024-05-09 16:52:27 +08:00 |
|
Tendo33
|
8b3a1cbb1c
|
1. Change the name of the is_fastapi_available function
2. Add logging of incoming requests when deploying with vllm
Former-commit-id: 530d4f5d51c13c71d99de5fe2d23805b0aa875a2
|
2024-05-09 14:28:01 +08:00 |
|
BUAADreamer
|
1a01dc288b
|
add push processor to hub
Former-commit-id: 7a05a965311edfdfafa57af8342875860d341f27
|
2024-05-09 14:05:19 +08:00 |
|
BUAADreamer
|
18a17e2a74
|
Merge branch 'hiyouga:main' into main
Former-commit-id: 1f3163509ecd05902ea216a905b4ca15ddd3696f
|
2024-05-09 13:45:43 +08:00 |
|
cocktailpeanut
|
6937ea9079
|
yet another removal of unnecessary environment variables
Former-commit-id: a07726028f0287de28e4751672b27efe0efc6477
|
2024-05-09 01:33:20 -04:00 |
|
cocktailpeanut
|
e7275a75f7
|
more removal of unnecessary environment variables
Former-commit-id: 59ef1a6e0d81585a6c010143d05fcfae26d40c00
|
2024-05-09 01:32:00 -04:00 |
|
cocktailpeanut
|
4fc4b26bb0
|
remove unnecessary environment variable usage
Former-commit-id: 4be1d832cb269a07987f5cab5d5f949e269087da
|
2024-05-09 01:26:15 -04:00 |
|
BUAADreamer
|
c5cfe458e8
|
add mllm export
Former-commit-id: ce4770d33f6761d3b1d60661efcb0be34a036154
|
2024-05-08 22:50:42 +08:00 |
|
hiyouga
|
36f118b66f
|
fix #3625
Former-commit-id: 8c0f5d1db29862277d84aa128b424b7d0f2b187f
|
2024-05-08 17:12:56 +08:00 |
|
hiyouga
|
8c691c003d
|
add llama3 chinese chat
Former-commit-id: ee3e5920f2f28567259693cb106e884a90cb02a2
|
2024-05-08 17:10:03 +08:00 |
|
hiyouga
|
432245fb0c
|
add deepseek moe 236B
Former-commit-id: 30c10e2dc41b5d64191a91ad2d61f3b5c440b1d5
|
2024-05-08 16:37:54 +08:00 |
|
BUAADreamer
|
042e41b16b
|
modify export model
Former-commit-id: c7051edae4ce23f85daf204a2aaac134b1f29c3d
|
2024-05-08 10:36:36 +08:00 |
|
hiyouga
|
9f2ea1c4cf
|
fix #3560
Former-commit-id: ea69cbe903a301df1bcc4b63cdc5bd4c6e3a8255
|
2024-05-07 19:03:35 +08:00 |
|
hiyouga
|
913f22bf09
|
fix #3602
Former-commit-id: 1518b45490606ea200482da4737113c46985e8c5
|
2024-05-07 17:50:27 +08:00 |
|
hiyouga
|
a978b5dc4e
|
fix stop param
Former-commit-id: f0a850c25211b72eddbb357c81679db9b0930d44
|
2024-05-07 00:41:04 +08:00 |
|
hoshi-hiyouga
|
cffc848e4f
|
Merge pull request #3527 from zhaonx/dev
"add support for vllm api stop parameter"
Former-commit-id: e7d436403af6ac4c6a33cf36411098a0b0fefce2
|
2024-05-07 00:37:49 +08:00 |
|
hoshi-hiyouga
|
d239e213e0
|
Update vllm_engine.py
Former-commit-id: fa2410de07150a82082ab5b88baf56aa891db870
|
2024-05-07 00:37:05 +08:00 |
|
hoshi-hiyouga
|
619d22d03d
|
Update generating_args.py
Former-commit-id: 714957ba0159919a89fc1659a7a7b4b6bd82eead
|
2024-05-07 00:28:16 +08:00 |
|
hoshi-hiyouga
|
2a525dcc5e
|
Update generating_args.py
Former-commit-id: 7a9fb56786f4c40856211009656a983be1e42cb7
|
2024-05-07 00:27:56 +08:00 |
|
hiyouga
|
02a88bcd78
|
fix gradio args
Former-commit-id: 7767c1ad4b2b638b558f941ba1f0d05d4a049507
|
2024-05-06 23:33:06 +08:00 |
|
hiyouga
|
ce048fadfe
|
update docs
Former-commit-id: a4a2e94241bea6f96590f6cb8ca8b5cddee1917e
|
2024-05-06 21:47:00 +08:00 |
|
zhouwei
|
ed5dd3b8c5
|
Significantly improve training efficiency on the Ascend 910A (roughly tenfold) by fully utilizing the NPU (Neural Processing Unit) via torch_npu, the PyTorch library optimized for NPUs.
Former-commit-id: 90980b626d3408b3e2ee32a02456c20881318be7
|
2024-05-06 13:29:59 +08:00 |
|
zhaonx96
|
8b9b8f5c28
|
"add stop parameter in chat.py"
Former-commit-id: e529bf5bc14c72558d26f73c42076eaa9684205c
|
2024-05-06 10:10:00 +08:00 |
|
zhaonx96
|
63056386d5
|
Merge branch 'main' of https://github.com/zhaonx/LLaMA-Factory into dev
Former-commit-id: ec1f834905e241277fdd3f764c70eede97e9ff40
|
2024-05-06 10:09:00 +08:00 |
|
hiyouga
|
1b9308d21e
|
add version and help to cli
Former-commit-id: f762f2215169b9fe55564d5600b758ddc66f9c9c
|
2024-05-05 02:44:35 +08:00 |
|
hiyouga
|
6f6d11e66c
|
update webui
Former-commit-id: 17a53d25cdadd2df70a8afa0488f75bbf1918b89
|
2024-05-05 00:17:54 +08:00 |
|
hiyouga
|
33c2ab3bd1
|
remove empty stream response
Former-commit-id: 070d0da928b1e974a094279a2782201016d2a3ab
|
2024-05-04 16:13:52 +08:00 |
|
hiyouga
|
d59caef516
|
fix async stream api response
Former-commit-id: d70bbcae6513e50aa6094f2d98c4aa5c6641ea02
|
2024-05-04 16:11:18 +08:00 |
|
hiyouga
|
d2cdddd11e
|
update api and support abort eval in webui
Former-commit-id: 8661bed68812e9ded9439e8a821b1d7716bc797b
|
2024-05-04 15:59:15 +08:00 |
|
hiyouga
|
62953504ef
|
update readme and webui launch
Former-commit-id: c66ffa57323ef6ea78a9b75ec5122d9ea25fd420
|
2024-05-04 00:43:02 +08:00 |
|
hiyouga
|
8bbafa17bd
|
fix eval in webui
Former-commit-id: 774ef2bf5823d68b9cc254a676f5adb4af533d75
|
2024-05-04 00:19:19 +08:00 |
|
hiyouga
|
b62a3a8e5d
|
fix webui resume
Former-commit-id: c2f6582ddd365bb64b72e8057cc4ecd7884d2480
|
2024-05-03 23:15:19 +08:00 |
|
hiyouga
|
b3a4dcb2dc
|
fix slow op in dpo/orpo trainer
Former-commit-id: 38cad0896ea0516de6d4b2759ec9d45ee67d339b
|
2024-05-03 23:06:52 +08:00 |
|
hiyouga
|
4dfe10982d
|
fix callback log multigpu #3559
Former-commit-id: 1f105f1551b12675ca7d339ef5f91333f0371987
|
2024-05-03 21:24:27 +08:00 |
|
hiyouga
|
689912d32c
|
enable tqdm in webui
Former-commit-id: 1737bff64799047a5b715fd979b4c038ae213bb3
|
2024-05-03 04:42:50 +08:00 |
|
hiyouga
|
a340785c6a
|
fix gen_args
Former-commit-id: c3e2f4f07b7fb3b1d7d2b44451660f082a467aed
|
2024-05-03 04:24:50 +08:00 |
|
hiyouga
|
5fd6199ef8
|
fix colab gradio
Former-commit-id: 26179a29d3400d1fea155e325a79473a8bc12f04
|
2024-05-03 03:54:46 +08:00 |
|
hiyouga
|
419a3849f2
|
update webui and add CLIs
Former-commit-id: 1368dda22ab875914c9dd86ee5146a4f6a4736ad
|
2024-05-03 02:58:23 +08:00 |
|
hiyouga
|
c5c52dfc50
|
fix badam configs
Former-commit-id: 8a4e6a4c65a9a42e6501b0d3ce81d6220c287454
|
2024-05-02 02:47:04 +08:00 |
|
hoshi-hiyouga
|
3aae4df30c
|
Update train.py
Former-commit-id: 16f0d0056967872e02969fdd842a381f9484af8a
|
2024-05-02 02:21:27 +08:00 |
|
zhaonx
|
e22af297ee
|
"add support for vllm api stop parameter"
Former-commit-id: b9f21fa639b66db09c79404d885661c96bdf9395
|
2024-04-30 17:17:09 +08:00 |
|
codingma
|
812aceb1d5
|
support BAdam in WebUI
Former-commit-id: 1247154dd7d5eba5d11c4bb8504bf551ab49eb72
|
2024-04-28 11:31:34 +08:00 |
|
hiyouga
|
c52cab3e40
|
fix llava rlhf
Former-commit-id: f6863cbbcbf960d6481296c6cae3e40fd70e4e14
|
2024-04-28 03:01:49 +08:00 |
|
hiyouga
|
d87d442499
|
add models to 0.7.0
Former-commit-id: 436d3754452f839c617839ab3bbaacc4a8908e19
|
2024-04-28 01:50:30 +08:00 |
|