Sun Jinfeng | f94b54b776 | 2024-05-13 09:29:58 +08:00
    Merge branch 'hiyouga:main' into main
    Former-commit-id: 014acaa7845b7ac2876596d216b1be369a8e9311

hiyouga | 1e1b8899f5 | 2024-05-12 01:28:51 +08:00
    lint
    Former-commit-id: cb72eb6ab24615ce492ca2945f29daa34c0c52d4

hiyouga | 7b02c83399 | 2024-05-12 01:25:16 +08:00
    fix #3658
    Former-commit-id: 37799a62d4431d1d8c02fee6c23d607a65723c1a

hiyouga | 8f1ba07b30 | 2024-05-12 01:10:30 +08:00
    remove checksum and fix ui args
    Former-commit-id: 0cfdeb1d30efb63211434bc4656bceb59e666289

hoshi-hiyouga | 1ce400bddf | 2024-05-12 00:49:00 +08:00
    Merge pull request #3654 from betapeanut/main
    Remove Redundant Environment Variable Usage
    Former-commit-id: aa57a2a183eef822973d7e5d7c7bc80a42167482

hiyouga | 25d316b1a0 | 2024-05-12 00:03:59 +08:00
    fix #3674
    Former-commit-id: 6bad2eafef75ec697477e1f2ce739006042fb4c7

hiyouga | 2bcd5b2b73 | 2024-05-12 00:02:49 +08:00
    fix llava config
    Former-commit-id: b13d032325e45d401a9dbc64d4c73e308eff3288

hoshi-hiyouga | 436afcba57 | 2024-05-11 23:59:08 +08:00
    Merge pull request #3651 from BUAADreamer/main
    add some mllm features and try to incorporate Chinese-LLaVA-Med project
    Former-commit-id: 143d311d4a82e1fa9b6d4ad98b0db5b02f3572c4

hoshi-hiyouga | db47c53486 | 2024-05-11 23:58:47 +08:00
    Update loader.py
    Former-commit-id: 2fc12790414677bb82736208fb9547640780af2e

hoshi-hiyouga | 4efe56fd68 | 2024-05-11 23:57:05 +08:00
    Update model_args.py
    Former-commit-id: c4114add4c42c1d7723f7270451a6c9fc656ecd1

hoshi-hiyouga | d54313fcf9 | 2024-05-11 23:56:40 +08:00
    Update patcher.py
    Former-commit-id: 2c88d394d29c6e98ac3a6860848855722614ca52

hoshi-hiyouga | 382f096475 | 2024-05-11 23:55:59 +08:00
    Update tuner.py
    Former-commit-id: ccd1eb2c0992f75440c0e1c5cd3f02d03aacb085

hoshi-hiyouga | 0ccc76392e | 2024-05-11 23:54:53 +08:00
    Update tuner.py
    Former-commit-id: 22afcbdb25160583e5ece28fad0585c7bc70f41a

BUAADreamer | fdf38b70a0 | 2024-05-11 13:11:10 +08:00
    Merge branch 'main' of https://github.com/BUAADreamer/LLaMA-Factory
    Former-commit-id: 50cc5cf93d50c42cfcf5047bcd9b5c7959d503ae

BUAADreamer | 1a78b675be | 2024-05-11 13:11:00 +08:00
    add full parameter finetuning of mllm
    Former-commit-id: f90c1da5636ac3cb8112c5081a3b56b09a17fcf8

kkkl | 9b1008912c | 2024-05-11 00:22:40 +08:00
    Update constants.py
    Fix the download issue of the Phi3 model
    Former-commit-id: 8978e80914ac6db1ed1b79641b20c84087dd4341

BUAADreamer | 18241f4ed8 | 2024-05-10 20:34:41 +08:00
    Merge branch 'hiyouga:main' into main
    Former-commit-id: 0dd072703508f68fd4ee51b6648d0c7642a4cc93

hiyouga | 223bbd9930 | 2024-05-09 16:52:27 +08:00
    resolve python 3.8 package
    Former-commit-id: 5eee4ec7016846356715a4fa1ad58e3cbb1cac6e
Tendo33 | 9dadff90bb | 2024-05-09 14:28:01 +08:00
    1. Rename the is_fastapi_available function
    2. Add request logging when deploying with vLLM
    Former-commit-id: 530d4f5d51c13c71d99de5fe2d23805b0aa875a2
BUAADreamer | 827a929f1d | 2024-05-09 14:05:19 +08:00
    add push processor to hub
    Former-commit-id: 7a05a965311edfdfafa57af8342875860d341f27

BUAADreamer | 47892418ad | 2024-05-09 13:45:43 +08:00
    Merge branch 'hiyouga:main' into main
    Former-commit-id: 1f3163509ecd05902ea216a905b4ca15ddd3696f

cocktailpeanut | 2aeae4b88b | 2024-05-09 01:33:20 -04:00
    yet another removal of unnecessary environment variables
    Former-commit-id: a07726028f0287de28e4751672b27efe0efc6477

cocktailpeanut | c213f2a9a9 | 2024-05-09 01:32:00 -04:00
    more removal of unnecessary environment variables
    Former-commit-id: 59ef1a6e0d81585a6c010143d05fcfae26d40c00

cocktailpeanut | 333f4a69bb | 2024-05-09 01:26:15 -04:00
    remove unnecessary environment variable usage
    Former-commit-id: 4be1d832cb269a07987f5cab5d5f949e269087da

BUAADreamer | 172600d432 | 2024-05-08 22:50:42 +08:00
    add mllm export
    Former-commit-id: ce4770d33f6761d3b1d60661efcb0be34a036154

hiyouga | 4ce4172c87 | 2024-05-08 17:12:56 +08:00
    fix #3625
    Former-commit-id: 8c0f5d1db29862277d84aa128b424b7d0f2b187f

hiyouga | 400ae144a4 | 2024-05-08 17:10:03 +08:00
    add llama3 chinese chat
    Former-commit-id: ee3e5920f2f28567259693cb106e884a90cb02a2

hiyouga | 0a1b6ca5a7 | 2024-05-08 16:37:54 +08:00
    add deepseek moe 236B
    Former-commit-id: 30c10e2dc41b5d64191a91ad2d61f3b5c440b1d5

BUAADreamer | 05ef89cfcc | 2024-05-08 10:36:36 +08:00
    modify export model
    Former-commit-id: c7051edae4ce23f85daf204a2aaac134b1f29c3d

hiyouga | f6ac3796ca | 2024-05-07 19:03:35 +08:00
    fix #3560
    Former-commit-id: ea69cbe903a301df1bcc4b63cdc5bd4c6e3a8255

hiyouga | ebab655683 | 2024-05-07 17:50:27 +08:00
    fix #3602
    Former-commit-id: 1518b45490606ea200482da4737113c46985e8c5

hiyouga | e3b3a722de | 2024-05-07 00:41:04 +08:00
    fix stop param
    Former-commit-id: f0a850c25211b72eddbb357c81679db9b0930d44

hoshi-hiyouga | b9e167e6ca | 2024-05-07 00:37:49 +08:00
    Merge pull request #3527 from zhaonx/dev
    "add support for vllm api stop parameter"
    Former-commit-id: e7d436403af6ac4c6a33cf36411098a0b0fefce2

hoshi-hiyouga | 1ebd1e50e7 | 2024-05-07 00:37:05 +08:00
    Update vllm_engine.py
    Former-commit-id: fa2410de07150a82082ab5b88baf56aa891db870

hoshi-hiyouga | 14316f6583 | 2024-05-07 00:28:16 +08:00
    Update generating_args.py
    Former-commit-id: 714957ba0159919a89fc1659a7a7b4b6bd82eead

hoshi-hiyouga | 8e4ab2f7d0 | 2024-05-07 00:27:56 +08:00
    Update generating_args.py
    Former-commit-id: 7a9fb56786f4c40856211009656a983be1e42cb7

hiyouga | da2295f8c8 | 2024-05-06 23:33:06 +08:00
    fix gradio args
    Former-commit-id: 7767c1ad4b2b638b558f941ba1f0d05d4a049507

hiyouga | 5c9da798b5 | 2024-05-06 21:47:00 +08:00
    update docs
    Former-commit-id: a4a2e94241bea6f96590f6cb8ca8b5cddee1917e
zhouwei | 3d1b0e1864 | 2024-05-06 13:29:59 +08:00
    Significantly improve Ascend 910A training efficiency (roughly tenfold) by leveraging the full computational power of the NPU through torch_npu, a PyTorch library optimized for NPUs
    Former-commit-id: 90980b626d3408b3e2ee32a02456c20881318be7
zhaonx96 | 45becd2a45 | 2024-05-06 10:10:00 +08:00
    add stop parameter in chat.py
    Former-commit-id: e529bf5bc14c72558d26f73c42076eaa9684205c
zhaonx96 | 8f1197de7e | 2024-05-06 10:09:00 +08:00
    Merge branch 'main' of https://github.com/zhaonx/LLaMA-Factory into dev
    Former-commit-id: ec1f834905e241277fdd3f764c70eede97e9ff40

hiyouga | 4674f3baa7 | 2024-05-05 02:44:35 +08:00
    add version and help to cli
    Former-commit-id: f762f2215169b9fe55564d5600b758ddc66f9c9c

hiyouga | 7ef3788ff4 | 2024-05-05 00:17:54 +08:00
    update webui
    Former-commit-id: 17a53d25cdadd2df70a8afa0488f75bbf1918b89

hiyouga | e9fe8815be | 2024-05-04 16:13:52 +08:00
    remove empty stream response
    Former-commit-id: 070d0da928b1e974a094279a2782201016d2a3ab

hiyouga | 9381fecca7 | 2024-05-04 16:11:18 +08:00
    fix async stream api response
    Former-commit-id: d70bbcae6513e50aa6094f2d98c4aa5c6641ea02

hiyouga | efa9140577 | 2024-05-04 15:59:15 +08:00
    update api and support abort eval in webui
    Former-commit-id: 8661bed68812e9ded9439e8a821b1d7716bc797b

hiyouga | 37bcbf72b4 | 2024-05-04 00:43:02 +08:00
    update readme and webui launch
    Former-commit-id: c66ffa57323ef6ea78a9b75ec5122d9ea25fd420

hiyouga | 182b974786 | 2024-05-04 00:19:19 +08:00
    fix eval in webui
    Former-commit-id: 774ef2bf5823d68b9cc254a676f5adb4af533d75

hiyouga | 7a4a6a5522 | 2024-05-03 23:15:19 +08:00
    fix webui resume
    Former-commit-id: c2f6582ddd365bb64b72e8057cc4ecd7884d2480

hiyouga | 2383e5440c | 2024-05-03 23:06:52 +08:00
    fix slow op in dpo/orpo trainer
    Former-commit-id: 38cad0896ea0516de6d4b2759ec9d45ee67d339b