Author | Commit | Message | Date
hiyouga | 09f3ef1de4 | fix stop param | 2024-05-07 00:41:04 +08:00
hoshi-hiyouga | 17d0005b8c | Update vllm_engine.py | 2024-05-07 00:37:05 +08:00
zhaonx96 | 1abd55dd59 | Merge branch 'main' of https://github.com/zhaonx/LLaMA-Factory into dev | 2024-05-06 10:09:00 +08:00
hiyouga | ed8f8be752 | update api and support abort eval in webui | 2024-05-04 15:59:15 +08:00
hiyouga | 245fe47ece | update webui and add CLIs | 2024-05-03 02:58:23 +08:00
zhaonx | 42edc81585 | "add support for vllm api stop parameter" | 2024-04-30 17:17:09 +08:00
hiyouga | 168f56683a | release v0.7.0 | 2024-04-26 23:18:00 +08:00
hiyouga | e057c8de48 | support mllm hf inference | 2024-04-26 05:34:58 +08:00
hiyouga | 28571da80a | vllm + lora support | 2024-04-25 20:24:31 +08:00
hiyouga | 707f0b1d5d | fix #3347 #3387 | 2024-04-24 01:30:16 +08:00
hiyouga | 6d641af703 | fix #3317 | 2024-04-17 22:17:19 +08:00
hiyouga | cce52351b5 | update examples | 2024-04-15 22:14:34 +08:00
hiyouga | 7f6e412604 | fix requires for windows | 2024-04-03 21:56:43 +08:00
hiyouga | 148bda353f | fix resize vocab at inference #3022 | 2024-04-03 18:14:24 +08:00
hiyouga | 07f9b754a7 | fix #2782 #2798 | 2024-03-12 15:53:29 +08:00
hiyouga | 412c52e325 | fix #2766 | 2024-03-09 21:35:24 +08:00
hiyouga | 5d956e2a51 | fix chat engine, update webui | 2024-03-08 03:01:53 +08:00
hiyouga | d07ad5cc1c | support vllm | 2024-03-07 20:26:31 +08:00
hiyouga | 54ea9684ed | improve fix tokenizer | 2024-02-09 14:53:14 +08:00
hiyouga | ebf31b62eb | fix #2438 | 2024-02-06 15:23:08 +08:00
hiyouga | 3e982cc714 | finish agent | 2024-01-21 01:47:33 +08:00
hiyouga | a9c18255aa | fix internlm2 template | 2024-01-20 23:33:50 +08:00
hiyouga | c550987a72 | fix cli_demo | 2024-01-20 23:27:10 +08:00
hiyouga | cf818a2598 | fix #2260 | 2024-01-20 23:22:09 +08:00
hiyouga | 638234ceee | format style | 2024-01-20 20:15:56 +08:00
hiyouga | 83dbfce8c3 | add tool test | 2024-01-18 10:26:26 +08:00
hiyouga | d9f1cae351 | support function calling | 2024-01-18 09:54:23 +08:00
hiyouga | 08464183b9 | fix api server | 2024-01-07 17:14:42 +08:00
hiyouga | 0a9c6e0146 | support system column #1765 | 2023-12-12 19:45:59 +08:00
hiyouga | 747db40172 | ppo support rm server | 2023-12-03 21:38:51 +08:00
hiyouga | 7df4f3ab20 | implement rm server #1543 | 2023-12-03 20:52:54 +08:00
hiyouga | bbbce1f516 | fix imports | 2023-11-15 16:47:45 +08:00
hiyouga | 4736344eb1 | disentangle model from tuner and rename modules | 2023-11-15 16:29:09 +08:00
hiyouga | 8b912690e3 | fix chat | 2023-11-01 23:07:58 +08:00
hiyouga | 84af10cec9 | update gradio, support multiple resp in api | 2023-11-01 23:02:16 +08:00
hiyouga | 8857e45602 | fix #887 | 2023-09-14 17:56:58 +08:00
hiyouga | a51b7c98ac | fix lora target | 2023-09-09 17:04:45 +08:00
hiyouga | b34797a845 | fix #761 | 2023-09-08 20:22:18 +08:00
hiyouga | 85b1f6632a | fix baichuan templates | 2023-09-07 18:54:14 +08:00
hiyouga | 9f4c2adc9a | fix ChatGLM2 ppo #527 #528 | 2023-08-18 00:34:59 +08:00
hiyouga | be21fc83f9 | fix generation bug #532 | 2023-08-17 22:21:34 +08:00
hiyouga | d9e62711a3 | fix generation | 2023-08-16 22:39:54 +08:00
hiyouga | 7407d9daa1 | fix system prompt | 2023-08-16 01:35:52 +08:00
hiyouga | fa940c17b8 | support rope scaling, fix #475 #476 #478 | 2023-08-12 20:46:27 +08:00
hiyouga | 3ec4351cfd | support DPO training (2305.18290) | 2023-08-11 03:02:53 +08:00
hiyouga | 3a720aac66 | update webui | 2023-08-09 00:26:11 +08:00
hiyouga | eecc4b2131 | fix tokenizer #417 | 2023-08-08 23:59:41 +08:00
hiyouga | 4b841a6b35 | fix bug | 2023-08-08 17:55:55 +08:00
hiyouga | 20cf27976f | update readme | 2023-08-07 15:02:02 +08:00
hiyouga | b4852f9406 | support chatml safe encoding | 2023-08-04 23:14:28 +08:00