| Author | Commit | Message | Date |
| --- | --- | --- | --- |
| Yaowei Zheng | 5817583630 | [deps] bump transformers to 4.49.0 (#8564) | 2025-07-07 20:31:50 +08:00 |
| Yaowei Zheng | ed57b7ba2a | [webui] upgrade webui and fix api (#8460) | 2025-06-25 21:59:58 +08:00 |
| Yaowei Zheng | b10333dafb | [model] do not force load processor (#8457) | 2025-06-25 19:43:00 +08:00 |
| Yaowei Zheng | fba9c9d9b9 | [deps] upgrade transformers to 4.52.4 (#8245) | 2025-05-31 16:51:40 +08:00 |
| hoshi-hiyouga | dc8cca11b3 | [deps] upgrade transformers (#8159) | 2025-05-26 22:03:58 +08:00 |
| hoshi-hiyouga | b0c8ba73e0 | [deps] update to transformers 4.52 (#8125) | 2025-05-21 05:16:18 +08:00 |
| hoshi-hiyouga | f3fd67a9bb | [model] switch to gptqmodel (#8108) | 2025-05-19 22:25:40 +08:00 |
| hoshi-hiyouga | 610f164c69 | [trainer] fix pt loss (#7748)<br>* fix pt loss<br>* robust<br>* fix<br>* test | 2025-04-17 03:15:35 +08:00 |
| hoshi-hiyouga | 0a0cfeb782 | [breaking] bump transformers to 4.45.0 & improve ci (#7746)<br>* update ci<br>* fix<br>* fix<br>* fix<br>* fix<br>* fix | 2025-04-17 02:36:48 +08:00 |
| hoshi-hiyouga | ac8c6fdd3a | [assets] update model readme (#7724) | 2025-04-15 00:41:09 +08:00 |
| hoshi-hiyouga | 3a13d2cdb1 | [misc] fix env vars (#7715) | 2025-04-14 16:04:04 +08:00 |
| hoshi-hiyouga | 1fd4d14fbb | [deps] upgrade transformers (#7704) | 2025-04-13 18:11:34 +08:00 |
| jilongW | 3bdc7e1e6c | [misc] fix cuda warn on intel GPU (#7655) | 2025-04-09 21:37:54 +08:00 |
| hoshi-hiyouga | 34fdabe005 | [data] add coig-p dataset (#7657) | 2025-04-09 21:18:25 +08:00 |
| hoshi-hiyouga | 39876b85fc | [assets] update readme (#7644) | 2025-04-09 01:06:06 +08:00 |
| hoshi-hiyouga | 6c200fd218 | [model] add llama4 (#7611) | 2025-04-06 13:42:31 +08:00 |
| hoshi-hiyouga | aaf2e6ba2a | [model] fix kv cache (#7564) | 2025-04-01 23:07:46 +08:00 |
| hoshi-hiyouga | 59e12bffe8 | [model] add qwen2vl 32b & upgrade peft (#7469)<br>* add qwen2vl 32b<br>* fix ci<br>* upgrade peft to 0.15<br>* fix ci<br>* fix ci | 2025-03-25 12:15:58 +08:00 |
| GuoCoder | b6d8749bf3 | [model] fix lora on quant models (#7456)<br>Co-authored-by: root <root@ai> | 2025-03-25 11:59:46 +08:00 |
| hoshi-hiyouga | b1b78daf06 | [deps] upgrade transformers to 4.50.0 (#7437)<br>* upgrade transformers<br>* fix hf cache<br>* fix dpo trainer | 2025-03-23 17:44:27 +08:00 |
| Qiaolin Yu | 30038d9ce7 | [inference] support sglang backend (#7278)<br>* Mimic SGLang offline Engine<br>* Add more tests and args<br>* Pass all current tests<br>* Clean Code<br>* fix sample_params<br>* clean code<br>* Fix Stream Chat<br>* change sglang from engine mode to server mode<br>* fix<br>* Fix Review Issues<br>* Use SGLang Built-In Utilities<br>* Fix test SGLang<br>* Some Doc Issue<br>* fix sglang engine<br>* add readme<br>Co-authored-by: Jin Pan <jpan236@wisc.edu><br>Co-authored-by: hiyouga <hiyouga@buaa.edu.cn> | 2025-03-15 04:37:58 +08:00 |
| hoshi-hiyouga | 9ccfb97a2c | [misc] update format (#7277) | 2025-03-13 02:53:08 +08:00 |
| hoshi-hiyouga | 142fd7e755 | [misc] upgrade deps (#7257) | 2025-03-12 00:33:47 +08:00 |
| hoshi-hiyouga | 7c1640ed5f | [misc] upgrade format to py39 (#7256) | 2025-03-12 00:08:41 +08:00 |
| hoshi-hiyouga | 3fbd4848e8 | [version] support transformers 449 (#6982)<br>* support transformers 449<br>* fix mm plugin<br>Former-commit-id: b00b290c07beb560a5af857ce64f4ce424831a2c | 2025-02-18 17:05:40 +08:00 |
| hoshi-hiyouga | ff6658ad27 | [deps] upgrade vllm (#6857)<br>Former-commit-id: 5f38bcaba921dbdee27b4be4709fcec06fa37c9e | 2025-02-08 15:02:28 +08:00 |
| hoshi-hiyouga | 445d643ef3 | [model] add mistral small models (#6786)<br>Former-commit-id: 94803d8133fbbadff6d224cb6695feb5434fd4fd | 2025-02-01 04:31:38 +08:00 |
| hoshi-hiyouga | f6779b0e0c | [breaking] support transformers 4.48 (#6628)<br>Former-commit-id: 15357cdad953bba1f2d294819f56b9746ed1b891 | 2025-01-31 01:36:33 +08:00 |
| hiyouga | da542fad18 | improve log<br>Former-commit-id: 47e17dd689840ca9b3c5f34448e5f80265336cca | 2025-01-08 09:56:10 +00:00 |
| hiyouga | b4174021d6 | refactor ray integration, support save ckpt<br>Former-commit-id: d8cac6f54663e6cffeddf2c65e3da454e7b86a75 | 2025-01-07 09:39:10 +00:00 |
| hiyouga | 813f5919a3 | fix #6482<br>Former-commit-id: 6f5bb3b8e5b6eb7fdfd7b0ca8eba789ab741a7b6 | 2024-12-30 06:03:07 +00:00 |
| hiyouga | 235cdcacee | support batch infer in vllm<br>Former-commit-id: 1324d158f954d777f1fbf09f46149c372704b388 | 2024-12-04 13:50:00 +00:00 |
| Ting | e27a0c3d53 | code refactor<br>Former-commit-id: 40627c601efc9f144a227dded8c6b40babff4e8b | 2024-11-19 20:33:18 +08:00 |
| hiyouga | 3730fc046f | update datasets version<br>Former-commit-id: c5fae465ec8cbc30f9e91e6c32b88e74c805874a | 2024-11-04 07:52:26 +00:00 |
| hiyouga | e83cb17f97 | support rank0 logger<br>Former-commit-id: c38aa29336f286266553da4909a7267d7ef21f37 | 2024-11-02 18:31:04 +08:00 |
| hiyouga | 584ce3a105 | fix incorrect loss value for vlms<br>Former-commit-id: 30567a1487727473950104718e626ff660f10cbb | 2024-10-30 08:56:46 +00:00 |
| hiyouga | 163cf2ba5c | update requires<br>Former-commit-id: 77666bd2278a3cfe5b567f4fe285b0f93871d166 | 2024-10-29 16:10:07 +08:00 |
| hiyouga | e90a1199da | tiny fix<br>Former-commit-id: 3af57795dda5d236200bad4aa3f2e29ae8930fe2 | 2024-10-11 23:51:54 +08:00 |
| huniu20 | 5a3280ebee | bugs fixed<br>Former-commit-id: 843b5d85e98b312e5d41ce62ec10e199011beb8c | 2024-10-11 19:56:13 +08:00 |
| huniu20 | 26e897e861 | 1. add modelers hub support<br>Former-commit-id: 24ebe187e360753666b768685a0dcc78054bb702 | 2024-10-09 17:21:37 +08:00 |
| hiyouga | 4464a6ff5b | tiny fix<br>Former-commit-id: 451d271718a8026056d0f7d7b8ab333391d24ad4 | 2024-10-08 17:48:56 +08:00 |
| hiyouga | 38505ae9e1 | update accelerate ver for schedule_free optimizers<br>Former-commit-id: bdde35fd2e4a919c1d63ebfc9a0ea8ba0c97e14c | 2024-09-09 22:51:08 +08:00 |
| hiyouga | cb776752f6 | fix mixed mm inputs and rlhf-v<br>Former-commit-id: 9967ccb3aef3ca557ad6eafb78c6c99866857008 | 2024-09-01 20:52:47 +08:00 |
| hiyouga | a83756b5e9 | refactor mm training<br>Former-commit-id: 3382317e32f88ed377d3e7759bdeaf0f2559d22a | 2024-08-30 02:14:31 +08:00 |
| hiyouga | 21d3976eea | fix #5295<br>Former-commit-id: ad72f3e06593f124d661d61774def336511716e0 | 2024-08-29 20:30:18 +08:00 |
| hiyouga | 7b5834b2dd | tiny fix<br>Former-commit-id: f6ae4e75ddaeb4ac4a527f0141ac5b1afefde10e | 2024-08-27 12:49:32 +08:00 |
| hiyouga | daebca2368 | tiny fix<br>Former-commit-id: c8b4c7fee5398654683b713ad5c03b5daf13218a | 2024-08-20 00:10:52 +08:00 |
| hoshi-hiyouga | 5582674f06 | Merge pull request #5188 from Zxilly/main<br>fix: report correct device count for intel xpu<br>Former-commit-id: d39f4a62d3c5a3bbbf39d1eb4b92439acedae18e | 2024-08-19 23:51:39 +08:00 |
| Ricardo | a9312387bc | _is_bf16_available judgment supports npu<br>Former-commit-id: 384ab8db84eef7d1f6a7643c15c565a7d4906a5c | 2024-08-16 02:58:22 +00:00 |
| Zxilly | 41a8387195 | fix: report correct device count for intel xpu<br>Former-commit-id: dc36fcc3de721bdd28edd4eed36677e59a7614be | 2024-08-15 08:30:43 +00:00 |