423A35C7 / LLaMA-Factory
Mirror of https://github.com/hiyouga/LLaMA-Factory.git (synced 2025-08-05 05:02:50 +08:00)
LLaMA-Factory / src / llmtuner / tuner

Latest commit: 8ab5566dc0 by hiyouga, 2023-09-10 20:43:56 +08:00
support FlashAttention2
Former-commit-id: d8aa1404bee9842f3e4cd037ad8d66c85470ac37
Name          Last commit message                        Date
core          support FlashAttention2                    2023-09-10 20:43:56 +08:00
dpo           fix bug in DPO data collator               2023-09-08 20:45:07 +08:00
ppo           fix lora target                            2023-09-09 17:04:45 +08:00
pt            update training resuming                   2023-08-18 01:41:17 +08:00
rm            fix lora target                            2023-09-09 17:04:45 +08:00
sft           support FlashAttention2                    2023-09-10 20:43:56 +08:00
__init__.py   modify code structure                      2023-08-02 23:17:36 +08:00
tune.py       support rope scaling, fix #475 #476 #478   2023-08-12 20:46:27 +08:00