Mirror of https://github.com/hiyouga/LLaMA-Factory.git, synced 2025-10-15 16:18:10 +08:00
LLaMA-Factory / src / llmtuner / tuner / core
Latest commit: a402161631 by hiyouga, "support FlashAttention2", 2023-09-10 20:43:56 +08:00
Former-commit-id: 23e56c5554b948d4f08ad87849b261eafd2c7890
| File | Last commit | Date |
| --- | --- | --- |
| __init__.py | modity code structure | 2023-07-15 16:54:28 +08:00 |
| adapter.py | fix lora target | 2023-09-09 17:04:45 +08:00 |
| loader.py | support FlashAttention2 | 2023-09-10 20:43:56 +08:00 |
| parser.py | change to right-padding, update reward score (#803) | 2023-09-08 20:04:31 +08:00 |
| trainer.py | change to right-padding, update reward score (#803) | 2023-09-08 20:04:31 +08:00 |
| utils.py | fix lora target | 2023-09-09 17:04:45 +08:00 |