423A35C7 / LLaMA-Factory
Mirror of https://github.com/hiyouga/LLaMA-Factory.git, synced 2025-08-04 20:52:59 +08:00
LLaMA-Factory / src / llmtuner / extras

Latest commit 982e0e79c2 by hiyouga: fix flashattn warning
Former-commit-id: 4bd8e3906d09bf6ec4b8f6b553a347fca9db4f80
2023-11-10 18:34:54 +08:00
Name          Last commit message                       Last commit date
patches       fix flashattn warning                     2023-11-10 18:34:54 +08:00
__init__.py   modity code structure                     2023-07-15 16:54:28 +08:00
callbacks.py  fix eval resuming in webui                2023-10-15 15:45:38 +08:00
constants.py  refactor constants                        2023-11-10 14:16:10 +08:00
logging.py    support rope scaling, fix #475 #476 #478  2023-08-12 20:46:27 +08:00
misc.py       refactor model_dtype, fix PPO trainer     2023-10-11 23:16:01 +08:00
ploting.py    release v0.1.0                            2023-07-18 00:18:25 +08:00
template.py   refactor constants                        2023-11-10 14:16:10 +08:00