Mirror of https://github.com/hiyouga/LLaMA-Factory.git, synced 2025-08-03 04:02:49 +08:00
Update README.md
Former-commit-id: 25d326e13543894e562173c7afc9b28bff8bd1aa
parent eafde5b73f
commit bff1709a8c
@@ -12,6 +12,10 @@
\[ English | [中文](README_zh.md) \]
Launch an all-in-one Web UI via `python src/train_web.py`.
https://github.com/hiyouga/LLaMA-Factory/assets/16256802/6ba60acc-e2e2-4bec-b846-2d88920d5ba1
## Changelog
[23/09/27] We supported **$S^2$-Attn** proposed by [LongLoRA](https://github.com/dvlab-research/LongLoRA) for the LLaMA models. Try the `--shift_attn` argument to enable shift short attention.
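A minimal sketch of how the new flag might be combined with a LoRA fine-tuning run. The script name, model, and the other arguments are illustrative placeholders modeled on the project's usual CLI style, not values taken from this commit; only `--shift_attn` comes from the changelog entry above.

```shell
# Hypothetical invocation: enable S^2-Attn (shift short attention) during
# LoRA fine-tuning of a LLaMA model. All arguments except --shift_attn
# are assumptions for illustration.
python src/train_bash.py \
    --stage sft \
    --model_name_or_path meta-llama/Llama-2-7b-hf \
    --do_train \
    --finetuning_type lora \
    --shift_attn \
    --output_dir saves/llama2-7b-lora-shift-attn
```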