From 3d34d44497e02d8b7758113a4f2ae8d2fe28fd5a Mon Sep 17 00:00:00 2001
From: hoshi-hiyouga <hiyouga@buaa.edu.cn>
Date: Sun, 15 Oct 2023 20:23:22 +0800
Subject: [PATCH] Update README.md

Former-commit-id: e6fcc1831dadd2ec2c0acb14697a35f6471139ab
---
 README.md | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/README.md b/README.md
index 312364a1..0e75bec0 100644
--- a/README.md
+++ b/README.md
@@ -12,6 +12,10 @@
 
 \[ English | [中文](README_zh.md) \]
 
+Launch an all-in-one Web UI via `python src/train_web.py`.
+
+https://github.com/hiyouga/LLaMA-Factory/assets/16256802/6ba60acc-e2e2-4bec-b846-2d88920d5ba1
+
 ## Changelog
 
 [23/09/27] We supported **$S^2$-Attn** proposed by [LongLoRA](https://github.com/dvlab-research/LongLoRA) for the LLaMA models. Try `--shift_attn` argument to enable shift short attention.
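
For context, a minimal sketch of how the commands referenced in this patch might be invoked. `python src/train_web.py` and `--shift_attn` come directly from the README lines above; the `src/train_bash.py` entry point, model name, dataset, and remaining flags are illustrative assumptions, not confirmed by this patch.

```bash
# Launch the all-in-one Web UI mentioned in the added README lines.
python src/train_web.py

# Hypothetical CLI fine-tuning run with shift short attention enabled
# (--shift_attn is from the changelog entry; every other flag here is an
#  illustrative assumption about the project's training script).
python src/train_bash.py \
    --stage sft \
    --do_train \
    --model_name_or_path meta-llama/Llama-2-7b-hf \
    --dataset alpaca_gpt4_en \
    --finetuning_type lora \
    --shift_attn \
    --output_dir output/llama2-7b-sft
```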