update readme

Former-commit-id: eaf83847ef6d89d8b70429138e73b04fd2aa3ef8
This commit is contained in:
hiyouga
2024-05-04 17:01:21 +08:00
parent e9fe8815be
commit 6eda42eb7c
2 changed files with 2 additions and 2 deletions

@@ -339,7 +339,7 @@ To enable FlashAttention-2 on the Windows platform, you need to install the prec
### Train with LLaMA Board GUI (powered by [Gradio](https://github.com/gradio-app/gradio))
> [!IMPORTANT]
-> LLaMA Board GUI only supports training on a single GPU, please use [CLI](#command-line-interface) for distributed training.
+> LLaMA Board GUI only supports training on a single GPU, please use [CLI](#train-with-command-line-interface) for distributed training.
#### Use local environment
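For context, the note above distinguishes single-GPU GUI use from CLI-based distributed training. The commands below are a minimal sketch of how that typically looks in this project; the `llamafactory-cli webui` / `llamafactory-cli train` entry points and the example config path are assumptions for illustration, not taken from this commit's diff.

```bash
# Sketch (assumed commands): pin the LLaMA Board GUI to a single GPU,
# since the GUI itself does not manage distributed training.
CUDA_VISIBLE_DEVICES=0 llamafactory-cli webui

# Distributed runs would go through the CLI instead, e.g. a config-driven
# launch (the YAML path below is illustrative only):
# llamafactory-cli train examples/lora_multi_gpu/llama3_lora_sft.yaml
```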