update readme

hiyouga
2024-05-04 17:01:21 +08:00
parent e984ba3167
commit 57a39783d1
2 changed files with 2 additions and 2 deletions

@@ -339,7 +339,7 @@ To enable FlashAttention-2 on the Windows platform, you need to install the prec
### Train with LLaMA Board GUI (powered by [Gradio](https://github.com/gradio-app/gradio))
> [!IMPORTANT]
-> LLaMA Board GUI only supports training on a single GPU, please use [CLI](#command-line-interface) for distributed training.
+> LLaMA Board GUI only supports training on a single GPU, please use [CLI](#train-with-command-line-interface) for distributed training.
#### Use local environment