diff --git a/README.md b/README.md
index 4f363099..45732220 100644
--- a/README.md
+++ b/README.md
@@ -347,7 +347,7 @@ To enable FlashAttention-2 on the Windows platform, you need to install the prec
 llamafactory-cli webui
 ```
 
-> [!TIPS]
+> [!TIP]
 > To modify the default setting in the LLaMA Board GUI, you can use environment variables, e.g., `export CUDA_VISIBLE_DEVICES=0 GRADIO_SERVER_NAME=0.0.0.0 GRADIO_SERVER_PORT=7860 GRADIO_SHARE=False` (use `set` command on Windows OS).
 
 For Alibaba Cloud users
@@ -393,7 +393,7 @@ docker compose -f ./docker-compose.yml up -d
 
 See [examples/README.md](examples/README.md) for usage.
 
-> [!TIPS]
+> [!TIP]
 > Use `llamafactory-cli train -h` to display arguments description.
 
 ### Deploy with OpenAI-style API and vLLM
diff --git a/README_zh.md b/README_zh.md
index 8f9d5513..4db1f843 100644
--- a/README_zh.md
+++ b/README_zh.md
@@ -347,7 +347,7 @@ pip install https://github.com/jllllll/bitsandbytes-windows-webui/releases/downl
 llamafactory-cli webui
 ```
 
-> [!TIPS]
+> [!TIP]
 > 您可以使用环境变量来修改 LLaMA Board 可视化界面的默认设置，例如 `export CUDA_VISIBLE_DEVICES=0 GRADIO_SERVER_NAME=0.0.0.0 GRADIO_SERVER_PORT=7860 GRADIO_SHARE=False`（Windows 系统可使用 `set` 指令）。
 
 阿里云用户指南
@@ -393,7 +393,7 @@ docker compose -f ./docker-compose.yml up -d
 
 使用方法请参考 [examples/README_zh.md](examples/README_zh.md)。
 
-> [!TIPS]
+> [!TIP]
 > 您可以执行 `llamafactory-cli train -h` 来查看参数文档。
 
 ### 利用 vLLM 部署 OpenAI API
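For reference, the tip text being corrected in both READMEs configures the LLaMA Board GUI through environment variables set before launch. A minimal sketch of that usage (the values shown are the illustrative ones from the tip itself; the final `llamafactory-cli webui` call is commented out since it requires LLaMA-Factory to be installed):

```shell
# Configure the LLaMA Board GUI via environment variables before launching.
export CUDA_VISIBLE_DEVICES=0       # expose only GPU 0
export GRADIO_SERVER_NAME=0.0.0.0   # listen on all network interfaces
export GRADIO_SERVER_PORT=7860      # Gradio server port
export GRADIO_SHARE=False           # do not create a public Gradio share link

# On Windows, use `set` instead of `export`, e.g.:
#   set GRADIO_SERVER_PORT=7860

# Then start the GUI:
# llamafactory-cli webui
```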