Merge pull request #234 from wangfuchun-fc/patch-1

fix: fix readme doc typo.
Xingkai Yu 2025-01-07 17:53:28 +08:00 committed by GitHub
commit ee4c4ea32b


@@ -283,13 +283,13 @@ python convert.py --hf-ckpt-path /path/to/DeepSeek-V3 --save-path /path/to/DeepS
 Then you can chat with DeepSeek-V3:
 ```shell
-torchrun --nnodes 2 --nproc-per-node 8 generate.py --node-rank $RANK --master-addr $ADDR --ckpt-path /path/to/DeepSeek-V3-Demo --config configs/config_671B.json --interactive --temperature 0.7 --max-new-tokens 200
+torchrun --nnodes 2 --nproc-per-node 8 --node-rank $RANK --master-addr $ADDR generate.py --ckpt-path /path/to/DeepSeek-V3-Demo --config configs/config_671B.json --interactive --temperature 0.7 --max-new-tokens 200
 ```
 Or batch inference on a given file:
 ```shell
-torchrun --nnodes 2 --nproc-per-node 8 generate.py --node-rank $RANK --master-addr $ADDR --ckpt-path /path/to/DeepSeek-V3-Demo --config configs/config_671B.json --input-file $FILE
+torchrun --nnodes 2 --nproc-per-node 8 --node-rank $RANK --master-addr $ADDR generate.py --ckpt-path /path/to/DeepSeek-V3-Demo --config configs/config_671B.json --input-file $FILE
 ```
 ### 6.2 Inference with SGLang (recommended)
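
Why the reordering is a functional fix rather than a style change: torchrun only consumes options placed before the script name as launcher options, and everything after `generate.py` is forwarded to the script itself, so `--node-rank` and `--master-addr` must precede `generate.py` to reach the launcher. Below is a minimal sketch of running the corrected interactive command on two nodes; the master address `10.0.0.1` and the `export` lines are illustrative assumptions, not part of the README.

```shell
# On node 0 (hypothetical master address; substitute your own cluster values):
export RANK=0
export ADDR=10.0.0.1
torchrun --nnodes 2 --nproc-per-node 8 --node-rank $RANK --master-addr $ADDR \
    generate.py --ckpt-path /path/to/DeepSeek-V3-Demo --config configs/config_671B.json \
    --interactive --temperature 0.7 --max-new-tokens 200

# On node 1, run the same command with RANK=1; ADDR still points at node 0.
```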