## How to Fine-tune DeepSeek-Coder
We provide the script `finetune_deepseekcoder.py` for users to fine-tune our models on downstream tasks.
The script supports training with DeepSpeed. Install the required packages first:

```bash
pip install -r requirements.txt
```
Please follow the Sample Dataset Format to prepare your training data. Each line is a JSON-serialized string with two required fields, `instruction` and `output`.
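For reference, here is a minimal sketch of producing such a file in Python; the example content and the file name `train.jsonl` are illustrative, not requirements of the script:

```python
import json

# Each training example needs the two required fields: "instruction" and "output".
samples = [
    {
        "instruction": "Write a Python function that returns the square of a number.",
        "output": "def square(x):\n    return x * x",
    },
]

# One JSON-serialized example per line (JSON Lines format).
with open("train.jsonl", "w") as f:
    for sample in samples:
        f.write(json.dumps(sample) + "\n")
```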
After data preparation, you can use the sample shell script below to fine-tune `deepseek-ai/deepseek-coder-6.7b-instruct`. Remember to specify `DATA_PATH` and `OUTPUT_PATH`, and choose hyper-parameters (e.g., `learning_rate`, `per_device_train_batch_size`) appropriate for your scenario.
```bash
DATA_PATH="<your_data_path>"
OUTPUT_PATH="<your_output_path>"
MODEL="deepseek-ai/deepseek-coder-6.7b-instruct"

deepspeed finetune_deepseekcoder.py \
    --model_name_or_path $MODEL \
    --data_path $DATA_PATH \
    --output_dir $OUTPUT_PATH \
    --num_train_epochs 3 \
    --model_max_length 1024 \
    --per_device_train_batch_size 16 \
    --per_device_eval_batch_size 1 \
    --gradient_accumulation_steps 4 \
    --evaluation_strategy "no" \
    --save_strategy "steps" \
    --save_steps 100 \
    --save_total_limit 100 \
    --learning_rate 2e-5 \
    --warmup_steps 10 \
    --logging_steps 1 \
    --lr_scheduler_type "cosine" \
    --gradient_checkpointing True \
    --report_to "tensorboard" \
    --deepspeed configs/ds_config_zero3.json \
    --bf16 True
```
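Once training finishes, the checkpoint saved to `OUTPUT_PATH` can be loaded with the standard `transformers` API. A minimal sketch, assuming the path and prompt below are placeholders for your own:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned checkpoint produced by the script above (path is a placeholder).
tokenizer = AutoTokenizer.from_pretrained("<your_output_path>")
model = AutoModelForCausalLM.from_pretrained("<your_output_path>")

# Run a quick generation to sanity-check the fine-tuned model.
inputs = tokenizer("Write a quicksort function in Python.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```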