## How to Fine-tune DeepSeek-Coder

We provide the script `finetune_deepseekcoder.py` for users to fine-tune our models on downstream tasks.

The script supports training with DeepSpeed. You need to install the required packages first:

```bash
pip install -r requirements.txt
```

Please follow the Sample Dataset Format to prepare your training data. Each line is a JSON-serialized string with two required fields: `instruction` and `output`.
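For example, one line of the training file might look like the following (the content is illustrative; only the `instruction` and `output` fields are required):

```json
{"instruction": "Write a Python function that returns the factorial of n.", "output": "def factorial(n):\n    return 1 if n <= 1 else n * factorial(n - 1)"}
```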

After preparing the data, you can use the sample shell script below to fine-tune `deepseek-ai/deepseek-coder-6.7b-instruct`. Remember to specify `DATA_PATH` and `OUTPUT_PATH`, and choose appropriate hyper-parameters (e.g., `learning_rate`, `per_device_train_batch_size`) for your scenario.

```bash
DATA_PATH="<your_data_path>"
OUTPUT_PATH="<your_output_path>"
MODEL_PATH="deepseek-ai/deepseek-coder-6.7b-instruct"

deepspeed finetune_deepseekcoder.py \
    --model_name_or_path $MODEL_PATH \
    --data_path $DATA_PATH \
    --output_dir $OUTPUT_PATH \
    --num_train_epochs 3 \
    --model_max_length 1024 \
    --per_device_train_batch_size 16 \
    --per_device_eval_batch_size 1 \
    --gradient_accumulation_steps 4 \
    --evaluation_strategy "no" \
    --save_strategy "steps" \
    --save_steps 100 \
    --save_total_limit 100 \
    --learning_rate 2e-5 \
    --warmup_steps 10 \
    --logging_steps 1 \
    --lr_scheduler_type "cosine" \
    --gradient_checkpointing True \
    --report_to "tensorboard" \
    --deepspeed configs/ds_config_zero3.json \
    --bf16 True
```
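
The script points DeepSpeed at `configs/ds_config_zero3.json`, which ships with the repository. For reference, a minimal ZeRO Stage 3 configuration for the Hugging Face Trainer integration typically looks like the sketch below; values set to `"auto"` are filled in from the training arguments, and this is an illustration rather than necessarily the exact file in `configs/`:

```json
{
  "bf16": { "enabled": "auto" },
  "zero_optimization": {
    "stage": 3,
    "overlap_comm": true,
    "contiguous_gradients": true,
    "stage3_gather_16bit_weights_on_model_save": true
  },
  "gradient_accumulation_steps": "auto",
  "gradient_clipping": "auto",
  "train_batch_size": "auto",
  "train_micro_batch_size_per_gpu": "auto"
}
```

ZeRO Stage 3 partitions parameters, gradients, and optimizer states across GPUs, which substantially reduces per-GPU memory usage during full fine-tuning.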