Update README.md

Fuli Luo 2023-11-05 18:17:08 +08:00 committed by GitHub
parent 4f1e966274
commit 4e864b7b33

README.md

@@ -7,13 +7,13 @@
 ### 1. Introduction of DeepSeek Coder
-Deepseek Coder comprises a series of code language models trained on both 87% code and 13% natural language in English and Chinese, with each model pre-trained on 2T tokens. We provide various sizes of the code model, ranging from 1B to 33B versions. Each model is pre-trained on project-level code corpus by employing a window size of 16K and a extra fill-in-the-blank task, to support project-level code completion and infilling. For coding capabilities, Deepseek Coder achieves state-of-the-art performance among open-source code models on multiple programming languages and various benchmarks.
+Deepseek Coder consists of a series of code language models, each trained from scratch on a dataset of 2 trillion tokens, composed of 87% code and 13% natural language, encompassing both English and Chinese. We provide various sizes of the code model, ranging from 1B to 33B versions. Each model is pre-trained on project-level code corpus by employing a window size of 16K and an extra fill-in-the-blank task, to support project-level code completion and infilling. For coding capabilities, Deepseek Coder achieves state-of-the-art performance among open-source code models on multiple programming languages and various benchmarks.
 <p align="center">
 <img src="pictures/result.png" alt="result" width="70%">
 </p>
-- **Massive Training Data**: Trained on 2T tokens, including 87% code and 13% linguistic data in both English and Chinese languages.
+- **Massive Training Data**: Trained from scratch on 2T tokens, including 87% code and 13% linguistic data in both English and Chinese languages.
 - **Highly Flexible & Scalable**: Offered in model sizes of 1B, 5.7B, 6.7B and 33B, enabling users to choose the setup most suitable for their requirements.
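For context on the capability the revised paragraph describes (code completion, with infilling supported via the fill-in-the-blank pre-training task), here is a minimal sketch of loading one of these checkpoints for plain code completion with Hugging Face transformers. The model ID (`deepseek-ai/deepseek-coder-1.3b-base`), prompt, and generation settings are illustrative assumptions, not part of this commit:

```python
# Hedged sketch: load a DeepSeek Coder checkpoint and complete a code prompt.
# Model ID and generation settings below are assumptions for illustration.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "deepseek-ai/deepseek-coder-1.3b-base"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "# write a quick sort algorithm in Python\ndef quick_sort(arr):"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding, kept short; infilling additionally relies on the model's
# special fill-in-the-middle sentinel tokens (see the model card for details).
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```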