mirror of
https://github.com/deepseek-ai/FlashMLA
synced 2025-04-23 15:44:07 +00:00
reformat Community Support section
This commit is contained in: parent 77d9d8d21b, commit 1aef31d163
README.md: 21 lines changed
@@ -51,43 +51,34 @@ FlashMLA is inspired by [FlashAttention 2&3](https://github.com/dao-AILab/flash-attention)
## Community Support
### MetaX

For MetaX GPUs, visit the official website: [MetaX](https://www.metax-tech.com).

The corresponding FlashMLA version can be found at: [MetaX-MACA/FlashMLA](https://github.com/MetaX-MACA/FlashMLA).
### Moore Threads

For the Moore Threads GPU, visit the official website: [Moore Threads](https://www.mthreads.com/).

The corresponding FlashMLA version is available on GitHub: [MooreThreads/MT-flashMLA](https://github.com/MooreThreads/MT-flashMLA).
### Hygon DCU

For the Hygon DCU, visit the official website: [Hygon Developer](https://developer.sourcefind.cn/).

The corresponding FlashMLA version is available here: [OpenDAS/MLAttention](https://developer.sourcefind.cn/codes/OpenDAS/MLAttention).
### Intellifusion

For the Intellifusion NNP, visit the official website: [Intellifusion](https://www.intellif.com).

The corresponding FlashMLA version is available on Gitee: [Intellifusion/tyllm](https://gitee.com/Intellifusion_2025/tyllm/blob/master/python/tylang/flash_mla.py).
### Iluvatar Corex

For Iluvatar Corex GPUs, visit the official website: [Iluvatar Corex](https://www.iluvatar.com).

The corresponding FlashMLA version is available on GitHub: [Deep-Spark/FlashMLA](https://github.com/Deep-Spark/FlashMLA/tree/iluvatar_flashmla).
## Citation