add Community Support for [MetaX] and [Moore Threads]

This commit is contained in:
hpp 2025-02-26 11:26:42 +08:00
parent 4edea86f9e
commit 6492cabb28


@ -49,6 +49,17 @@ for i in range(num_layers):
FlashMLA is inspired by [FlashAttention 2&3](https://github.com/dao-AILab/flash-attention/) and [cutlass](https://github.com/nvidia/cutlass) projects.
## Community Support
### MetaX
For the MetaX GPU (https://www.metax-tech.com), the corresponding FlashMLA version is available at:
GitHub - [MetaX-MACA/FlashMLA](https://github.com/MetaX-MACA/FlashMLA)
### Moore Threads (WIP)
For the Moore Threads GPU (https://www.mthreads.com/), the corresponding FlashMLA version is available at:
GitHub - [MooreThreads/MT-DeepSeek](https://github.com/MooreThreads/MT-DeepSeek)
## Citation
```bibtex