Awesome-Triton-Resources

Resources

  1. Official Repo
  2. Official Tutorial
  3. Triton-Puzzles
  4. fla: Efficient implementations of state-of-the-art linear attention models in PyTorch and Triton
  5. Attorch: A subset of PyTorch's neural network modules, written in Python using OpenAI's Triton.
  6. Trident: A performance library for machine learning applications.
  7. triton-transformer: An implementation of a Transformer written entirely in Triton
  8. unsloth
  9. Kernl
  10. Liger-Kernel
  11. efficient_cross_entropy
  12. Xformers kernels
  13. Flash Attention kernels
  14. FlagGems: An operator library for large language models implemented in the Triton language.
  15. applied-ai: Applied AI experiments and examples for PyTorch
  16. FlagAttention: A collection of memory-efficient attention operators implemented in the Triton language.
  17. triton-activations: A collection of neural network activation function kernels for OpenAI's Triton language compiler
  18. FLASHNN
  19. GPTQ-triton
  20. Accelerating Triton Dequantization Kernels for GPTQ
    1. Code
  21. Block-sparse matrix multiplication kernels
    1. https://openreview.net/pdf?id=doa11nN5vG
  22. bitsandbytes
  23. flex-attention
    1. https://pytorch.org/blog/flexattention/
    2. https://github.com/pytorch-labs/attention-gym
  24. conv
  25. lightning_attention
  26. int mm
  27. low-bit matmul kernels
  28. linear cross entropy
  29. optimized_hf_llama_class_for_training
  30. kernel-hyperdrive
  31. scattermoe
  32. triton-dejavu
  33. Triformer
  34. σ-MoE layer
  35. fast linear attn
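
For readers new to Triton, the sketch below shows the basic shape of kernels found throughout the projects above: a `@triton.jit` kernel plus a thin PyTorch-facing wrapper. It follows the vector-add example from the official tutorial (item 2); the `BLOCK_SIZE` of 1024 is an arbitrary illustrative choice, not a recommendation from any of the listed libraries.

```python
import torch
import triton
import triton.language as tl


@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    # Each program instance processes one contiguous block of elements.
    pid = tl.program_id(axis=0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements  # guard the tail block against out-of-bounds access
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)


def add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    out = torch.empty_like(x)
    n = out.numel()
    # One program per BLOCK_SIZE chunk on a 1D launch grid.
    grid = lambda meta: (triton.cdiv(n, meta["BLOCK_SIZE"]),)
    add_kernel[grid](x, y, out, n, BLOCK_SIZE=1024)
    return out
```

The libraries above build on this same pattern with more elaborate tiling, fusion, and autotuning.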

Reference

This list is inspired by Awesome-Triton-Kernels.
