Starred repositories
Implementation of paper "Uni-3DAR: Unified 3D Generation and Understanding via Autoregression on Compressed Spatial Tokens"
Helpful tools and examples for working with flex-attention
ROCm / flash-attention
Forked from Dao-AILab/flash-attention. Fast and memory-efficient exact attention
A lightweight pure Python package for reading, writing and manipulating mmCIF files distributed by the wwPDB
A massively parallel, high-level programming language
Tile primitives for speedy kernels
DeepSeek-V2: A Strong, Economical, and Efficient Mixture-of-Experts Language Model
SciAssess is a comprehensive benchmark for evaluating Large Language Models' proficiency in scientific literature analysis across various fields, focusing on memorization, comprehension, and analysis.
An amazing UI for OpenAI's ChatGPT (Website + Windows + MacOS + Linux)
✨ Light and Fast AI Assistant. Support: Web | iOS | MacOS | Android | Linux | Windows
A package for tree-based statistical estimation and inference using optimal decision trees.
A high-throughput and memory-efficient inference and serving engine for LLMs
The repository provides code for running inference with the SegmentAnything Model (SAM), links for downloading the trained model checkpoints, and example notebooks that show how to use the model.
BiliBili Client Demo for Apple TV (tvOS)
Official Repository for the Uni-Mol Series Methods
An open-source platform for developing protein models beyond AlphaFold.
An efficient distributed PyTorch framework
Fast and memory-efficient exact attention
A deep potential generator for building deep-learning-based models of interatomic potential energy and force fields
A deep learning package for many-body potential energy representation and molecular dynamics
DMFF (Differentiable Molecular Force Field) is a Jax-based python package that provides a full differentiable implementation of molecular force field models.
Beyond the Imitation Game collaborative benchmark for measuring and extrapolating the capabilities of language models
LiBai (李白): A Toolbox for Large-Scale Distributed Parallel Training