Stars
ATOMICA: Learning Universal Representations of Intermolecular Interactions
verl: Volcano Engine Reinforcement Learning for LLMs
Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
Easily fine-tune, evaluate and deploy Qwen3, DeepSeek-R1, Llama 4 or any open source LLM / VLM!
Fully open reproduction of DeepSeek-R1
Minimal reproduction of DeepSeek R1-Zero
Fully open data curation for reasoning models
A high-throughput and memory-efficient inference and serving engine for LLMs
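A minimal offline-inference sketch using vLLM's Python API; the model name below is only an illustrative placeholder:

    # Offline batch inference with vLLM
    from vllm import LLM, SamplingParams

    prompts = ["Explain reinforcement learning in one sentence."]
    params = SamplingParams(temperature=0.8, max_tokens=64)

    llm = LLM(model="Qwen/Qwen2.5-0.5B-Instruct")  # placeholder; any HF-hosted causal LM works
    for out in llm.generate(prompts, params):
        print(out.outputs[0].text)
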
A privacy-first, open-source platform for knowledge management and collaboration. Download link: http://github.com/logseq/logseq/releases. roadmap: http://trello.com/b/8txSM12G/roadmap
Educational framework exploring ergonomic, lightweight multi-agent orchestration. Managed by OpenAI Solution team.
Task-Aware Agent-driven Prompt Optimization Framework
New repo collection for NVIDIA Cosmos: https://github.com/nvidia-cosmos
Recipes to scale inference-time compute of open models
HAT (Hard Attention to the Task) Modules for Continual Learning
👽 Out-of-Distribution Detection with PyTorch
PyTorch library to facilitate development and standardized evaluation of neural network pruning methods.
Benchmarking Generalized Out-of-Distribution Detection
Flexible and powerful data analysis / manipulation library for Python, providing labeled data structures similar to R data.frame objects, statistical functions, and much more
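A tiny sketch of the labeled data structures and statistics the tagline refers to (the numbers are illustrative):

    # DataFrame: labeled, column-oriented data similar to an R data.frame
    import pandas as pd

    df = pd.DataFrame({"model": ["ResNet-50", "ViT-B/16"], "top1": [76.1, 81.8]})
    print(df.describe())        # summary statistics for numeric columns
    print(df[df["top1"] > 80])  # label-based boolean filtering
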
The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights -- ResNet, ResNeXt, EfficientNet, NFNet, Vision Transformer (V…
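A minimal sketch of loading one of these pretrained backbones with timm:

    # Create a pretrained image encoder and run a dummy forward pass
    import timm
    import torch

    model = timm.create_model("resnet50", pretrained=True)  # any name from timm.list_models()
    model.eval()
    with torch.no_grad():
        logits = model(torch.randn(1, 3, 224, 224))
    print(logits.shape)  # torch.Size([1, 1000]) for ImageNet-1k classifiers
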
Type annotations and dynamic checking for a tensor's shape, dtype, names, etc.
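Assuming this entry is torchtyping (whose tagline this matches), a small sketch of shape-annotated tensors with runtime checking:

    # Annotate tensor shapes and have them checked at call time
    import torch
    from torchtyping import TensorType, patch_typeguard
    from typeguard import typechecked

    patch_typeguard()  # enable shape checking for @typechecked functions

    @typechecked
    def batch_dot(x: TensorType["batch", "dim"],
                  y: TensorType["batch", "dim"]) -> TensorType["batch"]:
        return (x * y).sum(dim=-1)

    print(batch_dot(torch.randn(4, 8), torch.randn(4, 8)).shape)  # torch.Size([4])
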
A lightweight yet functional project template for various deep learning projects.
A Python library for self-supervised learning on images.
Awesome Incremental Learning
Making large AI models cheaper, faster and more accessible
Framework for Analysis of Class-Incremental Learning with 12 state-of-the-art methods and 3 baselines.
Mathematical derivation and pure Python code implementation of machine learning algorithms.
A Collection of Variational Autoencoders (VAE) in PyTorch.
Ongoing research training transformer language models at scale, including: BERT & GPT-2
An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries