Showing 19 open source projects for "tpu"

  • 1
    TensorFlow

    TensorFlow is an open source library for machine learning

    Originally developed by Google for internal use, TensorFlow is an open source platform for machine learning. Available across all common operating systems (desktop, server, and mobile), TensorFlow provides stable APIs for Python and C, as well as APIs for a variety of other languages that are either third party or not guaranteed to be backwards compatible. The platform can be deployed across multiple CPUs, GPUs, and Google's proprietary accelerator, the tensor processing unit (TPU). TensorFlow...
    Downloads: 16 This Week
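    As a rough illustration (not taken from the project itself), a minimal Keras classifier trained on synthetic data might look like the sketch below; the layer sizes and data shapes are arbitrary, and the same code runs unchanged on CPU, GPU, or TPU backends.

        # Minimal sketch: a tiny Keras classifier on synthetic data.
        import numpy as np
        import tensorflow as tf

        model = tf.keras.Sequential([
            tf.keras.layers.Dense(64, activation="relu"),
            tf.keras.layers.Dense(3, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])

        x = np.random.rand(256, 20).astype("float32")   # synthetic features
        y = np.random.randint(0, 3, size=(256,))        # synthetic integer labels
        model.fit(x, y, epochs=2, batch_size=32)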
  • 2
    Brax

    Massively parallel rigidbody physics simulation

    Brax is a fast and fully differentiable physics engine for large-scale rigid body simulations, built on JAX. It is designed for research in reinforcement learning and robotics, enabling efficient simulations and gradient-based optimization.
    Downloads: 2 This Week
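    A rough sketch of stepping a Brax environment under JAX; the "ant" environment name and the reset/step signatures are assumptions that may vary between Brax versions.

        import jax
        import jax.numpy as jnp
        from brax import envs

        env = envs.create(env_name="ant")                   # assumed registry name
        state = jax.jit(env.reset)(jax.random.PRNGKey(0))   # initial simulation state
        action = jnp.zeros((env.action_size,))              # zero action, for illustration
        state = jax.jit(env.step)(state, action)            # one differentiable physics step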
  • 3
    OpenCLIP

    An open source implementation of CLIP

    ... trained on the same subset of YFCC. For ease of experimentation, we also provide code for training on the 3 million images in the Conceptual Captions dataset, where a ResNet-50x4 trained with our codebase reaches 22.2% top-1 ImageNet accuracy. This codebase is work in progress, and we invite all to contribute in making it more accessible and useful. In the future, we plan to add support for TPU training and release larger models. We hope this codebase facilitates and promotes further research.
    Downloads: 2 This Week
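    A short zero-shot classification sketch with open_clip; the model tag, pretrained-weights name, and "photo.jpg" path below are placeholders, not recommendations from the project.

        import torch
        from PIL import Image
        import open_clip

        model, _, preprocess = open_clip.create_model_and_transforms(
            "ViT-B-32", pretrained="laion2b_s34b_b79k")
        tokenizer = open_clip.get_tokenizer("ViT-B-32")

        image = preprocess(Image.open("photo.jpg")).unsqueeze(0)   # placeholder image path
        text = tokenizer(["a diagram", "a dog", "a cat"])

        with torch.no_grad():
            image_features = model.encode_image(image)
            text_features = model.encode_text(text)
            image_features /= image_features.norm(dim=-1, keepdim=True)
            text_features /= text_features.norm(dim=-1, keepdim=True)
            # Probability of each candidate caption matching the image.
            probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)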
  • 4
    RLax

    Library of JAX-based building blocks for reinforcement learning agents

    ... as value-based, policy-based, and model-based approaches. RLax is fully JIT-compilable with JAX, enabling high-performance execution across CPU, GPU, and TPU backends. The library implements tools for Bellman equations, return distributions, general value functions, and policy optimization in both continuous and discrete action spaces. It integrates seamlessly with DeepMind’s Haiku (for neural network definition) and Optax (for optimization), making it a key component in modular RL pipelines.
    Downloads: 0 This Week
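    A minimal sketch of one such building block, a Q-learning TD error for a single transition; the numbers are toy values, and rlax.q_learning / rlax.l2_loss are assumed to follow their documented signatures.

        import jax.numpy as jnp
        import rlax

        q_tm1 = jnp.array([1.0, 2.0, 3.0])   # action values at time t-1
        a_tm1 = jnp.asarray(1)               # action actually taken
        r_t = jnp.asarray(0.5)               # reward received
        discount_t = jnp.asarray(0.99)       # discount factor
        q_t = jnp.array([1.5, 0.5, 2.5])     # action values at time t

        td_error = rlax.q_learning(q_tm1, a_tm1, r_t, discount_t, q_t)
        loss = rlax.l2_loss(td_error)        # squared TD error, usable with jax.grad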
  • 5
    Neural Tangents

    Fast and Easy Infinite Neural Networks in Python

    Neural Tangents is a high-level neural network API for specifying complex, hierarchical models at both finite and infinite width, built in Python on top of JAX and XLA. It lets researchers define architectures from familiar building blocks—convolutions, pooling, residual connections, and nonlinearities—and obtain not only the finite network but also the corresponding Gaussian Process (GP) kernel of its infinite-width limit. With a single specification, you can compute NNGP and NTK kernels,...
    Downloads: 0 This Week
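    A small sketch in the library's stax style: define a network once, then evaluate its infinite-width NNGP and NTK kernels on toy inputs (the shapes here are arbitrary).

        import jax.numpy as jnp
        from neural_tangents import stax

        init_fn, apply_fn, kernel_fn = stax.serial(
            stax.Dense(512), stax.Relu(),
            stax.Dense(512), stax.Relu(),
            stax.Dense(1),
        )

        x1 = jnp.ones((3, 8))              # toy inputs
        x2 = jnp.ones((4, 8))
        nngp = kernel_fn(x1, x2, "nngp")   # infinite-width GP kernel
        ntk = kernel_fn(x1, x2, "ntk")     # neural tangent kernel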
  • 6
    Gemma

    Gemma open-weight LLM library, from Google DeepMind

    Gemma, developed by Google DeepMind, is a family of open-weights large language models (LLMs) built upon the research and technology behind Gemini. This repository provides the official implementation of the Gemma PyPI package, a JAX-based library that enables users to load, interact with, and fine-tune Gemma models. The framework supports both text and multi-modal input, allowing natural language conversations that incorporate visual content such as images. It includes APIs for...
    Downloads: 0 This Week
  • 7
    Haiku Sonnet for JAX

    JAX-based neural network library

    Haiku is a library built on top of JAX designed to provide simple, composable abstractions for machine learning research. JAX is a numerical computing library that combines NumPy, automatic differentiation, and first-class GPU/TPU support. Haiku is a simple neural network library for JAX that enables users to use familiar object-oriented programming models while allowing full access to JAX's pure function transformations. Haiku provides two core tools: a module abstraction, hk.Module...
    Downloads: 0 This Week
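    A minimal sketch of the hk.transform pattern described above; the layer sizes and input shapes are arbitrary.

        import haiku as hk
        import jax
        import jax.numpy as jnp

        def forward(x):
            # hk.nets.MLP is built from hk.Module-based layers.
            return hk.nets.MLP([64, 10])(x)

        net = hk.without_apply_rng(hk.transform(forward))   # pure init/apply functions
        params = net.init(jax.random.PRNGKey(42), jnp.ones([1, 784]))
        logits = net.apply(params, jnp.ones([8, 784]))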
  • 8
    TensorFlow Probability

    Probabilistic reasoning and statistical analysis in TensorFlow

    TensorFlow Probability is a library for probabilistic reasoning and statistical analysis. TensorFlow Probability (TFP) is a Python library built on TensorFlow that makes it easy to combine probabilistic models and deep learning on modern hardware (TPU, GPU). It's for data scientists, statisticians, ML researchers, and practitioners who want to encode domain knowledge to understand data and make predictions. Since TFP inherits the benefits of TensorFlow, you can build, fit, and deploy a model...
    Downloads: 0 This Week
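    A small sketch of the distributions API: build a distribution, sample from it, and score data under it (all values are illustrative).

        import tensorflow_probability as tfp

        tfd = tfp.distributions

        dist = tfd.Normal(loc=0.0, scale=1.0)
        samples = dist.sample(5, seed=42)    # draw 5 samples
        log_probs = dist.log_prob(samples)   # log-density of each sample

        # Distributions compose, e.g. a two-component Gaussian mixture.
        mixture = tfd.MixtureSameFamily(
            mixture_distribution=tfd.Categorical(probs=[0.3, 0.7]),
            components_distribution=tfd.Normal(loc=[-1.0, 1.0], scale=[0.5, 0.5]),
        )
        print(mixture.log_prob(0.25))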
  • 9
    TensorFlow Model Garden

    Models and examples built with TensorFlow

    ... are suitable. A flexible and lightweight library that users can easily use or fork when writing customized training loop code in TensorFlow 2.x. It seamlessly integrates with tf.distribute and supports running on different device types (CPU, GPU, and TPU).
    Downloads: 0 This Week
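    This is not the Model Garden API itself, just a sketch of the tf.distribute pattern the library plugs into: build and compile the model inside a strategy scope, then train as usual (swap MirroredStrategy for TPUStrategy on a TPU).

        import numpy as np
        import tensorflow as tf

        strategy = tf.distribute.MirroredStrategy()   # one or more local devices
        with strategy.scope():
            model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
            model.compile(optimizer="sgd", loss="mse")

        x = np.random.rand(128, 4).astype("float32")  # synthetic data
        y = np.random.rand(128, 1).astype("float32")
        model.fit(x, y, epochs=1, batch_size=32)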
  • 10
    TensorFlow Documentation

    TensorFlow documentation

    An end-to-end platform for machine learning. TensorFlow makes it easy to create ML models that can run in any environment. Learn how to use the intuitive APIs through interactive code samples.
    Downloads: 1 This Week
  • 11
    GPT-NeoX

    Implementation of model parallel autoregressive transformers on GPUs

    This repository records EleutherAI's library for training large-scale language models on GPUs. Our current framework is based on NVIDIA's Megatron Language Model and has been augmented with techniques from DeepSpeed as well as some novel optimizations. We aim to make this repo a centralized and accessible place to gather techniques for training large-scale autoregressive language models, and accelerate research into large-scale training. For those looking for a TPU-centric codebase, we...
    Downloads: 0 This Week
  • 12
    Tensorflow Transformers

    State of the art faster Transformer with Tensorflow 2.0

    ... speech recognition and audio classification. Faster auto-regressive decoding, TFLite support, and simple TFRecord creation. Auto-batching of tf.data.Dataset or tf.ragged tensors. Everything is a dictionary (inputs and outputs). Multiple mask modes such as causal, user-defined, and prefix. tensorflow-text tokenizer support. Supports GPU, TPU, and multi-GPU training with wandb integration, multiple callbacks, and automatic TensorBoard logging.
    Downloads: 0 This Week
  • 13
    bert4keras

    Keras implement of transformers for humans

    ... by the language model and seq2seq. Pre-training code from scratch (supports TPU and multi-GPU; see the pre-training docs). Compatible with Keras and tf.keras.
    Downloads: 0 This Week
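    A rough sketch assuming the build_transformer_model / Tokenizer entry points; the config, checkpoint, and vocab paths are placeholders for a downloaded BERT checkpoint.

        import numpy as np
        from bert4keras.models import build_transformer_model
        from bert4keras.tokenizers import Tokenizer

        config_path = "bert_config.json"      # placeholder paths
        checkpoint_path = "bert_model.ckpt"
        vocab_path = "vocab.txt"

        tokenizer = Tokenizer(vocab_path, do_lower_case=True)
        model = build_transformer_model(config_path, checkpoint_path)

        token_ids, segment_ids = tokenizer.encode("language models for humans")
        features = model.predict([np.array([token_ids]), np.array([segment_ids])])
        print(features.shape)   # (1, sequence_length, hidden_size)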
  • 14
    GPT Neo

    An implementation of model parallel GPT-2 and GPT-3-style models

    An implementation of model- and data-parallel GPT-3-like models using the mesh-tensorflow library. If you're just here to play with our pre-trained models, we strongly recommend you try out the Hugging Face Transformers integration. Training and inference are officially supported on TPU and should work on GPU as well. This repository will be (mostly) archived as we move focus to our GPU-specific repo, GPT-NeoX. NB: while GPT Neo can technically run a training step at 200B+ parameters, it is very...
    Downloads: 6 This Week
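    For example, sampling from a pre-trained checkpoint through the Hugging Face Transformers pipeline; the model name and generation settings below are illustrative.

        from transformers import pipeline

        generator = pipeline("text-generation", model="EleutherAI/gpt-neo-1.3B")
        out = generator("EleutherAI has", max_length=50, do_sample=True, temperature=0.9)
        print(out[0]["generated_text"])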
  • 15
    TensorNetwork

    A library for easy and efficient manipulation of tensor networks

    TensorNetwork is a high-level library for building and contracting tensor networks—graphical factorizations of large tensors that underpin many algorithms in physics and machine learning. It abstracts networks as nodes and edges, then compiles efficient contraction orders across multiple numeric backends so users can focus on model structure rather than index bookkeeping. Common network families (MPS/TT, PEPS, MERA, tree networks) are expressed with concise APIs that encourage...
    Downloads: 0 This Week
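    A small sketch of the node/edge API: connect two tensors along a shared index and contract the resulting edge (a trivial inner product rather than a physics example).

        import numpy as np
        import tensornetwork as tn

        a = tn.Node(np.ones((10,)))
        b = tn.Node(np.ones((10,)))
        edge = a[0] ^ b[0]          # connect the two dangling indices
        result = tn.contract(edge)  # contract over the shared index
        print(result.tensor)        # inner product: 10.0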
  • 16
    gpt-j-api

    API for the GPT-J language model, including a FastAPI backend

    An API to interact with the GPT-J language model and variants! You can use and test the model in two different ways. The public API endpoints require no authentication. To run the backend yourself, just SSH into a TPU VM; this code was tested on both the v2-8 and v3-8 variants.
    Downloads: 2 This Week
  • 17
    Tez

    Tez is a super-simple and lightweight Trainer for PyTorch

    Tez is a super-simple and lightweight Trainer for PyTorch. It also comes with many utils that you can use to tackle over 90% of deep learning projects in PyTorch. tez (तेज़ / تیز) means sharp, fast & active. This is a simple, to-the-point library to make your PyTorch training easy. This library is currently at an early stage, so there might be breaking changes. Currently, tez supports CPU, single-GPU, multi-GPU, and TPU training. More coming soon! Using tez is super-easy. We don't want you...
    Downloads: 0 This Week
  • 18
    hebrew-gpt_neo

    Hebrew text generation models based on EleutherAI's gpt-neo

    Hebrew text generation models based on EleutherAI's gpt-neo. Each was trained on a TPUv3-8 which was made available to me via the TPU Research Cloud Program. The Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.
    Downloads: 0 This Week
  • 19
    automl-gs

    Provide an input CSV and a target field to predict, generate a model

    Give automl-gs an input CSV file and a target field to predict, and get a trained high-performing machine learning or deep learning model plus native Python code pipelines that let you integrate the model into any prediction workflow. No black box: you can see exactly how the data is processed and how the model is constructed, and you can make tweaks as necessary. automl-gs is an AutoML tool which, unlike Microsoft's NNI, Uber's Ludwig, and TPOT, offers a zero code/model...
    Downloads: 0 This Week
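    A sketch assuming the automl_grid_search Python entry point; "titanic.csv" and "Survived" are placeholder dataset and target-field names.

        from automl_gs import automl_grid_search

        # Searches model/pipeline configurations for the given CSV and target field,
        # then writes out the best model plus standalone Python pipeline code.
        automl_grid_search("titanic.csv", "Survived")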