Search Results for "automatic differentiation"

Showing 68 open source projects for "automatic differentiation"

  • 1
    ImplicitDifferentiation.jl

    Automatic differentiation of implicit functions

    ImplicitDifferentiation.jl is a package for automatic differentiation of functions defined implicitly, i.e., by a forward mapping whose output is characterized by a set of conditions. It targets two kinds of functions: those for which automatic differentiation fails (reasons vary depending on your backend, but the most common include calls to external solvers, mutating operations, or type restrictions), and those for which automatic differentiation is very slow (a common example is iterative procedures like fixed-point equations or optimization algorithms). A minimal usage sketch follows this entry.
    Downloads: 2 This Week
    See Project
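    The package's documented pattern pairs a forward mapping with the conditions that characterize its output; a minimal sketch follows (the sqrt example is illustrative, and constructor signatures have varied across versions):

        using ImplicitDifferentiation, ForwardDiff

        # forward mapping: pretend sqrt.(x) comes from an opaque external solver
        forward(x) = sqrt.(x)
        # optimality conditions characterizing the output: conditions(x, y) == 0
        conditions(x, y) = y .^ 2 .- x

        implicit = ImplicitFunction(forward, conditions)
        ForwardDiff.jacobian(implicit, [4.0])   # ≈ [0.125] = [1 / (2 * sqrt(4.0))]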
  • 2
    ForwardDiff.jl

    Forward Mode Automatic Differentiation for Julia

    ForwardDiff implements methods to take derivatives, gradients, Jacobians, Hessians, and higher-order derivatives of native Julia functions (or any callable object, really) using forward-mode automatic differentiation (AD). While performance can vary depending on the functions you evaluate, the algorithms implemented by ForwardDiff generally outperform non-AD algorithms (such as finite differencing) in both speed and accuracy. Functions that map a vector to a scalar are the best case for reverse-mode automatic differentiation, but ForwardDiff may still be a good choice if the input vector is not too large, as it is much simpler. ... A brief usage sketch follows this entry.
    Downloads: 3 This Week
    See Project
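    A minimal sketch of the core API (the function f and input x are illustrative):

        using ForwardDiff

        f(x) = sum(abs2, x) / 2            # scalar-valued function of a vector
        x = [1.0, 2.0, 3.0]

        ForwardDiff.gradient(f, x)         # == x for this particular f
        ForwardDiff.hessian(f, x)          # == 3×3 identity matrix
        ForwardDiff.derivative(sin, 1.0)   # scalar case, == cos(1.0)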
  • 3
    ReverseDiff

    Reverse Mode Automatic Differentiation for Julia

    ReverseDiff is a fast, compilable, tape-based reverse-mode automatic differentiation (AD) package that implements methods to take gradients, Jacobians, Hessians, and higher-order derivatives of native Julia functions (or any callable object, really). While performance can vary depending on the functions you evaluate, the algorithms implemented by ReverseDiff generally outperform non-AD algorithms in both speed and accuracy. A brief usage sketch follows this entry.
    Downloads: 2 This Week
    See Project
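    A minimal sketch of the one-shot and pre-recorded-tape APIs (names are illustrative):

        using ReverseDiff

        f(x) = sum(abs2, x) / 2
        x = rand(5)

        ReverseDiff.gradient(f, x)    # one-shot gradient

        # record the operations once, compile the tape, and reuse it
        tape = ReverseDiff.compile(ReverseDiff.GradientTape(f, rand(5)))
        out  = similar(x)
        ReverseDiff.gradient!(out, tape, x)   # fast repeated gradients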
  • 4
    Enzyme.jl

    Julia bindings for the Enzyme automatic differentiator

    This package contains the Julia bindings for Enzyme. It is very much a work in progress, and bug reports and discussion are greatly appreciated. Enzyme is a plugin that performs automatic differentiation (AD) of statically analyzable LLVM IR. It is highly efficient, and its ability to perform AD on optimized code allows Enzyme to meet or exceed the performance of state-of-the-art AD tools. A hedged usage sketch follows this entry.
    Downloads: 2 This Week
    See Project
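    A hedged sketch of the reverse-mode entry point (signatures follow the README and may differ across Enzyme versions):

        using Enzyme

        f(x) = x^2 + sin(x)

        # Active marks a differentiated argument; the bare Active annotates
        # the return activity. Returns ((df/dx,),) for this call.
        autodiff(Reverse, f, Active, Active(1.0))   # ((2 + cos(1.0),),) ≈ ((2.54,),)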
  • 5
    VoronoiFVM.jl

    Solution of nonlinear multiphysics partial differential equations

    Solver for coupled nonlinear partial differential equations (elliptic-parabolic conservation laws) based on the Voronoi finite volume method. It uses automatic differentiation via ForwardDiff.jl and DiffResults.jl to evaluate user functions along with their Jacobians, and to calculate derivatives of solutions with respect to their parameters.
    Downloads: 2 This Week
    See Project
  • 6
    StructuralEquationModels.jl

    A fast and flexible Structural Equation Modelling Framework

    ...We provide fast objective functions, gradients, and in some cases Hessians, as well as approximations thereof. As a user, you can easily define custom loss functions; for those, you can choose to provide analytical gradients or to use finite-difference approximation / automatic differentiation. You can mix loss functions natively found in this package with those you provide; in such cases, you optimize over a sum of different objectives (e.g. ML + ridge). This strategy also applies to gradients, where you may supply analytic gradients, opt for automatic differentiation, or mix analytical and automatic differentiation. ...
    Downloads: 3 This Week
    See Project
  • 7
    Transformers.jl

    Julia Implementation of Transformer models

    ...Inspired by architectures like BERT, GPT, and T5, the library offers a modular and flexible interface for building, training, and using transformer-based deep learning models. It supports training from scratch and fine-tuning pretrained models, and integrates with Flux.jl for automatic differentiation and optimization.
    Downloads: 2 This Week
    See Project
  • 8
    Taichi

    Productive, portable, and performant GPU programming in Python

    ...It uses JIT compilation (via LLVM and its runtime TiRT) to offload compute-heavy code to CPUs, GPUs, mobile devices, and embedded systems. With built-in support for sparse data structures (SNode), automatic differentiation, AOT deployment, and compatibility with CUDA, Vulkan, Metal, and OpenGL ES, it empowers disciplines like simulation, graphics, AI, and robotics.
    Downloads: 4 This Week
    See Project
  • 9
    Yao

    Extensible, Efficient Quantum Algorithm Design for Humans

    An intermediate representation to construct and manipulate quantum circuits, letting you build your own abstractions on top of them in native Julia. Yao supports both forward-mode (faithful gradient) and reverse-mode automatic differentiation with a built-in engine optimized specifically for quantum circuits, and delivers top performance for quantum circuit simulations. Its CUDA backend and batched quantum-register support can make typical quantum circuits even faster. Yao is designed to be extensible: its hierarchical architecture allows you to extend the framework to support and share your own algorithms and hardware. A brief usage sketch follows this entry.
    Downloads: 2 This Week
    See Project
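    A brief sketch of building and sampling a circuit (a Bell-state example; names follow the Yao documentation):

        using Yao

        n = 2
        circuit = chain(n, put(1 => H), control(1, 2 => X))  # Hadamard, then CNOT

        reg = zero_state(n) |> circuit
        measure(reg; nshots = 5)   # samples concentrate on |00⟩ and |11⟩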
  • 10
    ChainRules.jl

    Forward and reverse mode automatic differentiation primitives

    The ChainRules package provides a variety of common utilities that can be used by downstream automatic differentiation (AD) tools to define and execute forward-, reverse-, and mixed-mode primitives. The core logic of ChainRules is implemented in ChainRulesCore.jl. To add ChainRules support to your package by defining new rrules or frules, you only need to depend on the very lightweight ChainRulesCore.jl (a rule-definition sketch appears under the ChainRulesCore entry below). This repository contains ChainRules.jl, which is what people actually use directly. ...
    Downloads: 4 This Week
    See Project
  • 11
    Hasktorch

    Tensors and neural networks in Haskell

    Hasktorch is a powerful Haskell library for tensor computation and neural network modeling, built on top of libtorch (the backend of PyTorch). It brings differentiable programming, automatic differentiation, and efficient tensor operations into Haskell's strongly typed functional paradigm. The project is in active development, so expect changes to the library API as it evolves. New users are invited to join the Hasktorch Discord space for questions and discussions; contributions/PRs are encouraged. ...
    Downloads: 0 This Week
    See Project
  • 12
    ChainRulesCore

    AD-backend agnostic system defining custom forward and reverse rules

    ...The ChainRulesCore package provides a lightweight dependency for defining sensitivities for functions in your packages, without you needing to depend on ChainRules itself. This allows your package to be used with ChainRules.jl, which aims to provide a variety of common utilities that can be used by downstream automatic differentiation (AD) tools to define and execute forward-, reverse-, and mixed-mode primitives. A sketch of a custom rule follows this entry.
    Downloads: 0 This Week
    See Project
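    A minimal sketch of defining a custom reverse rule (mysquare is a hypothetical function standing in for one in your package):

        using ChainRulesCore

        mysquare(x) = x^2

        # return the primal value together with a pullback closure
        function ChainRulesCore.rrule(::typeof(mysquare), x)
            y = mysquare(x)
            mysquare_pullback(ȳ) = (NoTangent(), 2x * ȳ)
            return y, mysquare_pullback
        end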
  • 13
    DynamicHMC

    Implementation of robust dynamic Hamiltonian Monte Carlo methods

    ...In contrast to frameworks that utilize a directed acyclic graph to build a posterior for a Bayesian model from small components, this package requires that you code a log-density function of the posterior in Julia. Derivatives can be provided manually or via automatic differentiation. Consequently, the package assumes the user is comfortable with the basics of Bayesian inference, to the extent of coding a (log) posterior density in Julia. This approach allows the use of standard tools like profiling and benchmarking to optimize performance. A hedged usage sketch follows this entry.
    Downloads: 0 This Week
    See Project
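    A hedged sketch of the workflow: code a log density, wrap it with an AD backend, and sample (interface and field names follow recent DynamicHMC/LogDensityProblems versions and may vary):

        using DynamicHMC, LogDensityProblems, LogDensityProblemsAD, Random
        import ForwardDiff

        # hypothetical target: a standard normal log-density coded by hand
        struct StdNormal end
        LogDensityProblems.logdensity(::StdNormal, x) = -sum(abs2, x) / 2
        LogDensityProblems.dimension(::StdNormal) = 1
        LogDensityProblems.capabilities(::Type{StdNormal}) =
            LogDensityProblems.LogDensityOrder{0}()

        # derivatives via automatic differentiation, then sample
        ℓ = ADgradient(:ForwardDiff, StdNormal())
        results = mcmc_with_warmup(Random.default_rng(), ℓ, 1000)
        results.posterior_matrix   # 1×1000 matrix of posterior draws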
  • 14
    Shumai

    Fast Differentiable Tensor Library in JavaScript & TypeScript with Bun

    ...It provides a high-performance framework for numerical computing and machine learning within modern JavaScript runtimes. Built on Bun and Flashlight, with ArrayFire as its numerical backend, Shumai brings GPU-accelerated tensor operations, automatic differentiation, and scientific computing tools directly to JavaScript developers. It allows seamless integration of machine learning, deep learning, and custom differentiable programs into web-based or server-side environments without relying on Python frameworks. The library supports matrix operations, gradient computation, and tensor conversions with intuitive APIs and near-native speed, thanks to Bun’s low-overhead FFI bindings. ...
    Downloads: 3 This Week
    See Project
  • 15
    CasADi

    CasADi is a symbolic framework for numeric optimization

    CasADi is a symbolic framework for numeric optimization implementing automatic differentiation in forward and reverse modes on sparse matrix-valued computational graphs. It supports self-contained C-code generation and interfaces with state-of-the-art codes such as SUNDIALS and IPOPT. It can be used from C++, Python, or MATLAB/Octave. CasADi's backbone is a symbolic framework implementing forward and reverse modes of AD on expression graphs to construct gradients, large and sparse Jacobians, and Hessians. ...
    Downloads: 46 This Week
    See Project
  • 16
    NNlib.jl

    Neural Network primitives with multiple backends

    This package provides a library of functions useful for neural networks, such as softmax, sigmoid, batched multiplication, convolution, and pooling. Many of these are used by Flux.jl, which loads this package, but they may be used independently. A brief usage sketch follows this entry.
    Downloads: 2 This Week
    See Project
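    A brief sketch of a few of these primitives:

        using NNlib

        x = randn(3, 4)           # 3 classes, batch of 4
        softmax(x; dims = 1)      # each column now sums to 1
        logsoftmax(x; dims = 1)   # numerically stable log ∘ softmax
        sigmoid.(x)               # elementwise logistic function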
  • 17
    PyTorch

    Open source machine learning framework

    PyTorch is a Python package that offers tensor computation (like NumPy) with strong GPU acceleration and deep neural networks built on a tape-based autograd system. The project allows for fast, flexible experimentation and efficient production. PyTorch consists of torch (a tensor library), torch.autograd (a tape-based automatic differentiation library), torch.jit (a compilation stack, TorchScript), torch.nn (a neural networks library), torch.multiprocessing (Python multiprocessing), and torch.utils (DataLoader and other utility functions). PyTorch can be used as a replacement for NumPy, or as a deep learning research platform that provides optimum flexibility and speed.
    Downloads: 74 This Week
    See Project
  • 18
    JAX

    Composable transformations of Python+NumPy programs

    ...But JAX also lets you just-in-time compile your own Python functions into XLA-optimized kernels using a one-function API, jit. Compilation and automatic differentiation can be composed arbitrarily, so you can express sophisticated algorithms and get maximal performance without leaving Python.
    Downloads: 1 This Week
    See Project
  • 19
    PETSc.jl

    Julia wrappers for the PETSc library

    This package provides a low-level interface to PETSc and allows combining Julia features (such as automatic differentiation) with the PETSc infrastructure and nonlinear solvers.
    Downloads: 0 This Week
    See Project
  • 20
    DiffEqFlux.jl

    Pre-built implicit layer architectures with O(1) backprop, GPUs

    DiffEqFlux.jl is a Julia library that combines differential equations with neural networks, enabling the creation of neural differential equations (neural ODEs), universal differential equations, and physics-informed learning models. It serves as a bridge between the DifferentialEquations.jl and Flux.jl libraries, allowing for end-to-end differentiable simulations and model training in scientific machine learning. DiffEqFlux.jl is widely used for modeling dynamical systems with learnable...
    Downloads: 5 This Week
    See Project
  • 21
    Zygote

    21st century AD

    Zygote provides source-to-source automatic differentiation (AD) in Julia and is the next-generation AD system for the Flux differentiable programming framework. For more details and benchmarks of Zygote's technique, see our paper. You may want to check out Flux for more interesting examples of Zygote usage; the documentation here focuses on internals and advanced AD usage. A brief usage sketch follows this entry.
    Downloads: 4 This Week
    See Project
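    A brief usage sketch:

        using Zygote

        f(x) = 3x^2 + 2x + 1
        gradient(f, 5.0)    # (32.0,)

        # multiple arguments: one gradient per argument
        gradient((x, y) -> x * y + sin(x), 2.0, 3.0)   # (3.0 + cos(2.0), 2.0)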
  • 22
    Tequila

    A High-Level Abstraction Framework for Quantum Algorithms

    Tequila is an abstraction framework for (variational) quantum algorithms. It operates on abstract data structures allowing the formulation, combination, automatic differentiation and optimization of generalized objectives. Tequila can execute the underlying quantum expectation values on state-of-the-art simulators as well as on real quantum devices.
    Downloads: 0 This Week
    See Project
  • 23
    NonlinearSolve.jl

    High-performance and differentiation-enabled nonlinear solvers

    ...The package includes its own high-performance nonlinear solvers, which include the ability to swap in fast direct and iterative linear solvers, along with the ability to use sparse automatic differentiation for Jacobian construction and Jacobian-vector products. NonlinearSolve.jl interfaces with other packages of the Julia ecosystem to make it easy to test alternative solver packages and to pass small types that control algorithm swapping. It also interfaces with the ModelingToolkit.jl world of symbolic modeling to allow automatic generation of high-performance code. A minimal sketch follows this entry.
    Downloads: 0 This Week
    See Project
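    A minimal sketch of the problem-solver interface (the residual f is illustrative):

        using NonlinearSolve

        # solve f(u, p) = 0 for u; here u^2 - p = 0 with p = 2
        f(u, p) = u .* u .- p
        prob = NonlinearProblem(f, [1.0], 2.0)
        sol = solve(prob, NewtonRaphson())   # Jacobian built via automatic differentiation
        sol.u                                # ≈ [1.4142...], i.e. sqrt(2)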
  • 24
    NeuralPDE.jl

    Physics-Informed Neural Networks (PINN) Solvers

    NeuralPDE.jl is a Julia library for solving partial differential equations (PDEs) using physics-informed neural networks and scientific machine learning. Built on top of the SciML ecosystem, it provides a flexible and composable interface for defining PDEs and training neural networks to approximate their solutions. NeuralPDE.jl enables hybrid modeling, data-driven discovery, and fast PDE solvers in high dimensions, making it suitable for scientific research and engineering applications.
    Downloads: 1 This Week
    See Project
  • 25
    CoordinateTransformations.jl

    A fresh approach to coordinate transformations

    ...Transformations can be easily applied, inverted, composed, and differentiated (both with respect to the input coordinates and with respect to transformation parameters such as rotation angle). Transformations are designed to be lightweight and efficient enough for, e.g., real-time graphical applications, while support for both explicit and automatic differentiation makes optimization easy, making the package ideal for computer vision applications such as SLAM (simultaneous localization and mapping). A brief usage sketch follows this entry.
    Downloads: 1 This Week
    See Project
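    A brief sketch of composing, applying, and inverting transformations:

        using CoordinateTransformations, StaticArrays

        t = Translation(1.0, 2.0)
        s = LinearMap(@SMatrix [2.0 0.0; 0.0 2.0])  # uniform scaling as a linear map
        composed = t ∘ s                            # apply s first, then t

        p = SVector(1.0, 1.0)
        composed(p)                  # SVector(3.0, 4.0)
        inv(composed)(composed(p))   # recovers p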
  • Page 1 of 3