neural-network.ts

Lightweight TypeScript neural network library with back-propagation and genetic algorithms, supporting a wide range of activation functions, all without extra build or native dependencies.

Features

  • Dual Mode Training

    • Train with back-propagation (Supervised Learning)
    • Evolve neural network with genetic algorithm (Reinforcement Learning)
  • Neural Network Export

    • Compile into standalone JavaScript functions
    • Import/export in JSON format
  • Activation Functions

    • Supports a wide range of activation functions: sigmoid, centered sigmoid, tanh, normalized tanh, linear, ReLU, ELU
    • Automatic calculation of derivatives for custom activation functions
  • Lightweight and Easy to Install

    • No need to install or build node-gyp, CMake, CUDA, TensorFlow, Python, etc.
    • Only depends on ga-island, which is 1.4 KB gzipped and has no additional dependencies
  • TypeScript / JavaScript Compatibility

    • Optional TypeScript support
    • Supports usage from JavaScript without further build/bundle steps
  • Isomorphic Package

    • Works in both Node.js and browser environments

Installation

Install with CDN:

<script src="https://cdn.jsdelivr.net/npm/neural-network.ts@1/dist/browser.js"></script>

All library functions are available in the global neural_network object.
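
For example, a minimal inline script using that global (a sketch; the layer shape mirrors the usage examples below):

<script>
  // all exports hang off the global neural_network object
  const { random_network, forward, fn } = neural_network
  const network = random_network({
    layers: [
      { size: 2, activation: fn.linear },
      { size: 2, activation: fn.tanh },
      { size: 1, activation: fn.sigmoid },
    ],
  })
  console.log(forward(network, [0, 1])) // one output in (0, 1), untrained
</script>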

Install with package manager:

npm install neural-network.ts

You can also install neural-network.ts with pnpm, yarn, or slnpm.

Usage Example

For more examples, see nn.test.ts, ga.test.ts, and color.test.ts.

Back-propagation with Training Data

import {
  random_network,
  learn,
  forward,
  linear,
  tanh,
  sigmoid,
} from 'neural-network.ts'

let network = random_network({
  layers: [
    { size: 2, activation: linear },
    { size: 2, activation: tanh },
    { size: 1, activation: sigmoid },
  ],
})
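
// XOR truth table: inputs with the expected outputs (targets)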
let inputs = [
  [0, 0],
  [1, 0],
  [0, 1],
  [1, 1],
]
let targets = [[0], [1], [1], [0]]
let sample_size = inputs.length

let epochs = 2000
let learning_rate = 0.1

for (let epoch = 1; epoch <= epochs; epoch++) {
  let mse = 0
  for (let i = 0; i < sample_size; i++) {
    mse += learn(network, inputs[i], targets[i], learning_rate)
  }
  mse /= sample_size
  console.log(epoch, mse)
}
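
After the loop, the trained network can be checked with forward (imported above); the outputs should approach the XOR targets:

for (let i = 0; i < sample_size; i++) {
  // each output should move toward its target (0 or 1) as training converges
  console.log(inputs[i], '->', forward(network, inputs[i])[0].toFixed(3))
}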

Genetic Algorithm without Explicit Training Data

import { create_ga, best, forward, linear, sigmoid } from 'neural-network.ts'

let ga = create_ga({
  spec: {
    layers: [
      { size: 2, activation: linear },
      { size: 2, activation: sigmoid },
      { size: 1, activation: sigmoid },
    ],
  },
  fitness: network => {
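    // negative mean squared error over the XOR truth table (closer to 0 is fitter)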
    let diff = 0
    diff += (forward(network, [0, 0])[0] - 0) ** 2
    diff += (forward(network, [1, 0])[0] - 1) ** 2
    diff += (forward(network, [0, 1])[0] - 1) ** 2
    diff += (forward(network, [1, 1])[0] - 0) ** 2
    return -diff / 4
  },
  population_size: 1000,
  mutation_amount: 0.2,
})

let epochs = 500
for (let epoch = 1; epoch <= epochs; epoch++) {
  ga.evolve()
  let { fitness } = best(ga.options)
  let mse = -fitness
  console.log(epoch, mse)
}
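
The fittest network itself can then be extracted from the population. A short sketch, assuming best returns a { gene, fitness } pair whose gene is the Network:

let { gene: network } = best(ga.options)
// outputs should approach the XOR targets as fitness nears 0
console.log(forward(network, [1, 0])[0].toFixed(3))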

TypeScript Signatures

Neural Network Definition API

/**
 * @description
 * - Must have at least 2 layers (input layer and output layer).
 * - Input layer must use linear activation.
 */
export function random_network(options: NetworkSpec): Network

/**
 * @description shortcut for a network that uses the same activation for all layers
 * (except the input layer, which must use linear activation)
 */
export function to_network_spec(options: {
  sizes: number[]
  activation: Activation
}): NetworkSpec

export type Network = {
  /** @description layer -> output -> input -> weight */
  weights: number[][][]
  /** @description layer -> output -> bias */
  biases: number[][]
  /**
   * @description layer -> activation
   * @example sigmoid
   */
  activations: Activation[]
}

export type Activation = (x: number) => number

export type NetworkSpec = {
  /** [input_layer, ...hidden_layer, output_layer] */
  layers: LayerSpec[]
}

export type LayerSpec = {
  size: number
  activation: Activation
}
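
For uniform architectures, to_network_spec avoids spelling out every layer. A small sketch using the exported sigmoid activation:

let spec = to_network_spec({ sizes: [2, 2, 1], activation: sigmoid })
let network = random_network(spec)
// expands to: linear(2) -> sigmoid(2) -> sigmoid(1)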

Neural Inference and Training API

export function forward(network: Network, inputs: number[]): number[]

export function learn(
  network: Network,
  inputs: number[],
  targets: number[],
  /** @example 0.2 or 0.01 */
  learning_rate: number,
): number

Genetic Algorithm API

import { GaIsland } from 'ga-island'

export function create_ga(args: {
  spec: NetworkSpec
  fitness: (network: Network) => number
  /** @example 0.2 */
  mutation_amount: number
  /**
   * @description should be an even number
   * @default 100
   */
  population_size?: number
}): GaIsland<Network>

/**
 * @description converts sample data into a fitness function,
 * in case you want to use a GA instead of back-propagation to train the network.
 */
export function sample_to_fitness(args: {
  inputs: number[][]
  targets: number[][]
}): (network: Network) => number
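
For example, the hand-written XOR fitness function in the GA example above could be generated from the sample data instead (a sketch; spec is the same NetworkSpec as before):

let fitness = sample_to_fitness({
  inputs: [
    [0, 0],
    [1, 0],
    [0, 1],
    [1, 1],
  ],
  targets: [[0], [1], [1], [0]],
})
let ga = create_ga({ spec, fitness, mutation_amount: 0.2 })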

Activation Functions and Helper Functions

export let fn: {
  sigmoid: typeof sigmoid
  centered_sigmoid: typeof centered_sigmoid
  tanh: typeof tanh
  normalized_tanh: typeof normalized_tanh
  linear: typeof linear
  relu: typeof relu
  elu: typeof elu
}
export let fn_derivative: Map<Activation, Activation>

export function sigmoid(x: number): number
export function sigmoid_prime(x: number): number
export function centered_sigmoid(x: number): number
export function centered_sigmoid_prime(x: number): number
export function tanh(x: number): number
export function tanh_prime(x: number): number
export function normalized_tanh(x: number): number
export function normalized_tanh_prime(x: number): number
export function linear(x: number): number
export function linear_prime(x: number): number
export function relu(x: number): number
export function relu_prime(x: number): number
export function elu(x: number): number
export function elu_prime(x: number): number

export function get_derivative(activation: Activation): Activation

/**
 * @description calculate the derivative of activation function at x by sampling with small step.
 */
export function derivative(activation: Activation, x: number): number

export function random_between(min: number, max: number): number
export function random_around_zero(range: number): number
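
The sampling-based derivative means a custom activation works without a hand-written derivative. A sketch, assuming get_derivative resolves built-in activations through fn_derivative:

let softplus = (x: number) => Math.log(1 + Math.exp(x))
console.log(derivative(softplus, 0))    // ≈ 0.5 (softplus' is sigmoid; sigmoid(0) = 0.5)
console.log(get_derivative(sigmoid)(0)) // 0.25, via the built-in sigmoid_prime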

Neural Network Export API

export interface CompiledNetwork {
  (inputs: number[]): number[]
}

export function compile(network: Network): CompiledNetwork

export type NetworkJSON = {
  weights: number[][][]
  biases: number[][]
  activations: (
    | 'sigmoid'
    | 'centered_sigmoid'
    | 'tanh'
    | 'normalized_tanh'
    | 'linear'
    | 'relu'
    | 'elu'
  )[]
}

export function to_json(network: Network): NetworkJSON

export function from_json(json: NetworkJSON): Network
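
A sketch of a typical round trip, where network is any trained Network:

let run = compile(network) // standalone function, usable without the library
console.log(run([0, 1]))   // should match forward(network, [0, 1])

let json = to_json(network) // plain data: weights, biases, activation names
let restored = from_json(JSON.parse(JSON.stringify(json)))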

License

This project is licensed under the BSD-2-Clause license.

This is free, libre, and open-source software. It comes down to four essential freedoms [ref]:

  • The freedom to run the program as you wish, for any purpose
  • The freedom to study how the program works, and change it so it does your computing as you wish
  • The freedom to redistribute copies so you can help others
  • The freedom to distribute copies of your modified versions to others
