thiagobmi/rusty_net

Overview

rusty_net is a user-friendly Rust library for building feedforward neural networks. It generates fully connected, multi-layer neural networks that are trained using backpropagation.

The library allows for easy configuration of key parameters, including momentum, learning rate, and halt conditions, simplifying the training process.

use rusty_net::{NN, HaltCondition};

// Creating a network with:
// 2 nodes in the input layer,
// two hidden layers with 3 and 5 nodes,
// and an output layer with 2 nodes.
let mut nn = NN::new(&vec![2, 3, 5, 2]);

nn.train(&examples)
        .rate(0.1) // Configuring the learning rate
        .momentum(0.9) // Configuring momentum
        .log_interval(Some(100)) // Log the error every 100 epochs
        .halt_condition(HaltCondition::Epochs(5000)) // Stop after 5000 epochs
        .go(); // Starts the training


Additional Parameters

This library also offers advanced options for configuring your neural network, such as the activation function and the loss function, letting you tailor the model to specific requirements.

// Set the activation function
let mut nn = NN::new(&vec![2, 2, 1]);
nn.activation(rusty_net::ActivationFunction::LeakyReLU);

// Set the loss function
nn.train(&examples)
        .loss_function(rusty_net::LossFunction::CrossEntropy)
        .go();

Activation functions:

  • Sigmoid
  • ReLU
  • Leaky ReLU
  • TanH
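
The formulas behind these variants are the standard ones. The sketch below is illustrative only, not rusty_net's internal code; in particular, the Leaky ReLU slope of 0.01 is an assumption, not a documented value.

// Standalone sketch of the standard activation formulas.
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

fn relu(x: f64) -> f64 {
    x.max(0.0)
}

// Assumed slope of 0.01 for the negative side.
fn leaky_relu(x: f64) -> f64 {
    if x > 0.0 { x } else { 0.01 * x }
}

fn tanh(x: f64) -> f64 {
    x.tanh()
}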

Loss functions:

  • Mean Squared Error (MSE)
  • Cross Entropy
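
Likewise, the two losses compute the standard quantities sketched below. Again, this is illustrative rather than the library's implementation; the cross entropy shown is the binary, per-output form, which assumes outputs in (0, 1) such as those from a sigmoid layer.

// Illustrative loss formulas, not rusty_net's internals.
// Mean squared error over one example's outputs.
fn mse(outputs: &[f64], targets: &[f64]) -> f64 {
    outputs
        .iter()
        .zip(targets)
        .map(|(o, t)| (o - t).powi(2))
        .sum::<f64>()
        / outputs.len() as f64
}

// Binary cross entropy over one example's outputs.
fn cross_entropy(outputs: &[f64], targets: &[f64]) -> f64 {
    -outputs
        .iter()
        .zip(targets)
        .map(|(o, t)| t * o.ln() + (1.0 - t) * (1.0 - o).ln())
        .sum::<f64>()
}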

Saving

You can save the weights and biases of your network to a JSON file by simply using:

nn.save_as_json("nn.json");

The data can then be loaded with:

let nn = NN::load_from_json("nn.json");
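
As a sanity check, a save/load round trip should reproduce the network's outputs exactly. A minimal sketch, assuming run accepts the same input vectors as in the AND example below:

let input = vec![1f64, 0f64];
let before = nn.run(&input);

nn.save_as_json("nn.json");
let loaded = NN::load_from_json("nn.json");
let after = loaded.run(&input);

assert_eq!(before, after);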

AND example

This example initializes a neural network with an input layer of 2 nodes, a single hidden layer of 3 nodes, and an output layer of 1 node, then trains it on examples of the AND function. The methods chained after train(&examples) configure optional training settings; training begins when go() is called, and the network then learns from the provided examples.

use rusty_net::{NN, HaltCondition};

// create examples of the AND function
// the network is trained on tuples of vectors where the first vector
// is the inputs and the second vector is the expected outputs
let examples = [
    (vec![0f64, 0f64], vec![0f64]), // 0 AND 0 = 0
    (vec![0f64, 1f64], vec![0f64]), // 0 AND 1 = 0
    (vec![1f64, 0f64], vec![0f64]), // 1 AND 0 = 0
    (vec![1f64, 1f64], vec![1f64]), // 1 AND 1 = 1
];

// create a new neural network by passing a reference to a vector
// that specifies the number of layers and the number of nodes in each layer
let mut net = NN::new(&vec![2, 3, 1]);

// train the network on the examples of the AND function
net.train(&examples)
    .halt_condition(HaltCondition::Epochs(1000))
    .log_interval(Some(100))
    .momentum(0.1)
    .rate(0.3)
    .go();

// evaluate the network to see if it learned the AND function
// evaluate the network to see if it learned the AND function
for (inputs, outputs) in examples.iter() {
    let results = net.run(inputs);
    let (result, key) = (results[0].round(), outputs[0]);
    assert!(result == key);
}
