This is my implementation of an artificial neural network (ANN) from scratch in C++. It classifies the MNIST handwritten-digit dataset with 93% accuracy.
The user-facing parts of the library center on the Model class. At instantiation, a Model object takes the activation functions and optimizers to use, along with the learning rate. After instantiation, the network's topology is built by adding layers to the model one at a time; the current topology can be inspected with the model.infoLayers() method. The model is then trained with model.teach(), which takes a vector of labels and a vector of vectors holding the images. Once training is complete, the trained model can be evaluated with the model.predict() method, which takes the same parameters as model.teach().
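A minimal usage sketch of this flow follows. Only infoLayers(), teach(), and predict() are named above; the header name, constructor arguments, the layer-adding method (addLayer here), and the element types of the label and image vectors are assumptions made for illustration.

```cpp
// Hypothetical usage sketch, not a verbatim example from the library.
#include <vector>
#include "Model.h"  // assumed header name

int main() {
    // Assumed constructor: activation name, optimizer name, learning rate.
    Model model("sigmoid", "sgd", 0.01);

    // Build the topology layer by layer (method name assumed).
    model.addLayer(784);  // input layer sized for 28x28 MNIST images
    model.addLayer(128);  // hidden layer
    model.addLayer(10);   // output layer, one neuron per digit class
    model.infoLayers();   // print the topology

    // labels: one entry per image; images: one flattened pixel vector per image.
    // Element types are assumptions.
    std::vector<int> labels = /* load MNIST labels */ {};
    std::vector<std::vector<double>> images = /* load MNIST images */ {};

    model.teach(labels, images);    // train
    model.predict(labels, images);  // evaluate, ideally on held-out data
    return 0;
}
```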
The available loss functions can be seen in LossFuncsFactory.cpp; at present the supported loss functions are cross_entropy and softmax. The available activation functions can be found in ActivationFuncsFactory.cpp; sigmoid, RELU, and tanh are supported.
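For context, a factory of this kind typically maps a name onto a callable. The sketch below is illustrative only and is not taken from ActivationFuncsFactory.cpp; the function name and return type are assumptions.

```cpp
// Illustrative name-to-function factory sketch, not the library's actual code.
#include <cmath>
#include <functional>
#include <stdexcept>
#include <string>

std::function<double(double)> getActivation(const std::string& name) {
    if (name == "sigmoid") return [](double x) { return 1.0 / (1.0 + std::exp(-x)); };
    if (name == "RELU")    return [](double x) { return x > 0.0 ? x : 0.0; };
    if (name == "tanh")    return [](double x) { return std::tanh(x); };
    throw std::invalid_argument("unknown activation: " + name);
}
```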