Tensor-Evolution is a library for evolving neural network topology using a genetic algorithm. It currently uses DEAP as its evolutionary backend and TensorFlow for the neural networks; Ray is used for parallel execution.
Note that this library doesn't build networks a single neuron at a time; the basic building blocks are entire layers.
Population members start as the input layer connected directly to the output layer. Mutation operators exist for inserting layers (from a list of supported types), deleting layers, and mutating an existing layer's properties. A crossover operator is also implemented.
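To make the layer-level encoding concrete, here is a rough, illustrative sketch of what mutating a genome of whole layers could look like. This is not the library's actual internal representation or API; the genome structure and the mutate function below are hypothetical stand-ins for the idea described above.

import random

# Hypothetical genome: an ordered list of layer descriptions sitting between
# the fixed input and output layers.
genome = [
    {"type": "Conv2D", "filters": 16, "kernel_size": 3},
    {"type": "Flatten"},
    {"type": "Dense", "units": 64},
]

SUPPORTED_TYPES = ["Dense", "Conv2D", "Flatten", "BatchNorm"]

def mutate(genome, insert_rate=0.1, delete_rate=0.1, param_rate=0.2):
    # Insert, delete, or tweak entire layers rather than individual neurons.
    if random.random() < insert_rate:
        genome.insert(random.randrange(len(genome) + 1),
                      {"type": random.choice(SUPPORTED_TYPES)})
    if genome and random.random() < delete_rate:
        genome.pop(random.randrange(len(genome)))
    for layer in genome:
        if layer["type"] == "Dense" and random.random() < param_rate:
            layer["units"] = random.choice([32, 64, 128, 256])
    return genome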
Fitness is evaluated by building, compiling, and training a model from each population member's genome. Training is done the standard way (i.e. via backpropagation, not through any evolutionary means).
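As an illustration of that evaluation loop (not the library's actual code), a fitness function along these lines builds a Keras model from the genome, trains it briefly with backpropagation, and returns a validation metric. build_model_from_genome is a hypothetical helper standing in for the genome-to-model translation.

def evaluate_fitness(genome, data, epochs=2):
    # Hypothetical helper: turn the genome's layer list into a Keras model.
    model = build_model_from_genome(genome)
    x_train, y_train, x_test, y_test = data
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # Standard gradient-based training; evolution only shapes the topology.
    model.fit(x_train, y_train, epochs=epochs, verbose=0)
    _, accuracy = model.evaluate(x_test, y_test, verbose=0)
    return accuracy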
Note that most layer types can be added almost anywhere in the genome. If the input shape isn't right, it's corrected (attempts are made to correct it intelligently, but if required it's forced to fit).
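The exact correction logic is internal to the library; the following is only a rough sketch, under the assumption that a mismatch is handled by flattening or reshaping the incoming tensor so the next layer can consume it. coerce_input is a made-up name for illustration.

from tensorflow.keras import layers

def coerce_input(tensor, wants_rank):
    # Illustration only: adapt a tensor whose rank doesn't match what the next layer expects.
    rank = len(tensor.shape)
    if rank == wants_rank:
        return tensor
    if rank > wants_rank:
        # Too many dimensions (e.g. Dense following Conv2D): collapse to (batch, features).
        return layers.Flatten()(tensor)
    # Too few dimensions: append singleton axes, e.g. (batch, 28, 28) -> (batch, 28, 28, 1).
    new_shape = tuple(tensor.shape[1:]) + (1,) * (wants_rank - rank)
    return layers.Reshape(new_shape)(tensor)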
The list of supported layer types is currently expanding. So far (see the Keras mapping sketch after this list):
- Dense
- ReLU
- Conv2D, 3D
- MaxPool2D, 3D
- Addition
- BatchNorm
- Flatten
- LSTM
- GlobalAvgPooling1D
- Embedding
- Concat
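These correspond to standard tf.keras.layers classes. The mapping below is only a reference sketch of those Keras equivalents; the names the library itself uses for its layer genes may differ.

from tensorflow.keras import layers

# Reference only: the Keras layer each supported building block corresponds to.
KERAS_EQUIVALENTS = {
    "Dense": layers.Dense,
    "ReLU": layers.ReLU,
    "Conv2D": layers.Conv2D,
    "Conv3D": layers.Conv3D,
    "MaxPool2D": layers.MaxPooling2D,
    "MaxPool3D": layers.MaxPooling3D,
    "Addition": layers.Add,
    "BatchNorm": layers.BatchNormalization,
    "Flatten": layers.Flatten,
    "LSTM": layers.LSTM,
    "GlobalAvgPooling1D": layers.GlobalAveragePooling1D,
    "Embedding": layers.Embedding,
    "Concat": layers.Concatenate,
}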
pip install tensor-evolution
Start by importing the tensor_evolution module. This is the main driver for the evolution.
from tensorEvolution import tensor_evolution
Next, prepare your data as a tuple of four objects, like so:
data = x_train, y_train, x_test, y_test
Then create an evolution worker, and use that worker to drive the evolution:
worker = tensor_evolution.EvolutionWorker()
worker.evolve(data=data)
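Putting those steps together, a minimal end-to-end script might look like the following. The dataset loading is plain Keras; the EvolutionWorker calls are the ones shown above, and this assumes the YAML config's input and output shapes match the chosen dataset (MNIST here).

import tensorflow as tf
from tensorEvolution import tensor_evolution

# Load and scale a standard dataset; any (x_train, y_train, x_test, y_test)
# split that matches the configured shapes would work the same way.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

data = x_train, y_train, x_test, y_test

worker = tensor_evolution.EvolutionWorker()
worker.evolve(data=data)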
Please refer to the end-to-end examples for full details.
Everything is configured via a YAML file.
For example, to change the population size to 30:
####
# Evolution Controls
####
...
pop_size: 30 #population size
Mutation rates, valid neural network layer types, input and output shapes, etc. are all controlled from the config file.
This is very much still a work in progress (as is this README), but it is functional. The MNIST, Auto MPG, and text classification examples run just fine.
The best individual after running MNIST with a population of 20 individuals for 10 generations:
_________________________________________________________________
 Layer (type)                 Output Shape              Param #
=================================================================
 input_4 (InputLayer)         [(None, 28, 28)]          0
 reshape (Reshape)            (None, 28, 28, 1)         0
 conv2d (Conv2D)              (None, 28, 28, 16)        272
 conv2d_1 (Conv2D)            (None, 28, 28, 8)         1160
 flatten (Flatten)            (None, 6272)              0
 dense (Dense)                (None, 10)                62730
=================================================================
Total params: 64,162
Trainable params: 64,162
Non-trainable params: 0
The best individual after running Auto MPG with a population of 100 individuals for 20 generations:
__________________________________________________________________________________________________
 Layer (type)            Output Shape     Param #     Connected to
==================================================================================================
 input_1 (InputLayer)    [(None, 9)]      0           []
 dropout (Dropout)       (None, 9)        0           ['input_1[0][0]']
 add (Add)               (None, 9)        0           ['input_1[0][0]',
                                                       'dropout[0][0]']
 dense (Dense)           (None, 256)      2560        ['add[0][0]']
 flatten (Flatten)       (None, 256)      0           ['dense[0][0]']
 dense_1 (Dense)         (None, 1)        257         ['flatten[0][0]']
==================================================================================================
Total params: 2,817
Trainable params: 2,817
Non-trainable params: 0
__________________________________________________________________________________________________
Evaluation Results
3/3 [==============================] - 0s 0s/step - loss: 1.5367 - mean_absolute_error: 1.5367