Greataptic
Just like life evolves, neural networks do too!
...and neural network libraries too!
Greataptic is a minimalistic, sequential-only, simple and flexible Node.JS library that was created after Neataptic was abandoned. It is probably extensible (though it lacks a proper extension/plugin system), and it ships with a simple and efficient genetic algorithm.
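To give a feel for what "training via a genetic algorithm" means before we dive in: keep a population of candidates, score each with a fitness function, keep the best, and breed mutated copies of them. The sketch below is purely illustrative (it evolves plain numbers, not networks, and none of these names come from Greataptic's API):

```js
// Toy genetic algorithm: evolve numbers toward a target value.
// Illustrative only; Greataptic evolves network weights in the same general way.
function evolve(population, fitness, generations) {
  for (let g = 0; g < generations; g++) {
    // Rank the population, best first.
    population.sort((a, b) => fitness(b) - fitness(a));
    // Keep the top half as parents (elitism: the best always survives).
    const parents = population.slice(0, Math.ceil(population.length / 2));
    // Breed: copy each parent and mutate it slightly.
    population = parents.concat(
      parents.map((p) => p + (Math.random() - 0.5) * 0.1)
    ).slice(0, population.length);
  }
  return population.sort((a, b) => fitness(b) - fitness(a));
}

// Fitness: closeness to 1.
const fit = (x) => -Math.abs(1 - x);
const result = evolve([0, 0.3, 0.6, 0.9], fit, 50);
// result[0] is the best survivor: at least as fit as the best starting candidate.
```

Because the parents are carried over unchanged each generation, the best fitness can never get worse, only better.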
Tutorials
Quickstart
First, you create a simple neural network.
```js
const greataptic = require('greataptic');

// Two inputs (is it raining?, temperature) and two outputs (go, don't go).
// The layer sizes and call arguments here are a reconstruction.
let net = greataptic.sequential(5, [
  {size: 2},
  {size: 4},
  {size: 2, post: 'sigmoid'}
]);
```
In this example, the network will tell us whether we should go to the park, given whether it is raining and the temperature in Celsius.
To make our life easier, we can just define a function that returns whether we should go to the park, given these parameters:
```js
function shouldGoToPark(raining, temperature, raw) {
  // Method and property names below are reconstructions.
  let l = net.run([raining, temperature]).data;
  if (raw) return l;
  return l[0] > l[1];
}
```
Now, all we have to do is to train it.
```js
// Train against a set of input/output examples.
// The method name and options are assumptions.
net = net.train(trainingSet, {generations: 100});
```
Once we have done training it, we can use our function to our heart's desire!
```
> true
> false
> false
> false
> false
> false
> false
> true
> false
```
Spiking Neural Networks
A digital spiking neural network is a concept similar to the ordinary linear neural network, but with the added nuance that it can vary temporally. It sacrifices smooth scalar values for binary 1's and 0's, but in turn it will likely not produce the same result for the same input on every single activation.
Basically, every neuron has a threshold and an accumulator. If the accumulator reaches the threshold, the neuron releases a specified output signal, which adds to (or subtracts from) the accumulators of other neurons.
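The accumulator/threshold mechanic above can be sketched in a few lines. This is not Greataptic's internal implementation, just an illustration of the idea:

```js
// A minimal sketch of a spiking neuron: accumulate incoming signals,
// fire a fixed output signal once the threshold is reached, then reset.
class SpikingNeuron {
  constructor(threshold, outputSignal) {
    this.threshold = threshold;
    this.outputSignal = outputSignal;
    this.accumulator = 0;
  }

  receive(signal) {
    this.accumulator += signal;
    if (this.accumulator >= this.threshold) {
      this.accumulator = 0;       // reset after firing
      return this.outputSignal;   // spike: propagate to downstream neurons
    }
    return 0;                     // below threshold: stay silent
  }
}

const n = new SpikingNeuron(1.0, 0.5);
console.log(n.receive(0.6)); // 0   (accumulator at 0.6, below threshold)
console.log(n.receive(0.6)); // 0.5 (accumulator reached 1.2, neuron fires)
```

Note that the same input (0.6) produces different outputs depending on the accumulator's state, which is exactly the temporal behaviour described above.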
For example, let's say we want to create a random word from noise: the stronger the signal from a specific letter, the earlier it comes in this word, and if this signal is too weak, it will not appear at all.
First, we have to initialize our spiking neural network. To use spiking layers, we must provide type: 'spiking' in the object describing each layer passed to the network construction utility, sequential.
```js
const greataptic = require('greataptic');

// A few adjustable constants to make sure our word is the right size.
const signalThreshold = 0.6;
const noiseSize = 9;

const alphabet = 'abcdefghijklmnopqrstuvwxyz';

let net = greataptic.sequential(5, [
  {size: noiseSize, type: 'spiking'},
  {size: 23, type: 'spiking'},
  {size: alphabet.length, type: 'spiking', post: 'sigmoid'}
]);
```
Now we can make ourselves a utility function to make our lives easier - "do what you want 'cause a 'coder' is free! You're the programmer!"
```js
// Random noise vector of a given size. (Reconstructed; names are assumptions.)
function noise(size) {
  return Array.from({length: size}, () => Math.random());
}

function randomWord() {
  let letters = net.run(noise(noiseSize)).data; // accessor name is a guess
  // Stronger signal = earlier letter; signals below the threshold are dropped.
  return letters
    .map((signal, i) => ({signal, letter: alphabet[i]}))
    .filter((l) => l.signal > signalThreshold)
    .sort((a, b) => b.signal - a.signal)
    .map((l) => l.letter)
    .join('');
}
```
At first, it will be fully random:
```
> 'oxjkmie'
> 'boxjkmz'
> 'poxjkmiywe'
> 'plxjckmie'
> 'oxjkmiz'
> 'poxjkmfiye'
> 'oxjkmie'
> 'oxjkmiz'
> 'boxjkmz'
> 'lxkmie'
> 'plxjckmie'
```
Since it is a completely generative approach, at this point, our best way to train this network is most likely through a simple but efficient technique called...
Generative Adversarial Network (GAN)
Remember our previous code? We can upgrade it using GANs.
In a GAN, there are two neural networks opposing each other: the generator, which takes in a random noise vector and outputs what we want (well, it should), and the discriminator, which tells the generator how 'true' its output looks. The discriminator learns to differentiate between real and fake data using two sets (one of actual, true data, and one of fake data generated by the generator in a previous iteration), while the generator simultaneously learns from the rating the discriminator gives it.
Unlike in other GAN libraries, this rating is computed through a fitness callback function and fed into a genetic algorithm. This way, the 'truer' a generator's output looks, the better that generator ranks, which ends up breeding (and mutating) better, more robust networks, à la Darwin.
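As a rough sketch of that loop (nothing here is Greataptic's actual API; ganGeneration, breed and the toy generators are made-up names for illustration):

```js
// One generation of a fitness-driven GAN: rate every generator by how
// 'true' the discriminator finds its output, then breed the best.
function ganGeneration(generators, discriminator, breed) {
  const rated = generators.map((g) => ({
    net: g,
    fitness: discriminator(g()) // higher = looks more like real data
  }));
  rated.sort((a, b) => b.fitness - a.fitness);
  // Breed (and, in a real GA, mutate) the best performers.
  return breed(rated.map((r) => r.net));
}

// Toy demo: 'generators' emit a number; the 'discriminator' prefers values near 1.
const gens = [() => 0.2, () => 0.9, () => 0.5];
const disc = (x) => 1 - Math.abs(1 - x);
const next = ganGeneration(gens, disc, (sorted) => sorted.slice(0, 2));
// next holds the two best-rated generators (the ones emitting 0.9 and 0.5).
```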
So, how can we use this powerful technology?
First, scrap that greataptic.sequential function. We're going to take a look at a class called greataptic.GAN! (And we can also use the English word array library to get the real data our GAN will require.)
```js
// @sequential GET OFF MY SWAMP
// let net = greataptic.sequential(5, [
//   {size: noiseSize, type: 'spiking'},
//   {size: 23, type: 'spiking'},
//   {size: alphabet.length, type: 'spiking', post: 'sigmoid'}
// ]);
```
Did that? Good. Now we can begin writing the file.
Begin with the useful constants:
```js
const greataptic = require('greataptic');
const shuffle = require('shuffle-array'); // module names here are assumptions
const realWords = require('word-array');  // any English word list will do

// A few adjustable constants to make sure our word is the right size.
const signalThreshold = 0.6;
const noiseSize = 15;
const netType = 'linear';

const alphabet = 'abcdefghijklmnopqrstuvwxyz';
```
Take note of the fact we're back to good old linear neural networks. Spiking neurons and generativity are concepts that don't seem to ring well together.
Now, declare some functions so you can encode and decode words into arrays:
```js
// Encode a word into a vector of letter signals: the earlier a letter
// appears in the word, the stronger its signal. (These bodies are
// reconstructions; the details are assumptions.)
function encodeWord(word) {
  return alphabet.split('').map((letter) => {
    let bad = word.indexOf(letter) === -1; // letter absent from the word
    let d = greataptic.layerTypes.sigmoid(word.length - word.indexOf(letter));
    return bad ? 0 : d;
  });
}

// Decode a signal vector back into a word, dropping weak signals.
function decodeWord(letters) {
  return letters
    .map((signal, i) => ({signal, letter: alphabet[i]}))
    .filter((l) => l.signal > signalThreshold)
    .sort((a, b) => b.signal - a.signal)
    .map((l) => l.letter)
    .join('');
}
```
Good! NOW comes the interesting part.
Initialize the GAN thingie, providing the right arguments. We'll look into those once I finish the API documentation.
```js
console.log('Encoding real words...');
const realEncodedWords = shuffle(realWords).map(encodeWord);

let gan = new greataptic.GAN({
  size: 23,                // hidden size; the original value was lost
  output: alphabet.length,
  noise: noiseSize,
  outputType: netType
});
```
I guess this is a bit more code than our previous examples, but hey, I'm sure it's totally worth it!
Now you can begin training! Hooray!
```js
console.log('Training...');

// Any arguments beyond the real data are assumptions.
gan.trainAsync(realEncodedWords);
```
Also take note of the fact that we are using the .trainAsync function, which is supposedly faster.
This might take a long time. We have profiled the application, but if it takes too long for you to train the GAN, there is probably not a lot that can be done at that point. I'm sorry.
```
...
[Generation #1] Best fitness: 0 | Worst fitness: 0
(hvwycfixnzl, nhvrwikmc, ivowycxjgn, tzfeigmknv, fgheovkiwxc, hwznflkgto, vhegyksq, yfhwesgvkci, ...)
[Generation #2] Best fitness: 0.6310722878538669 | Worst fitness: 0.06214089501871067
(hoefclxv, vhyfwglnm, ivofwyqgxlth, ztfegvim, xkfzvohnjei, hozqnfwal, vgfhkei, hyesfvwicgqlpk, ...)
[Generation #3] Best fitness: 0.7720685810330974 | Worst fitness: 0.02534786537134085
(fvoehzcxukn, efmvdzcn, oigvyftxwnqjk, hxfeyloqz, mqhwfoisenz, hzftogpume, hcesfdt, exzbfmgwov, ...)
[Generation #4] Best fitness: 0.9099892193256698 | Worst fitness: 0.011222608262300893
(hcdest, tzmfgavdqne, fizvetgkmb, joemitwafzguh, uefxvckniat, efszwkoh, feucvzhdtonwxslq, fokezcjqtm, ...)
```
At first it won't seem to improve much, but hey, it's a GAN.
Once you're done, go Wild Words, cowboy! .......yes? no? But, come on, that was a good pun!
You know what else is good? This beautiful output:
(duatzrjpvmyfe, uzdmaprtjlyc, jtudzmoyralpwb, uajtoylrdpzbwq, jdrpxyltfnusvz, jdaoupyftvlx, urtdzyapmjf, udjtaylrpmoz, ...)