1. 22
  1. 5

    Normally these kinds of things are basically “from NeuralNetworkLibrary import nn”, but in this case it’s actually the whole dang thing!

    1. 2

      And it’s actually from a few years ago. Marc Cournoyer, the author, did a similar presentation with Ruby more recently. I didn’t see it, but I know he’s really good at condensing things to their essential form.

    2. 1

      I’ve been meaning to do just that!

      I tried to do so without using even numpy, just stdlib packages.

      I’ve been busy lately, so when I got stuck trying to put backprop into code (I understand the math, I just haven’t figured out how to express it elegantly yet), I set it aside. I’ve put a rough sketch of the gradient step I have in mind after the code below.

      My code, as of today:

      (Notation: x is a scalar value, X is a 1-d vector, XX is a 2-d array, i.e. a list of vectors)

      from math import exp
      from random import uniform
      
      
      # a single neuron: weighted sum of its inputs plus a bias
      class Neuron(object):
      
        def __init__(self, n):
          self.weights = [uniform(-1, 1) for i in range(n)]
          self.bias = 0.
      
        def run(self, X):
          assert len(X) == len(self.weights)
          return sum([X[i]*self.weights[i] for i in range(len(X))]) + self.bias
      
      
      # a layer of h neurons with w inputs each, followed by an elementwise sigmoid
      class Layer(object):
      
        def __init__(self, w, h):
          self.neurons = [Neuron(w) for i in range(h)]
          self.activation = lambda X: [1/(1+exp(-x)) for x in X]
      
        def run(self, XX):
          assert len(XX) == len(self.neurons)
          return self.activation([self.neurons[i].run(XX[i]) for i in range(len(XX))])
      
      
      # a stack of layers with a squared-error cost
      class Network(object):
      
        def __init__(self):
          self.layers = []
          self.cost = lambda X, Y : [(x-y)**2 for x, y in zip(X,Y)]
      
        def add(self, h):
          # each neuron takes the previous layer's width as its input size (1 for the first layer)
          self.layers.append(Layer(len(self.layers[-1].neurons) if self.layers else 1, h))
      
        def fit(self, XX, Z):
          assert len(XX) == len(Z)
      
          # forward pass, keeping each intermediate output for the backward pass
          X = []
          for i in range(len(self.layers)-1):
            X.append(self.layers[i].run(XX))
            XX = [X[-1]] * len(self.layers[i+1].neurons)
          Y = self.layers[-1].run(XX)
      
          # backward pass: this is the part I haven't written yet
          # (Layer.fit doesn't exist, so this only sketches the intent)
          for i in range(len(self.layers)):
            self.layers[i].fit(self.cost(Z, Y))
      
        def run(self, XX):
          # forward pass: each layer's output is duplicated once per neuron of the next layer
          for i in range(len(self.layers)-1):
            XX = [self.layers[i].run(XX)] * len(self.layers[i+1].neurons)
          return self.layers[-1].run(XX)
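
      For reference, this is roughly how I expect it to be wired up (layer sizes are arbitrary, just to show how the shapes line up):

      net = Network()
      net.add(2)  # "input" layer: 2 neurons with 1 weight each
      net.add(3)  # hidden layer: 3 neurons with 2 weights each
      net.add(1)  # output layer: 1 neuron with 3 weights
      print(net.run([[0.3], [0.7]]))  # one single-element vector per first-layer neuron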
      
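      And since backprop is the part I’m stuck on, here is a stdlib-only sketch of the gradient step I’m trying to eventually fold into fit(), reduced to a single sigmoid neuron with the same (x-y)**2 cost. The names here (sigmoid, train_neuron, lr) are placeholders, not part of the classes above:

      from math import exp
      from random import uniform

      def sigmoid(v):
        return 1 / (1 + exp(-v))

      def train_neuron(data, n_inputs, lr=0.5, epochs=2000):
        weights = [uniform(-1, 1) for _ in range(n_inputs)]
        bias = 0.
        for _ in range(epochs):
          for X, z in data:
            y = sigmoid(sum(x*w for x, w in zip(X, weights)) + bias)
            # chain rule: d(cost)/d(pre-activation) for cost (y-z)**2 and a sigmoid
            delta = 2*(y - z) * y*(1 - y)
            weights = [w - lr*delta*x for w, x in zip(weights, X)]
            bias -= lr*delta
        return weights, bias

      # e.g. learning logical AND, which a single neuron can separate
      weights, bias = train_neuron([([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)], 2)

      The part I still haven’t figured out is how to propagate that delta back through multiple layers cleanly, which is what Layer.fit / Network.fit above are supposed to end up doing.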

      Thanks for sharing this!