User:Alexander Roidl/neuralnetsanddeeplearning

From XPUB & Lens-Based wiki

Machine Learning

Machine Learning is a term used to describe algorithms that »learn«. But »learning« is quite a broad term: for some applications it really comes down to a statistical calculation.

Terms:

  • Generalisation
  • Overfitting
  • Underfitting
  • Features

K–Nearest Neighbour

K = the number of neighbours considered (a parameter you choose)

Plots the features of each training example as a vector in space, so similar examples form clusters.

The new datapoint is placed in the same space and the K nearest points vote on its category.
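A minimal sketch of this idea in plain Python (the dataset and function name are made up for illustration):

```python
import math
from collections import Counter

def knn_classify(train, point, k=3):
    """Classify `point` by majority vote of its k nearest training examples."""
    # train: list of (feature_vector, label) pairs; distance = Euclidean
    neighbours = sorted(train, key=lambda item: math.dist(item[0], point))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# two clusters: 'a' near the origin, 'b' near (5, 5)
train = [((0, 0), "a"), ((0, 1), "a"), ((5, 5), "b"), ((5, 6), "b")]
print(knn_classify(train, (1, 1)))  # prints a: 2 of the 3 nearest points are 'a'
```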

Linear models

Draw a linear boundary between 2 or more classes
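One classic way to find such a boundary is the perceptron learning rule; here is a toy sketch (data, learning rate, and epoch count are invented for illustration):

```python
def predict(w, b, x):
    """Which side of the line w·x + b = 0 does point x fall on?"""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

def train_perceptron(data, epochs=20, lr=0.1):
    """Nudge the boundary towards misclassified points until the classes separate."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            err = y - predict(w, b, x)  # 0 when correct, ±1 when wrong
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

# class 0 near the origin, class 1 near (5, 5)
data = [((0, 0), 0), ((1, 0), 0), ((0, 1), 0),
        ((4, 5), 1), ((5, 4), 1), ((5, 5), 1)]
w, b = train_perceptron(data)
```

On linearly separable data like this the rule is guaranteed to converge; if the classes overlap, the boundary keeps jittering.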

Decision Tree

Chain of if / else statements
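Written out by hand, a (hypothetical, hand-made) tree is literally nested if/else decisions on features:

```python
def classify_fruit(weight_g, colour):
    # each branch is one if/else decision on a single feature
    if weight_g > 150:
        if colour == "green":
            return "apple"
        return "orange"
    if colour == "yellow":
        return "lemon"
    return "plum"

print(classify_fruit(200, "green"))  # prints apple
```

Training a decision tree means finding the questions (feature and threshold) that split the data best, instead of writing them by hand.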


Neural Networks

Often used synonymously with deep learning; »deep« refers to networks with many layers.

Uses several layers in a networked structure to find the weights that best produce the wanted result.

Feedforward -> Backpropagation …

It consists of many so-called neurons.
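The feedforward → backpropagation loop can be sketched with a single linear neuron learning y = 2x (the data, learning rate, and epoch count are made up for illustration):

```python
def train(data, epochs=1000, lr=0.05):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            pred = w * x + b    # feedforward: compute the neuron's output
            grad = pred - y     # backpropagation: gradient of 0.5 * (pred - y)^2
            w -= lr * grad * x  # adjust each parameter against its gradient
            b -= lr * grad
    return w, b

w, b = train([(1, 2), (2, 4), (3, 6)])  # w ends up near 2, b near 0
```

A real network repeats exactly this, but with many neurons per layer and the gradient passed backwards through every layer.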

There are different kinds of neurons:

  • Perceptron: outputs 0 or 1
  • Sigmoid: outputs a value between 0 and 1: f(x) = 1/(1+e^(-x))
  • ReLU: passes positive values through and cuts everything below 0: f(x) = max(0, x)
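The three activation functions above, written out directly:

```python
import math

def step(x):
    """Perceptron output: a hard 0 or 1."""
    return 1 if x > 0 else 0

def sigmoid(x):
    """Smooth value between 0 and 1: f(x) = 1 / (1 + e^(-x))."""
    return 1 / (1 + math.exp(-x))

def relu(x):
    """Rectified Linear Unit: passes positives through, clips negatives to 0."""
    return max(0.0, x)

print(step(0.5), sigmoid(0), relu(-2.0))  # prints 1 0.5 0.0
```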


So if you combine or stack those networks, you end up with one of the following popular architectures:

LSTM: Long Short-Term Memory

Stores previous results in the network itself

Good for text generation (its memory helps it avoid repeating itself)

A special kind of Recurrent Neural Network (RNN)
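The basic recurrence an RNN is built on can be sketched in a few lines; an LSTM adds gates on top of this to control what the memory keeps and forgets (the weights here are arbitrary, chosen only for illustration):

```python
import math

def rnn_step(x, h, w_x=0.5, w_h=0.8, b=0.0):
    """One recurrent step: the new hidden state mixes input and previous state."""
    return math.tanh(w_x * x + w_h * h + b)

def run_sequence(xs):
    h = 0.0                 # hidden state = the network's memory
    for x in xs:
        h = rnn_step(x, h)  # fed back into the next step
    return h

# the same inputs in a different order leave a different memory behind
print(run_sequence([1, 0, 0]) == run_sequence([0, 0, 1]))  # prints False
```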

CNN: Convolutional Neural Networks

Especially suited for images
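The core operation is convolution: a small kernel of weights slides over the image, and the network learns kernels that detect useful patterns. A bare-bones sketch (image and kernel values are invented):

```python
def convolve2d(image, kernel):
    """Slide the kernel over the image; each output pixel is a weighted sum."""
    kh, kw = len(kernel), len(kernel[0])
    return [
        [
            sum(image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw))
            for j in range(len(image[0]) - kw + 1)
        ]
        for i in range(len(image) - kh + 1)
    ]

# a [-1, 1] kernel responds where neighbouring pixels differ: an edge detector
print(convolve2d([[0, 0, 1, 1]], [[-1, 1]]))  # prints [[0, 1, 0]]
```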

GAN: Generative Adversarial Network

Uses two neural networks that oppose each other: one generates data from random input, the other (the discriminator) judges whether a sample is fake or real. The generator is trained to fool the discriminator.
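A drastically simplified, hand-made sketch of that adversarial loop, with each »network« reduced to a single number (the data distribution, update rules, and learning rate are all invented; real GANs use full neural networks trained with gradient-based losses):

```python
import random

def train_gan(steps=2000, lr=0.05):
    """Toy setup: real data live around 5.0; the generator must learn to output that."""
    g = 0.0  # "generator" parameter: it emits g (plus noise in a real GAN)
    d = 0.0  # "discriminator": its current belief about where real data live
    for _ in range(steps):
        real = 5.0 + random.gauss(0, 0.1)
        # discriminator step: refine its belief using a real sample
        d += lr * (real - d)
        # generator step: move its output towards what the discriminator
        # currently accepts as real, i.e. learn to fool it
        g += lr * (d - g)
    return g, d
```

After training, the generator's output g has drifted towards the real data around 5.0, mirroring how a GAN's generated samples become indistinguishable from real ones.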