# Scikit Learn Perceptron

I HAVE MADE A NICER VERSION OF THIS HERE.

I’ve been playing about with the Perceptron in scikit-learn but was having trouble getting it to accurately solve a linearly separable problem. The problem is clearly solvable and works in Matlab, but I could not get it to work in Python. Anyway, whilst writing this post, originally titled ‘please help me’, I had an idea, tested it, and it worked.

So now we have a linear separability example using a single perceptron.

### Plot the Data (see what it looks like)

See, the data is clearly separable. Matlab gets it without any problems.
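The original code listing isn’t shown here, but a minimal sketch along these lines sets up and plots a dataset like the one described (the cluster centres, the seed, and the names `d` and `t` are my assumptions, chosen to match the variables mentioned later):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend, so this also runs without a display
import matplotlib.pyplot as plt

# Hypothetical stand-in for the original dataset: two linearly
# separable clusters, labelled 0 (Group A) and 1 (Group B).
rng = np.random.default_rng(0)
group_a = rng.normal(loc=[-2, 0], scale=0.5, size=(20, 2))
group_b = rng.normal(loc=[2, 0], scale=0.5, size=(20, 2))
d = np.vstack([group_a, group_b])   # inputs
t = np.array([0] * 20 + [1] * 20)   # labels

# Red for Group A, black for Group B.
plt.scatter(d[:, 0], d[:, 1], c=np.where(t == 0, "red", "black"))
plt.title("Two linearly separable groups")
plt.savefig("data.png")
```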

### Work some magic

Here was the original problem: the data rotation was wrong. Originally I only had one rotation (90 degrees), meaning the data (d or d90) wasn’t matching the labels (t).
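With the data and labels lined up, the fit itself is straightforward. A sketch of the training step, assuming the `d` and `t` arrays from the previous snippet (`max_iter` and `tol` settings are my choices, to let the perceptron run to convergence):

```python
import numpy as np
from sklearn.linear_model import Perceptron

# Illustrative reconstruction of the (now correctly matched) data.
rng = np.random.default_rng(0)
d = np.vstack([rng.normal([-2, 0], 0.5, (20, 2)),
               rng.normal([2, 0], 0.5, (20, 2))])
t = np.array([0] * 20 + [1] * 20)

# tol=None runs the full number of epochs; on linearly separable
# data the perceptron is guaranteed to converge.
net = Perceptron(max_iter=1000, tol=None)
net.fit(d, t)
print(net.score(d, t))  # 1.0 on separable data
```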

As we can see, the model has solved the problem.

### Test with more data

First we create some random data in NumPy.
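The exact generation code isn’t shown, but something like this produces fresh random points over the same region as the training data (the name `new_data`, the range, and the sample count are my assumptions):

```python
import numpy as np

# A sketch of "more data": uniform random points covering the
# region the model was trained on.
rng = np.random.default_rng(1)
new_data = rng.uniform(low=-3, high=3, size=(100, 2))
print(new_data.shape)  # (100, 2)
```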

### Prediction Time

So now we have the model (called net), we can run the new dummy data through it to make a prediction. (At some point I will write a post on the maths behind this.)
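The prediction step is a single `net.predict` call. A self-contained sketch, rebuilding stand-ins for the `net` and `new_data` created above (names and seeds are illustrative):

```python
import numpy as np
from sklearn.linear_model import Perceptron

# Rebuild a trained model on two separable clusters.
rng = np.random.default_rng(0)
d = np.vstack([rng.normal([-2, 0], 0.5, (20, 2)),
               rng.normal([2, 0], 0.5, (20, 2))])
t = np.array([0] * 20 + [1] * 20)
net = Perceptron(max_iter=1000, tol=None).fit(d, t)

# Run the new dummy data through the model.
new_data = rng.uniform(-3, 3, size=(100, 2))
preds = net.predict(new_data)
print(preds)  # an array of 0s and 1s, one prediction per point
```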

You should be looking at lots of zeros and ones; that is the prediction. The values are either 0 or 1, or another way of thinking about it, Group A (0) or Group B (1).

Let’s plot the new data and the predictions on a scatter graph.
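A sketch of that plot, colouring each point by its predicted group (model, data, and filenames are the same illustrative stand-ins as above):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend, so this also runs without a display
import matplotlib.pyplot as plt
from sklearn.linear_model import Perceptron

# Rebuild the trained model and the new dummy data.
rng = np.random.default_rng(0)
d = np.vstack([rng.normal([-2, 0], 0.5, (20, 2)),
               rng.normal([2, 0], 0.5, (20, 2))])
t = np.array([0] * 20 + [1] * 20)
net = Perceptron(max_iter=1000, tol=None).fit(d, t)

new_data = rng.uniform(-3, 3, size=(100, 2))
preds = net.predict(new_data)

# Colour each new point by its predicted group: red for 0, black for 1.
colours = np.where(preds == 0, "red", "black")
plt.scatter(new_data[:, 0], new_data[:, 1], c=colours)
plt.title("Predictions on the new data")
plt.savefig("predictions.png")
```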

Let’s have a look…

Fantastic. So what are we looking at? Simple: the colour represents the group, so anything to the right of the line should be black and anything to the left should be red. And it is…!

### Summary

This is a starting point for machine learning: a single perceptron, modelled on a neuron, trained on a dataset using a supervised method.

So what next, I hear you cry? Well, next would be the following two topics, both leading to the same place: non-linear problems and multi-layer perceptrons.

A nice example of a non-linear problem is XOR, as plotted below: you cannot draw a single straight line to separate the two colours.
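You can see the single perceptron hit this wall directly. A small sketch of my own (not from the original post): fitting the same model to the four XOR points, where no linear boundary can classify all four correctly:

```python
import numpy as np
from sklearn.linear_model import Perceptron

# The XOR truth table: output is 1 only when the inputs differ.
d = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
t = np.array([0, 1, 1, 0])

net = Perceptron(max_iter=1000, tol=None).fit(d, t)
# A single perceptron draws one straight line, so it can never
# reach 100% accuracy on XOR.
print(net.score(d, t))
```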