How Perceptrons solve linearly separable problems

Priyansh Kedia
May 16, 2021 · 2 min read


A perceptron is a single-neuron model that can serve as the building block of a larger network.

A perceptron computes a weighted sum of its inputs and outputs a binary value, 0 or 1 (the output values can also be -1 and 1, depending on the activation function used).
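As a rough sketch of what this computation looks like, here is a minimal perceptron forward pass in Python with a step activation. The function name `perceptron_output` and the AND-gate weights are illustrative choices for this post, not something prescribed by the perceptron model itself.

```python
import numpy as np

def perceptron_output(x, w, b):
    """Step-activation perceptron: returns 1 if w.x + b >= 0, else 0."""
    z = np.dot(w, x) + b        # weighted sum of inputs plus bias
    return 1 if z >= 0 else 0   # step (Heaviside) activation

# Illustrative example: weights chosen by hand so the perceptron
# behaves like a logical AND gate on binary inputs.
w = np.array([1.0, 1.0])
b = -1.5
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", perceptron_output(np.array(x), w, b))
```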

What is a linearly separable problem?

Linear separability is a property of two sets of points. This is most easily visualized in two dimensions (the Euclidean plane) by thinking of one set of points as being colored blue and the other set of points as being colored red. These two sets are linearly separable if there exists at least one line in the plane with all of the blue points on one side of the line and all the red points on the other side.

Image Courtesy: https://en.wikipedia.org

How does a single perceptron solve a linearly separable problem?

As we know, a single perceptron consists of inputs, weights, a bias, and an activation function that produces the output.

Let us assume the output of the perceptron to be y, and let x be the single input to the perceptron.

Then y = w1*x + b, where w1 is the weight for the input, and b is the bias.

Let us further simplify the problem by assuming w1 to be 1 and b to be 0.

This reduces our equation to y = x, the equation of a straight line passing through the origin.
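To make this concrete, here is a small sketch (with made-up points) of how the line y = x can act as a decision boundary: a point (x, y) in the plane falls in one class if it lies above the line and in the other if it lies below, which is what a perceptron with weights (-1, 1) and bias 0 would compute.

```python
# Made-up example points in the plane, some above and some below y = x.
points = [(1.0, 3.0), (2.0, 0.5), (0.2, 0.9), (4.0, 1.0)]

for x, y in points:
    # The line y = x splits the plane into two half-planes.
    # Equivalently, a perceptron computing step(-1*x + 1*y + 0) outputs
    # 1 for points above the line and 0 for points below it.
    label = 1 if y - x >= 0 else 0
    side = "above" if label == 1 else "below"
    print(f"({x}, {y}) -> class {label} ({side} the line y = x)")
```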

The above image shows red and green labels, which indicate the two classes in the problem.

As we can see, the line we defined above can easily separate the two classes.
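If we do not want to pick the line by hand, a perceptron can also learn it from labelled examples. Below is a minimal sketch of the classic perceptron learning rule on a made-up, linearly separable toy dataset; the data, learning rate, and epoch count are all assumptions for illustration.

```python
import numpy as np

# Hypothetical toy data: two clusters on either side of the line y = x.
X = np.array([[2.0, 3.0], [1.0, 4.0], [2.5, 3.5],   # class 1 (above y = x)
              [3.0, 1.0], [4.0, 2.0], [3.5, 0.5]])  # class 0 (below y = x)
y = np.array([1, 1, 1, 0, 0, 0])

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate

# Perceptron learning rule: nudge the weights whenever a point is
# misclassified, and stop once every point is on the correct side.
for epoch in range(20):
    errors = 0
    for xi, target in zip(X, y):
        pred = 1 if np.dot(w, xi) + b >= 0 else 0
        if pred != target:
            w += lr * (target - pred) * xi
            b += lr * (target - pred)
            errors += 1
    if errors == 0:   # converged: the data is linearly separable
        break

print("weights:", w, "bias:", b)
```

For this toy data the learned boundary ends up roughly along the line y = x, matching the separating line we wrote down by hand.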

But what if the problem is not linearly separable?

A single perceptron fails to solve problems that are not linearly separable.

As we saw, a single perceptron can only model a linear decision boundary.

So, to solve a non-linear problem, we combine multiple perceptrons into a network.

One such problem is the XOR problem, which leads to multilayered perceptrons.
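As a sketch of why stacking helps: XOR cannot be separated by any single line, but it can be expressed by composing perceptrons, for example as AND(OR(a, b), NAND(a, b)). The weights below are hand-picked for illustration rather than learned.

```python
import numpy as np

def step(z):
    return 1 if z >= 0 else 0

def perceptron(x, w, b):
    return step(np.dot(w, x) + b)

# XOR(a, b) = AND( OR(a, b), NAND(a, b) ), built from three perceptrons.
def xor(a, b):
    x = np.array([a, b])
    h1 = perceptron(x, np.array([1.0, 1.0]), -0.5)   # OR gate
    h2 = perceptron(x, np.array([-1.0, -1.0]), 1.5)  # NAND gate
    return perceptron(np.array([h1, h2]), np.array([1.0, 1.0]), -1.5)  # AND gate

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", xor(a, b))
```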

You can read about Multilayered perceptrons here.

In this blog, we read about how perceptrons can solve simple, linearly separable problems, and how they serve as building blocks of neural networks that solve more complex problems.
