Page 190 - AI Computer 10
Basic Components of Perceptron
A Perceptron is composed of key components that work together to process information and make predictions.
• Input Features: The perceptron takes multiple input features, each representing a characteristic of the input data.
• Weights: Each input feature is assigned a weight that determines its influence on the output. These weights are adjusted during training to find the optimal values.
• Summation Function: The perceptron calculates the weighted sum of its inputs, combining them with their respective weights.
[Figure: A perceptron with inputs X0–X3, each multiplied by its weight W0–W3, combined by the weighted-sum function Σ and passed through the activation function ƒ to produce the output Y.]
• Activation Function: The weighted sum is passed through an activation function, which compares it to a threshold value to produce a binary output (1 for success or 0 for failure).
• Output: The final output is determined by the activation function and is often used for binary classification tasks.
• Bias: The bias term helps the perceptron make adjustments independent of the input, improving its flexibility in learning.
• Learning Algorithm: The perceptron adjusts its weights and bias using a learning algorithm to minimize prediction errors.
These components enable the perceptron to learn from data and make predictions. While a single perceptron
can handle simple binary classification, complex tasks require multiple perceptrons organized into layers, forming
a neural network.
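The components above can be sketched as a small Python class. The names (`weighted_sum`, `activate`, `train`) and the choice of a step activation with threshold 0 are illustrative assumptions, not taken from the text:

```python
# A minimal perceptron sketch: inputs, weights, bias, summation,
# step activation, and the perceptron learning rule.
class Perceptron:
    def __init__(self, n_inputs, learning_rate=0.1):
        self.weights = [0.0] * n_inputs  # one weight per input feature
        self.bias = 0.0                  # bias term
        self.lr = learning_rate

    def weighted_sum(self, inputs):
        # Summation function: Z = W1*X1 + ... + Wn*Xn + b
        return sum(w * x for w, x in zip(self.weights, inputs)) + self.bias

    def activate(self, z):
        # Step activation: 1 if the weighted sum reaches the threshold (0), else 0
        return 1 if z >= 0 else 0

    def predict(self, inputs):
        return self.activate(self.weighted_sum(inputs))

    def train(self, inputs, target):
        # Learning rule: nudge weights and bias in proportion to the error
        error = target - self.predict(inputs)
        self.weights = [w + self.lr * error * x
                        for w, x in zip(self.weights, inputs)]
        self.bias += self.lr * error
```

Trained repeatedly on a linearly separable problem such as the AND gate, this single perceptron converges to weights and a bias that classify all four input pairs correctly.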
Knowledge Bot
The threshold value is used to make binary classification decisions based on the weighted sum of inputs.
How does a Perceptron work?
A weight is assigned to each input node of a perceptron, indicating the importance of that input in determining
the output. The perceptron’s output is calculated as a weighted sum of the inputs, which is then passed through
an activation function to decide the outcome.
The weighted sum is computed by the formula:
Z = W1 × X1 + W2 × X2 + … + Wn × Xn + b
where X1, X2, …, Xn are the inputs, W1, W2, …, Wn are the respective weights, and b is the bias.
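A quick worked example of the formula, using made-up numbers (two inputs, weights W1 = 0.6 and W2 = 0.4, and bias b = -0.5 are chosen purely for illustration):

```python
# Worked example of Z = W1*X1 + W2*X2 + b with illustrative numbers.
X = [1, 0]        # inputs X1, X2
W = [0.6, 0.4]    # weights W1, W2
b = -0.5          # bias

Z = sum(w * x for w, x in zip(W, X)) + b   # 0.6*1 + 0.4*0 + (-0.5) ≈ 0.1
output = 1 if Z >= 0 else 0                # step activation with threshold 0
print(Z, output)
```

Since the weighted sum (about 0.1) meets the threshold of 0, the activation function outputs 1.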
56