ML Unit 6

Q1) Artificial Neural Network (ANN), Architecture, Types based on layers.
An ANN is a computational model inspired by the structure and function of the human brain. It consists of interconnected nodes (neurons) organized into layers, and is widely used in image recognition and NLP.
Architecture of ANN:
1) Input Layer: Receives the input features.
2) Hidden Layer: Performs computations and feature transformations through weights, biases, and activation functions.
3) Output Layer: Produces the final output (classification, regression, etc.).
4) Connections: Neurons in adjacent layers are connected by weights, which are adjusted during training to minimize error.
Types of ANN based on layers:
1) Single-Layer Perceptron: Contains one input and one output layer. Can only handle linearly separable data.
2) Multi-Layer Perceptron (MLP): Contains an input layer, one or more hidden layers, and an output layer. Can model complex, non-linear relationships (see the sketch below).
3) Deep Neural Network (DNN): An MLP with many hidden layers, used for advanced tasks such as image classification, speech recognition, and NLP.
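A minimal sketch of an MLP forward pass, assuming a toy 3-4-1 network with random weights (the layer sizes and inputs here are illustrative, not taken from the notes):

```python
import numpy as np

def relu(x):
    # ReLU activation: max(0, x) element-wise
    return np.maximum(0, x)

def mlp_forward(x, W1, b1, W2, b2):
    # Hidden layer: weighted sum plus bias, passed through a non-linear activation
    h = relu(W1 @ x + b1)
    # Output layer: linear combination of the hidden activations
    return W2 @ h + b2

rng = np.random.default_rng(0)
x = rng.normal(size=3)                          # 3 input features
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # 3 inputs -> 4 hidden units
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)   # 4 hidden units -> 1 output
print(mlp_forward(x, W1, b1, W2, b2))
```

During training, the weights W1, W2 and biases b1, b2 would be adjusted to minimize the error, as described above.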
Q2) Recurrent Neural Network (RNN). Its Applications.
An RNN is a type of neural network designed for sequential data such as time series, speech, or text. It contains loops that allow it to maintain a memory of previous inputs, making it well suited to time-dependent tasks. At each time step, a neuron receives the current input and also takes into account the hidden state from the previous time step (see the sketch below).
Example: Task: Predict the next word in a sentence. Input: "The sky is _". The RNN processes the previous words ("The", "sky", "is") and predicts "blue" as the next word.
Applications:
1) Natural Language Processing: Tasks like language translation, text generation, and sentiment analysis.
2) Speech Recognition: Transcribing audio to text.
3) Time-Series Analysis: Stock price prediction, weather forecasting.
4) Image Captioning: Generating captions for images.
5) Chatbots & Virtual Assistants: Understanding and generating responses by processing user input sequentially.
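A minimal sketch of the recurrence described above, assuming a plain (Elman-style) RNN cell with tanh activation; the dimensions, random weights, and fake word embeddings are illustrative:

```python
import numpy as np

def rnn_step(x_t, h_prev, Wx, Wh, b):
    # The new hidden state depends on the current input and the previous hidden state
    return np.tanh(Wx @ x_t + Wh @ h_prev + b)

rng = np.random.default_rng(0)
input_dim, hidden_dim = 5, 8
Wx = rng.normal(size=(hidden_dim, input_dim))
Wh = rng.normal(size=(hidden_dim, hidden_dim))
b = np.zeros(hidden_dim)

h = np.zeros(hidden_dim)                        # initial hidden state (empty memory)
sequence = rng.normal(size=(3, input_dim))      # stand-ins for "The", "sky", "is"
for x_t in sequence:
    h = rnn_step(x_t, h, Wx, Wh, b)             # memory carried forward step by step
print(h)                                        # final state used to predict the next word
```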
Q3) Convolutional Neural Network (CNN). Convolution Layer & Hidden Layer
A CNN is a type of neural network designed for processing grid-like data such as images. It uses convolutional layers to automatically detect patterns like edges, textures, and shapes, and is widely used in image recognition and computer vision.
Layers of CNN:
1) Convolution Layer: The heart of a CNN. It applies filters (small matrices) to the input to detect specific features such as edges, colours, or textures. The layer performs a mathematical operation called "convolution", in which the filter slides across the image, analysing one small section at a time. The output is a feature map that highlights important areas or patterns (see the sketch below).
2) Hidden Layers: In a CNN, the hidden layers include multiple convolution layers, pooling layers, and sometimes fully connected layers. These layers work together to extract features at various levels: i) early hidden layers detect low-level features like edges or corners; ii) deeper layers detect complex structures like the shapes of objects.
Example: Task: Classify an image of a cat. 1) Convolution layers detect edges, fur patterns, etc. 2) Pooling layers simplify the feature maps while retaining important details. 3) The output layer predicts the class ("Cat").
Applications: Image classification, object detection, face recognition, medical imaging.
Q4) Activation Function in Neural Networks. Any two functions with example.
An activation function is a mathematical function used in an ANN to determine whether a neuron should be activated or not. It introduces non-linearity into the network, enabling it to learn and model complex relationships, and it determines the output of a neuron based on its input. Types (see the sketch below for each in code):
1) Sigmoid: Maps input values to the range 0 to 1, making it useful for binary classification tasks. Formula: σ(x) = 1 / (1 + e^(-x)). Example: if the input to a neuron is x = 2, the sigmoid function outputs σ(2) = 1 / (1 + e^(-2)) ≈ 0.88.
2) ReLU (Rectified Linear Unit): Outputs the input directly if it is positive; otherwise it outputs 0. Widely used in deep learning. Formula: ReLU(x) = max(0, x). Example: if the input is x = -3, the output is ReLU(-3) = max(0, -3) = 0; if the input is x = 4, the output is ReLU(4) = max(0, 4) = 4.
3) Linear: Used in output layers for regression tasks. f(x) = x.
4) Tanh: Outputs values between -1 and 1. f(x) = tanh(x).
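A minimal sketch of the four activation functions listed above, evaluated at the example inputs from the notes:

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Passes positive inputs through unchanged, clamps negative inputs to 0
    return np.maximum(0, x)

def linear(x):
    # Identity function, used in output layers for regression
    return x

def tanh(x):
    # Squashes inputs into the range (-1, 1)
    return np.tanh(x)

print(sigmoid(2))   # ~0.88
print(relu(-3))     # 0
print(relu(4))      # 4
print(linear(2.5))  # 2.5
print(tanh(1))      # ~0.76
```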
Q5) Backpropagation Network. Example and Characteristics
A backpropagation network is a neural network that uses the backpropagation algorithm to train its weights. It minimizes the error between predicted and actual outputs by adjusting the weights through gradient descent, and it is a supervised learning algorithm. The process involves two main steps:
1) Forward propagation: The input is passed through the network, layer by layer, to produce an output.
2) Backward propagation: The error is propagated backward through the network to update the weights.
Characteristics:
1) Supervised Learning: Requires labelled data for training.
2) Iterative Process: Updates weights iteratively until the error converges.
3) Differentiable Activation Functions: Works with functions like sigmoid, ReLU, or tanh.
4) Gradient-Based: Uses gradients to minimize the loss function.
5) Error Calculation: Compares the network's output with the actual target values.
6) Layered Architecture: Consists of an input layer, one or more hidden layers, and an output layer.
7) Learning Rate: Controls how much the weights are updated during each iteration.
Example: Task: Predict the price of a house using features like area and location. Process: 1) The forward pass calculates the predicted price. 2) Backpropagation adjusts the weights based on the error between the predicted and actual prices (see the sketch below).
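A minimal sketch of the forward pass / backward update loop for the house-price example, reduced to a single linear neuron so the gradient steps are easy to follow; the toy data, learning rate, and number of iterations are illustrative assumptions, and a full backpropagation network would repeat the same idea layer by layer:

```python
import numpy as np

# Toy data: [area (100s of sq.m), location score] -> price
X = np.array([[1.2, 3.0], [2.5, 1.0], [3.0, 4.0]])
y = np.array([30.0, 40.0, 70.0])

w = np.zeros(2)
b = 0.0
lr = 0.01  # learning rate: controls the size of each weight update

for epoch in range(200):
    # Forward pass: compute the predicted prices
    y_pred = X @ w + b
    error = y_pred - y
    # Backward pass: gradients of the mean squared error w.r.t. w and b
    grad_w = 2 * X.T @ error / len(y)
    grad_b = 2 * error.mean()
    # Gradient descent update
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)        # learned weights and bias
print(X @ w + b)   # predicted prices after training
```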
Q6) Functional Link Artificial Neural Network (FLANN). Its merits over ANNs.
FLANN is a type of single-layer neural network that enhances the input features by expanding them into a higher-dimensional feature space, which eliminates the need for hidden layers.
Key Features: Transforms the inputs using functions such as polynomial or trigonometric expansions, and performs non-linear mapping without requiring hidden layers (see the sketch below).
Merits of FLANN over ANNs:
1) Simpler Architecture: No hidden layers, reducing complexity.
2) Faster Training: Requires fewer parameters, leading to quicker convergence.
3) Effective for Non-Linear Problems: Feature expansion handles non-linearity efficiently.
4) Reduced Computational Cost: Less computation compared to multi-layer networks.
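A minimal sketch of the FLANN idea: a trigonometric expansion of the input followed by a single linear layer fitted with least squares (the expansion order, toy data, and fitting method are illustrative assumptions):

```python
import numpy as np

def expand(x, order=2):
    # Functional expansion: original feature plus sin/cos terms of increasing frequency
    terms = [x]
    for k in range(1, order + 1):
        terms += [np.sin(k * np.pi * x), np.cos(k * np.pi * x)]
    return np.column_stack(terms)

# Toy non-linear regression problem: y = x^2 with a little noise
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=50)
y = x**2 + 0.05 * rng.normal(size=50)

Phi = expand(x)                               # expanded, higher-dimensional feature space
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # single linear output layer, no hidden layers
print(np.abs(Phi @ w - y).mean())             # mean absolute training error
```

The non-linearity is handled entirely by the feature expansion, so only the output weights need to be learned.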
Q7) Radial Basis Function (RBF) Networks. Building blocks of RBF Networks
An RBF network is a type of artificial neural network that uses radial basis functions as activation functions. It is typically used for regression, classification, and time-series prediction. The idea is to measure the similarity between the input and some centre points using an RBF.
Building blocks of an RBF network:
1) Input Layer: Passes the input features to the network.
2) Hidden Layer: Uses an RBF such as the Gaussian function to compute the similarity of the input to the centres, based on the distance between the input and each centre (see the sketch below).
3) Output Layer: Combines the weighted outputs of the hidden layer to produce the final result.
Advantages: 1) RBF networks can model complex, non-linear relationships in the data effectively. 2) They perform well with smaller datasets.
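A minimal sketch of an RBF network forward pass with Gaussian hidden units; the centres, width, and output weights are hypothetical values chosen for illustration, whereas in practice the centres are usually found by clustering and the weights by training:

```python
import numpy as np

def gaussian_rbf(x, centre, width=1.0):
    # Similarity decreases as the distance between the input and the centre grows
    return np.exp(-np.sum((x - centre) ** 2) / (2 * width ** 2))

def rbf_forward(x, centres, weights, width=1.0):
    # Hidden layer: one Gaussian similarity value per centre
    h = np.array([gaussian_rbf(x, c, width) for c in centres])
    # Output layer: weighted sum of the hidden activations
    return h @ weights

centres = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])  # hypothetical centres
weights = np.array([0.5, -1.0, 2.0])                      # hypothetical output weights
x = np.array([0.8, 0.9])
print(rbf_forward(x, centres, weights))
```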
