LSTM (Long Short-Term Memory)

LSTM networks are an extension of recurrent neural networks (RNNs). Unlike plain RNNs, they can retain information over long periods of time. In an RNN there is no fine-grained control over which part of the context is carried forward and which is ‘forgotten’. The other issues with RNNs occur during…
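The gating idea can be sketched in a few lines of Python: a hypothetical single-unit LSTM step (function names and weights here are illustrative, not trained and not from the full article) in which the forget and input gates decide how much of the old cell state to keep and how much new information to write:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    # w maps each gate to (input weight, recurrent weight, bias)
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])    # forget gate: keep old state?
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])    # input gate: accept new info?
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2])  # candidate values
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])    # output gate
    c = f * c_prev + i * g        # gated cell-state update
    h = o * math.tanh(c)          # new hidden state
    return h, c

# gate biases saturated so this unit simply preserves its cell state
w = {"f": (0.0, 0.0, 10.0), "i": (0.0, 0.0, -10.0),
     "g": (1.0, 0.0, 0.0), "o": (0.0, 0.0, 10.0)}
h, c = lstm_step(1.0, 0.0, 0.5, w)
```

With the forget gate near 1 and the input gate near 0, the cell state survives the step almost unchanged; that per-gate control is exactly what a plain RNN lacks.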

Backpropagation

The training samples are passed through the network, and the network's output is compared with the actual output; the difference between them is the error. This error is used to adjust the weights of the neurons so that the error decreases gradually. This is done using the backpropagation algorithm, also called backprop. Iteratively passing batches of…
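As a toy illustration (a single sigmoid neuron with squared error, not the article's code), one backprop step is just the chain rule applied from the error back to each weight:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_step(w, b, x, y, lr=0.5):
    # forward pass: compute the prediction
    z = w * x + b
    p = sigmoid(z)
    # backward pass: chain rule for the squared error L = (p - y)^2
    dL_dp = 2.0 * (p - y)          # how the loss changes with the prediction
    dp_dz = p * (1.0 - p)          # sigmoid derivative
    w -= lr * dL_dp * dp_dz * x    # dz/dw = x
    b -= lr * dL_dp * dp_dz        # dz/db = 1
    return w, b, (p - y) ** 2

w, b = 0.0, 0.0
for _ in range(200):
    w, b, loss = train_step(w, b, x=1.0, y=1.0)
# loss shrinks steadily as the weights are nudged against the gradient
```

Real networks repeat exactly this, layer by layer, propagating each gradient backwards through every weight.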

Feedforward Algorithm

In a feedforward neural network, the output of one layer is used as the input to the next. Forward propagation means information moves only from input to output: there are no loops in the network, and information is always fed forward. Input data passes into a layer where…
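A minimal sketch of the idea, assuming a tiny two-layer network with hand-picked (not learned) weights: each layer consumes the previous layer's output, and nothing ever flows backwards:

```python
def dense(inputs, weights, biases, act):
    # each unit: weighted sum of ALL inputs plus a bias, then the activation
    return [act(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

relu = lambda z: max(0.0, z)
identity = lambda z: z

x = [1.0, 2.0]                                               # input layer
h = dense(x, [[0.5, -0.25], [1.0, 1.0]], [0.0, -1.0], relu)  # hidden layer
y = dense(h, [[1.0, 1.0]], [0.5], identity)                  # output layer
```

The only data dependency runs x → h → y, which is the "no loops" property the excerpt describes.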

Activation Functions

An activation function helps a neural network learn complex relationships and patterns in data. It takes the output signal from the previous cell and converts it into a form that can be taken as input to the next cell. The activation function introduces non-linearity into the output of a neuron…
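Three of the most common activation functions can be written in a few lines each (a plain-Python sketch, independent of the full article):

```python
import math

def sigmoid(z):
    # squashes any real number into (0, 1); useful for gates and probabilities
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):
    # squashes into (-1, 1) and is zero-centred
    return math.tanh(z)

def relu(z):
    # passes positive values through, zeroes out negatives; cheap and very common
    return max(0.0, z)

# each of these bends a straight line: without a non-linearity,
# stacked layers would collapse into a single linear transformation
```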

Artificial Neural Network

Artificial Neural Networks (ANNs) are the foundational building blocks of deep learning. Neural networks emerged from a very popular machine learning algorithm called the perceptron. A neuron is the basic unit of computation in a neural network; it is also called a node or unit. The leftmost layer in this network is…
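A single neuron's computation can be sketched as a weighted sum plus a bias, passed through an activation (a step function here, purely for illustration; the inputs and weights are made up):

```python
def neuron(inputs, weights, bias):
    # pre-activation: weighted sum of the inputs plus a bias
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    # step activation: the neuron "fires" (1) or stays silent (0)
    return 1 if z > 0 else 0

out = neuron([1.0, 0.5], weights=[0.8, -0.4], bias=-0.1)
```

A whole network is just many of these units wired together, layer by layer.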

Perceptron

The perceptron is an algorithm for supervised learning of binary classifiers. A binary classifier is a function that decides whether or not an input, represented by a vector of numbers, belongs to some specific class. The perceptron is the simplest type of neural network model. It is a model of a…
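The classic perceptron learning rule fits in a few lines; here is a sketch that learns the AND function (the data and hyperparameters are illustrative, not from the article):

```python
def predict(w, b, x):
    # fire iff the weighted sum crosses the threshold
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def train(data, epochs=20, lr=1):
    w, b = [0, 0], 0
    for _ in range(epochs):
        for x, y in data:
            err = y - predict(w, b, x)  # -1, 0, or +1
            # nudge the weights towards the correct answer
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

AND = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(AND)
```

Because AND is linearly separable, the rule converges to weights that classify all four inputs correctly.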

What is Deep Learning?

Deep learning is a subset of machine learning, inspired by the human brain and based on artificial neural networks. The “deep” in deep learning refers to the number of layers the data passes through before reaching the output layer. Neural networks use a…

Support Vector Machines

A support vector machine (SVM) is a supervised machine learning model that can be used for both classification and regression, though SVMs have been used most extensively for complex classification problems such as image recognition and voice detection. The SVM algorithm outputs an optimal hyperplane that best separates the classes. The hyperplane is a boundary that…
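The hyperplane itself is easy to picture in code: with a weight vector w and bias b (hand-chosen below, not learned), the side of the boundary a point falls on is just the sign of w·x + b, and the margin SVM training maximises is the distance to that boundary:

```python
import math

def decision(w, b, x):
    # signed score: positive on one side of the hyperplane w.x + b = 0, negative on the other
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def classify(w, b, x):
    return 1 if decision(w, b, x) >= 0 else -1

def margin(w, b, x):
    # geometric distance from x to the hyperplane; SVM training
    # picks w, b to maximise the smallest such distance over the data
    return abs(decision(w, b, x)) / math.sqrt(sum(wi * wi for wi in w))

w, b = [1.0, 1.0], -3.0  # illustrative separating hyperplane x1 + x2 = 3
```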

Confusion Matrix

A confusion matrix is a fundamental tool in the field of machine learning and data science, often used to assess the performance of classification models. It provides a detailed breakdown of the model’s predictions compared to the actual ground truth, allowing us to evaluate various aspects of model performance. The…
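For a binary classifier the breakdown has four cells, from which the usual metrics follow directly (a plain-Python sketch with made-up labels):

```python
def confusion_matrix(y_true, y_pred):
    # counts for labels {0, 1}: true/false positives and negatives
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, fp, fn, tn

y_true = [1, 0, 1, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1]
tp, fp, fn, tn = confusion_matrix(y_true, y_pred)
precision = tp / (tp + fp)  # of the predicted positives, how many were right
recall    = tp / (tp + fn)  # of the actual positives, how many were found
```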

Correlation vs Causation

Introduction In the quest to understand relationships between variables, two terms consistently surface: correlation and causation. Despite their apparent similarity, they have different implications and uses. This distinction is more than just a technicality; it is a fundamental concept that every data analyst or scientist needs to grasp. The Basics of…
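Correlation, at least, is easy to quantify; here is the standard Pearson coefficient in plain Python (a general-purpose sketch, not code from the article). A coefficient of 1.0 says two variables move together perfectly, but says nothing about which, if either, causes the other:

```python
import math

def pearson_r(xs, ys):
    # Pearson correlation: covariance divided by the product of standard deviations
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```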
