Gauss-Markov Theorem: OLS is BLUE!

The Gauss-Markov theorem states that if your linear regression model satisfies the classical assumptions, then the ordinary least squares (OLS) estimator is the best linear unbiased estimator (BLUE): it has the smallest variance among all linear unbiased estimators. There are five Gauss-Markov assumptions. Linearity: the model must be linear in the parameters being estimated. Random:…
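The unbiasedness part of the theorem can be checked by simulation. The sketch below is illustrative only: the true coefficients, sample size, and noise distribution are all assumed values, not taken from the post. Under the classical assumptions, the OLS slope estimates should average out to the true slope.

```python
# Sketch: simulating the unbiasedness of the OLS slope estimator.
# The true coefficients, grid of x values, and noise sd are assumptions.
import random

random.seed(0)

def ols_slope(xs, ys):
    """Closed-form OLS slope for the simple regression y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

true_b = 2.0
xs = [i / 10 for i in range(50)]

# Fit OLS on many independent samples drawn from the same model;
# the average of the estimates should sit close to the true slope.
estimates = [
    ols_slope(xs, [1.0 + true_b * x + random.gauss(0, 1) for x in xs])
    for _ in range(2000)
]
mean_est = sum(estimates) / len(estimates)
print(mean_est)
```

Averaging the 2,000 estimates gives a value very close to the true slope of 2.0, which is what unbiasedness predicts; the "smallest variance" half of BLUE would need a comparison against another linear unbiased estimator.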

Gradient Descent

Gradient descent is an optimization algorithm used to find the parameter values that minimize a cost function. The average of the squared differences between the predicted values of y and the actual values of y is called the cost function (this particular one is the mean squared error). It is also called…
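The update rule can be sketched in a few lines. Assumed details: the toy data, learning rate, and iteration count below are illustrative choices, and the cost is the mean squared error described above.

```python
# Sketch of gradient descent minimizing the MSE cost for a line y = w*x + b.
# The data, learning rate, and iteration count are illustrative assumptions.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]   # exactly y = 2x + 1

w, b = 0.0, 0.0
lr = 0.05          # learning rate (step size)
n = len(xs)

for _ in range(5000):
    # Cost: J = (1/n) * sum((w*x + b - y)^2); below are dJ/dw and dJ/db.
    grad_w = (2 / n) * sum((w * x + b - y) * x for x, y in zip(xs, ys))
    grad_b = (2 / n) * sum((w * x + b - y) for x, y in zip(xs, ys))
    w -= lr * grad_w   # step against the gradient
    b -= lr * grad_b

print(w, b)   # converges toward w ≈ 2, b ≈ 1
```

Each iteration moves the parameters a small step against the gradient of the cost, so the fitted line approaches the true one; too large a learning rate would make the updates diverge instead.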

Heteroscedasticity

A random variable is said to be heteroscedastic when different subpopulations have different variability (standard deviations). One of the basic assumptions of linear regression is that the data should be homoscedastic, i.e., heteroscedasticity should not be present. When this assumption is violated, the ordinary least squares (OLS) estimators…
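A minimal sketch of what this looks like in data, under assumed values: errors whose standard deviation grows with x (a common form of heteroscedasticity), split into two subpopulations whose spreads are then compared.

```python
# Illustrative sketch of heteroscedastic errors; the sample size and the
# sd-proportional-to-x error model are assumptions, not from the post.
import random
from statistics import stdev

random.seed(1)

xs = [x / 100 for x in range(1, 1001)]           # x from 0.01 to 10.0
errors = [random.gauss(0, 0.5 * x) for x in xs]  # error sd grows with x

# Two subpopulations of the same random variable:
low_x = [e for x, e in zip(xs, errors) if x <= 5]
high_x = [e for x, e in zip(xs, errors) if x > 5]

# Clearly different variability across subpopulations -> heteroscedastic.
print(stdev(low_x), stdev(high_x))
```

The spread of the errors in the high-x subpopulation comes out several times larger than in the low-x one, which is exactly the pattern (often a fan shape in a residual plot) that violates the homoscedasticity assumption.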

Linear Regression

Linear regression is a supervised learning algorithm in machine learning (ML); I would call it the starting point of anyone's ML journey! Regression is a method of modelling a target value based on independent predictors. This method is mostly used for forecasting and finding out the cause…
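As a starting-point example, simple linear regression has a closed-form least-squares fit. The toy data below are made-up values for illustration, chosen to lie roughly on y = 2x.

```python
# Minimal sketch of simple linear regression via the closed-form
# least-squares fit; the toy data are illustrative assumptions.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]   # roughly y = 2x

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n

# Slope and intercept minimizing the sum of squared residuals.
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx

print(round(slope, 2), round(intercept, 2))   # prints: 1.99 0.05
```

The fitted line recovers a slope near 2 and an intercept near 0, i.e., the model that generated the (noisy) target values from the predictor.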
