A random variable is said to be heteroscedastic when different subpopulations have different variability (standard deviations).

One of the basic assumptions of linear regression is that the data should be homoscedastic, i.e., free of heteroscedasticity. When this assumption is violated, the Ordinary Least Squares (OLS) estimators are no longer the best: they remain unbiased, but they no longer have the smallest variance among all linear unbiased estimators.

The existence of heteroscedasticity creates problems in regression analysis because the model assumes the error terms have constant variance. Its presence can often be spotted as a cone-like pattern in a scatter plot of residuals versus fitted values.
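To make the cone pattern concrete, here is a small sketch (using simulated data, with numpy only) that fits a line by OLS and compares the residual spread at low versus high fitted values; under heteroscedasticity the spread grows with the fitted value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate heteroscedastic data: the noise standard deviation grows with x.
n = 500
x = rng.uniform(1, 10, n)
y = 2.0 + 3.0 * x + rng.normal(0, 0.5 * x, n)  # error std proportional to x

# Fit OLS: y = b0 + b1 * x
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ beta
resid = y - fitted

# The "cone" in a residual-vs-fitted plot means the residual spread
# is larger where the fitted values are larger.
lo = resid[fitted < np.median(fitted)]
hi = resid[fitted >= np.median(fitted)]
print(f"residual std (low fitted):  {lo.std():.2f}")
print(f"residual std (high fitted): {hi.std():.2f}")
```

Plotting `resid` against `fitted` with matplotlib would show the same effect visually as a widening cone.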

There is no fixed procedure for eliminating heteroscedasticity. However, some techniques can reduce it:

- Log-transforming the data: a series that grows exponentially often shows increasing variability, which a log transformation can stabilize.
- Using weighted linear regression: here, the OLS method is applied to weighted values of X and Y, with each observation weighted inversely to its error variance. When the error spread grows with the magnitude of the dependent variable, larger observations receive smaller weights.
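The weighted-regression idea above can be sketched as follows. This is a minimal illustration on simulated data, assuming the error standard deviation is proportional to x (so the weights 1/x² are the reciprocal error variances); in practice the variance structure must be estimated or guessed from the residual plot.

```python
import numpy as np

rng = np.random.default_rng(1)

# Same heteroscedastic setup: error std proportional to x.
n = 500
x = rng.uniform(1, 10, n)
y = 2.0 + 3.0 * x + rng.normal(0, 0.5 * x, n)

X = np.column_stack([np.ones(n), x])

# Plain OLS for comparison.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Weighted least squares: weight each observation by 1 / variance.
# Since the error std is proportional to x here, variance ~ x^2, so w = 1/x^2.
w = 1.0 / x**2
# Solve the normal equations (X' W X) beta = X' W y.
XtW = X.T * w            # broadcasts w across columns of X.T
beta_wls = np.linalg.solve(XtW @ X, XtW @ y)

print("OLS estimates:", beta_ols)   # both unbiased for (2, 3) ...
print("WLS estimates:", beta_wls)   # ... but WLS has smaller variance here
```

Equivalently, dividing both sides of the model by x (so the transformed errors have constant variance) and running OLS on the transformed data gives the same estimates; libraries such as statsmodels expose this directly as a WLS model.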

Happy Learning!