Multicollinearity can lead to unreliable regression results and should be addressed (from the summary of Introduction to Econometrics by Christopher Dougherty)
Multicollinearity refers to the situation in which two or more independent variables in a regression model are highly correlated with each other. When multicollinearity is present, the regression model struggles to separate the individual effects of the correlated variables on the dependent variable, which can make the results unreliable.

In the presence of multicollinearity, the estimated coefficients of the correlated variables may be unstable and have large standard errors. The coefficients may therefore fail to represent the true relationships between the independent variables and the dependent variable, and the statistical significance of the coefficients may be distorted, leading to incorrect inferences about the relationships in the data.

Multicollinearity can also make the results of a regression analysis hard to interpret. For example, it may be difficult to determine which variables are truly important in explaining the variation in the dependent variable, since the effects of the correlated variables are confounded. This can lead to misleading conclusions about the relationships between the variables in the model.

Several strategies can be used to address multicollinearity. One approach is to identify the highly correlated variables and drop one of them from the regression model. Another is to apply techniques such as ridge regression or principal component analysis, which mitigate the effects of multicollinearity on the regression results.

In short, multicollinearity can be detrimental to the reliability and interpretability of regression results. It is important to detect its presence in a regression analysis and take steps to address it in order to obtain accurate and meaningful insights from the data.
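The ideas above can be sketched with a small simulation. This is an illustrative example, not from the text: the simulated data, the `vif` helper, and the penalty value `lam` are all assumptions chosen for demonstration. It builds two nearly identical regressors, fits ordinary least squares, computes a variance inflation factor (VIF) to detect the collinearity, and then applies ridge regression as one mitigation strategy, using NumPy only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulate two highly correlated regressors: x2 is x1 plus tiny noise.
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)
y = 1.0 + 2.0 * x1 + 2.0 * x2 + rng.normal(size=n)

# Design matrix with an intercept column.
X = np.column_stack([np.ones(n), x1, x2])

# Ordinary least squares fit.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Coefficient standard errors: sigma^2 * (X'X)^{-1} diagonal.
# With collinear regressors, (X'X) is nearly singular, so these blow up.
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))

def vif(X, j):
    """VIF_j = 1 / (1 - R^2_j), where R^2_j comes from regressing
    column j of X on the remaining columns. Values far above 10 are
    a common warning sign of serious multicollinearity."""
    others = np.delete(X, j, axis=1)
    coef, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
    fitted = others @ coef
    ss_res = np.sum((X[:, j] - fitted) ** 2)
    ss_tot = np.sum((X[:, j] - X[:, j].mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return 1.0 / (1.0 - r2)

print("corr(x1, x2):", np.corrcoef(x1, x2)[0, 1])  # close to 1
print("OLS standard errors:", se)                  # inflated for x1, x2
print("VIF for x1:", vif(X, 1))                    # far above 10

# Ridge regression: add lam * I to X'X before inverting, shrinking the
# coefficients and trading a little bias for a large variance reduction.
lam = 1.0
ridge = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
print("ridge coefficients:", ridge)
```

Note that the ridge solution no longer estimates the two slopes separately with any precision; it stabilises the fit by spreading the shared effect across the correlated regressors, which matches the point that their individual effects cannot be disentangled.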