The Implication of R-Squared
The R squared value, also called the coefficient of determination, measures how well the data points fit the regression equation. More specifically, it measures how much of the variation in the dependent variable is explained by the independent variables in the regression. The value of R squared can change as variables are added to or removed from the model. R squared values are typically used as a measure of a model's effectiveness; hence, a high R squared value (anything above 55%) can be an indicator of a capable model.
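As a rough illustration, here is a minimal sketch of how R squared is computed from the residual and total sums of squares. The data and variable names are made up for this example and are not from the post; NumPy is assumed to be available.

```python
import numpy as np

# Hypothetical data: a single predictor x and an outcome y.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Fit a least-squares line y = b0 + b1 * x.
b1, b0 = np.polyfit(x, y, 1)
y_pred = b0 + b1 * x

# R squared = 1 - (residual sum of squares / total sum of squares),
# i.e., the share of the variation in y explained by the model.
ss_res = np.sum((y - y_pred) ** 2)
ss_tot = np.sum((y - np.mean(y)) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"R-squared: {r_squared:.3f}")
```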
Understanding Regression – Part 2
The last article focused on what regression is and how its results can be interpreted. It mentioned that a number of assumptions are required for the model to be valid. These assumptions matter because they underpin the reasons a regression line works well as a prediction, and they are stated in terms of the residuals: the differences between the values of the dependent variable predicted by the regression and the actual observed values.
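The sketch below, which reuses the made-up data from the earlier example, shows what the residuals are in code: the actual y values minus the values predicted by the fitted line. It is illustrative only, not code from the article.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

b1, b0 = np.polyfit(x, y, 1)     # fitted slope and intercept
residuals = y - (b0 + b1 * x)    # actual value minus predicted value

# The regression assumptions are assessed by examining these residuals,
# for example by plotting them against the predicted values.
print(residuals)
```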
Understanding Regression – Part 1
Decision makers are always looking for ways to understand the effects of their actions. Managers often assume that finding a correlation between two items means they understand the relationship between those variables; however, as stated in a previous blog article, Beware of Correlations, a correlation may not tell the whole story, and it can only describe the relationship between two variables at a time. Regression allows us to understand more involved relationships between several explanatory variables and an outcome variable.
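As a minimal sketch of that idea, the example below fits a regression with two explanatory variables and one outcome, something a single pairwise correlation cannot capture. The data, variable names, and the use of scikit-learn are assumptions made for illustration, not part of the original article.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
ad_spend = rng.uniform(1, 10, 100)       # hypothetical predictor 1
store_count = rng.integers(1, 20, 100)   # hypothetical predictor 2
sales = 3 * ad_spend + 0.5 * store_count + rng.normal(0, 1, 100)

X = np.column_stack([ad_spend, store_count])
model = LinearRegression().fit(X, sales)

# Each coefficient estimates the effect of one variable on the outcome
# while holding the other variable fixed.
print(model.coef_, model.intercept_)
```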