True/False Questions
1. In regression analysis, every time that an insignificant and unimportant variable is added to the regression model, the R^{2} decreases.
Answer: False Type: Concept Difficulty: Medium
2. The more variables that are added to the regression model, the better the model will fit the data.
Answer: False Type: Concept Difficulty: Easy
3. In multiple regression there is no need to consider the F-test; only the t-tests are important.
Answer: False Type: Concept Difficulty: Medium
4. Using multiple regression to regress five independent variables to predict y will give the same result as five separate regressions of y versus each independent variable.
Answer: False Type: Concept Difficulty: Medium
5. The degrees of freedom for error in the multiple regression model are: n - (k + 1).
Answer: True Type: Concept Difficulty: Easy
6. In a multiple regression model which has k variables, there are k + 1 parameters to be estimated.
Answer: True Type: Concept Difficulty: Easy
7. The multiple regression model is an extension of the simple regression model, involving more than two dependent variables.
Answer: False Type: Concept Difficulty: Easy
8. The multiple regression model assumptions for e, the error term, are the same as for the simple linear regression model.
Answer: True Type: Concept Difficulty: Easy
9. When k = 2 variables, the graph of the model is a plane in three dimensions, rather than a straight line as in the case when k = 1.
Answer: True Type: Concept Difficulty: Easy
10. The symbol for the y-intercept in the multiple regression model is b_{0}.
Answer: True Type: Concept Difficulty: Easy
11. If H_{0}: b_{1} = b_{2} = b_{3} = . . . = b_{k} = 0 is rejected, then we can conclude that there is no linear relationship between y and any of the k independent variables in the model.
Answer: False Type: Concept Difficulty: Medium
12. The F-test of the ANOVA used in multiple regression is not equivalent to the t-test for the significance of the slope parameter used in simple linear regression.
Answer: True Type: Concept Difficulty: Medium
13. In testing for the existence of a linear relationship between y and any of the k variables, if the p-value for this test is 0.0001, then the null hypothesis is rejected.
Answer: True Type: Concept Difficulty: Medium
14. If H_{0}: b_{1} = b_{2} = b_{3} = 0 is rejected, then we know that there is a regression relationship between all three variables and y.
Answer: False Type: Concept Difficulty: Medium
15. The Mean Square Error, or MSE, is a biased estimator for the variance of the population of errors, e, denoted by σ^{2}.
Answer: False Type: Concept Difficulty: Medium
16. The square root of the MSE is the standard error of the estimate.
Answer: True Type: Concept Difficulty: Medium
17. The adjusted multiple coefficient of determination always increases as new variables are added to the model, just as R^{2} does.
Answer: False Type: Concept Difficulty: Medium
18. The adjusted multiple coefficient of determination is the R^{2} with both SSE and SST divided by their degrees of freedom.
Answer: True Type: Concept Difficulty: Medium
19. When carrying out individual t-tests on each of the variables in the multiple regression model, each test is independent of the others.
Answer: False Type: Concept Difficulty: Medium
20. When independent variables are correlated with each other, multicollinearity is present.
Answer: True Type: Concept Difficulty: Easy
21. Multicollinearity may cause the signs of some estimated regression parameters to be the opposite of what we would expect.
Answer: True Type: Concept Difficulty: Medium
22. When using qualitative variables, we use an indicator variable for each of the r categories of the variable in the model.
Answer: False Type: Concept Difficulty: Medium
23. When the adjusted coefficient of determination decreases when a term is included in the multiple regression model, then that term should be retained in the model.
Answer: False Type: Concept Difficulty: Medium
24. A multiple regression model should be as parsimonious as possible.
Answer: True Type: Concept Difficulty: Medium
25. When powers of the variables are used in the model, a linear model is no longer appropriate.
Answer: False Type: Concept Difficulty: Medium
26. Inflated values of variances and standard errors of regression coefficient estimators may be a sign of multicollinearity.
Answer: True Type: Concept Difficulty: Medium
27. Removing collinear variables from a regression model is the easiest way of solving problems of multicollinearity.
Answer: True Type: Concept Difficulty: Easy
28. The test to check for first-order autocorrelation is the Durbin-Watson test.
Answer: True Type: Concept Difficulty: Easy
29. Multiple regression uses one independent variable.
Answer: False Type: Concept Difficulty: Easy
30. There are no mathematical limitations on the number of independent variables in a multiple regression model.
Answer: True Type: Concept Difficulty: Easy
Multiple Choice Questions
31. In a multiple regression analysis with n = 15 and k = 14, the computer reports R^{2} = 0.9999.
A) this is an excellent regression
B) this is a very good regression
C) this is an average regression
D) this is not a good regression
E) not enough information to determine
Answer: D Type: Concept Difficulty: Medium
32. In multiple regression analysis, R^{2} = 0.02, n = 2,000, k = 5 and F = 11.2.
A) multicollinearity is present
B) none of the five variables are statistically significant
C) this regression is excellent for prediction purposes
D) there is some evidence of a linear relationship between y and at least some of the x variables, but the regression is extremely weak and useless for prediction purposes.
E) not enough information to make any conclusions
Answer: D Type: Concept Difficulty: Easy
33. In a multiple regression analysis, MSE = 20, n = 54, k = 3, and SST(total) = 2,000. What is the R^{2} of the regression?
A) 0.0
B) 0.01
C) 0.99
D) 1.00
E) 0.50
Answer: E Type: Computation Difficulty: Medium
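The arithmetic behind answer E can be verified directly from the definitions SSE = MSE × df(error) and R^{2} = 1 - SSE/SST; a minimal sketch in Python:

```python
# Question 33: recover R^2 from MSE, n, k, and SST.
# SSE = MSE * df_error, where df_error = n - (k + 1).
mse, n, k, sst = 20, 54, 3, 2000

df_error = n - (k + 1)       # 54 - 4 = 50
sse = mse * df_error         # 20 * 50 = 1000
r_squared = 1 - sse / sst    # 1 - 1000/2000 = 0.50
print(r_squared)  # 0.5
```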
34. The regression surface (hyperplane) of the linear regression of y on five independent variables is of dimension:
A) 2
B) 3
C) 4
D) 5
E) none of the above
Answer: D Type: Concept Difficulty: Easy
35. Suppose that in a multiple regression the F is significant, but none of the t-ratios are significant. This means that:
A) multicollinearity may be present
B) autocorrelation may be present
C) the regression is good
D) a nonlinear model would be a better fit
E) none of the above
Answer: A Type: Concept Difficulty: Medium
36. Which of the following represents a net regression coefficient?
A) e
B) R^{2 }
C) R
D) b_{1 }
E) none of the above
Answer: D Type: Concept Difficulty: Easy
37. How many degrees of freedom for error are associated with a multiple regression model with k independent variables?
A) n - (k + 1)
B) n - k
C) n - 1
D) n - k + 1
E) none of the above
Answer: A Type: Concept Difficulty: Easy
38. The F ratio used to test for the existence of a linear relationship between the dependent variable and any independent variable is:
A) MSE/(n - (k + 1))
B) MSR/MSE
C) MSR/MST
D) MSE/MSR
E) none of the above
Answer: B Type: Concept Difficulty: Medium
39. When the null hypothesis, H_{0}: b_{1} = b_{2} = b_{3} = 0, is rejected, the interpretation should be:
A) there is no linear relationship between y and any of the three independent variables
B) there is a regression relationship between y and at least one of the three independent variables
C) all three independent variables have a slope of zero
D) all three independent variables have equal slopes
E) there is a regression relationship between y and all three independent variables
Answer: B Type: Concept Difficulty: Medium
40. In testing H_{0}: b_{1} = b_{2} = b_{3} = . . . = b_{k} = 0, a p-value of 0.0001 would give an indication that:
A) the null hypothesis should not be rejected
B) the null hypothesis should be rejected
C) all three independent variables have a slope of zero
D) there is no linear relationship between y and any of the three independent variables
E) none of the above
Answer: B Type: Concept Difficulty: Medium
41. Consider the following multiple regression model, with n = 25:
y = 5 + 10x_{1} + 20x_{2}.
R^{2} = 0.90, s_{b1} = 3.2, s_{b2} = 5.5
Calculate the t-test statistic to test whether x_{1} contributes information to the prediction of y.
A) 0.32
B) 3.636
C) 3.125
D) 2.8125
E) 11.11
Answer: C Type: Computation Difficulty: Medium
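Answer C follows from the usual coefficient-over-standard-error ratio; a quick check:

```python
# Question 41: t statistic for x1 = estimated coefficient / its standard error.
b1, s_b1 = 10, 3.2
t_x1 = b1 / s_b1
print(round(t_x1, 3))  # 3.125
```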
Use the following to answer questions 42-47:
The following data gives the monthly sales (in thousands of dollars) for different advertising expenditures (also in thousands of dollars) and sales commission percentages.
42. What amount of sales would this model predict for advertising expenditures of 25,000 and sales commission of 8%?
A) $564,318
B) $30,273.6
C) $561,734
D) $72,880
E) none of the above
Answer: A Type: Computation Difficulty: Medium
43. Write the null and alternative hypotheses to test whether or not advertising expenditures and sales commissions can be used to predict sales.
A) H_{0}: b_{1} = b_{2} = 0; H_{1}: at least one coefficient is not zero
B) H_{0}: b_{1} ≠ b_{2} = 0; H_{1}: b_{1} = b_{2}
C) H_{0}: b_{1} and b_{2} are not equal to zero; H_{0}: b_{1} = b_{2} = 0
D) H_{0}: b_{1} ≥ b_{2}; H_{1}: b_{1} < b_{2}
E) none of the above
Answer: A Type: Concept Difficulty: Easy
44. At the 5% level of significance, is either advertising expenditure or sales commission percentage or both significant?
A) only advertising expenditures
B) both are significant
C) only sales commission percentage
D) neither are significant
E) insufficient information to determine
Answer: A Type: Concept Difficulty: Medium
45. What is the difference between R^{2} and the adjusted R^{2}?
A) the adjusted R^{2} always increases as more independent variables are added to the model
B) the adjusted R^{2} is smaller in this case because the constant term is negative
C) the adjusted R^{2} adjusts explanatory power by the degrees of freedom
D) the adjusted R^{2} is always smaller than R^{2 }
E) the adjusted R^{2} adjusts explanatory power by division by the standard error of each coefficient
Answer: C Type: Concept Difficulty: Medium
46. What is the value of the standard error of the estimate?
A) MSE = 4822.64
B) s = 69.44
C) MSR = 42592
D) stdev = 119.0
E) not given in the above table of information
Answer: B Type: Computation Difficulty: Medium
47. The degrees of freedom for error for this regression model are:
A) 2
B) 5
C) 24
D) 8
E) none of the above
Answer: B Type: Computation Difficulty: Medium
Use the following to answer questions 48-52:
Eight students are selected randomly and their present graduate GPA is compared to their undergraduate GPA and scores on standardized tests.
The data are shown below:
The Minitab regression analysis follows:
Analysis of Variance
48. Write the regression equation, letting undergraduate GPA be variable 1 and standard scores be variable 2.
A) y = 0.4775 x_{1} + 0.0013392x_{2 }
B) y = 0.2059 + 0.1630x_{1 }+ 0.0006693x_{2 }
C) none of the others is correct
D) y = 1.1066 + 0.4775x_{1} + 0.0013392x_{2 }
E) not enough information given
Answer: D Type: Computation Difficulty: Medium
49. At the 5% level of significance, are undergraduate scores and standard scores significant?
A) both are significant
B) neither are significant
C) only undergraduate GPA is significant
D) only standard scores are significant
E) not enough information to determine
Answer: C Type: Concept Difficulty: Medium
50. Compute R^{2}.
A) 99.4%
B) 98.6%
C) 20.8%
D) very close to 100%
E) insufficient information to determine
Answer: B Type: Computation Difficulty: Hard
51. What is the relationship between R^{2} and the adjusted R^{2} for this regression model?
A) both are exactly the same
B) the adjusted R^{2} is larger than R^{2 }
C) the adjusted R^{2} is smaller than R^{2 }
D) both are very small
E) none of the above
Answer: C Type: Computation Difficulty: Hard
52. What does the F value of 170.77 indicate in this model?
A) there is no evidence of any linear relationship between y and the two independent variables
B) there is evidence of a linear relationship between y and at least one of the two independent variables
C) the coefficients of both variables are equal to zero
D) there is evidence that both variables contribute information to the prediction of y
E) none of the above
Answer: B Type: Concept Difficulty: Hard
53. All of the following are possible effects of multicollinearity EXCEPT:
A) the variances of regression coefficients estimators may be larger than expected
B) the signs of the regression coefficients may be opposite of what is expected
C) a significant F ratio may result even though the t ratios are not significant
D) removal of one data point may cause large changes in the coefficient estimates
E) the VIF is zero
Answer: E Type: Concept Difficulty: Hard
54. Correlation of the values of variables with values of the same variables lagged one or more time periods back is called:
A) multicollinearity
B) a transformation
C) autocorrelation
D) variance inflation
E) interaction
Answer: C Type: Concept Difficulty: Medium
55. An extreme observation:
A) almost always represents an error in data collection
B) never affects the analysis
C) always causes the R^{2} to be inflated
D) should always be omitted from analysis of the data
E) is an outlier
Answer: E Type: Concept Difficulty: Easy
56. Dummy variables are used when:
A) qualitative variables are involved in the model
B) quantitative variables are involved in the model
C) doing residual analysis
D) making transformations of quantitative variables
E) none of the above
Answer: A Type: Concept Difficulty: Easy
57. All of the following are possible nonlinear transformations, EXCEPT:
A) logarithmic transformation
B) multiplicative model
C) exponential model
D) reciprocal model
E) interactive model
Answer: E Type: Concept Difficulty: Medium
58. A dummy variable:
A) can only assume values of 1 or 0.
B) is used as an independent variable.
C) is an indicator variable.
D) all of the above.
Answer: D Type: Concept Difficulty: Easy
59. What is a residual?
A) Equal to R.
B) Equal to R^{2}.
C) The dependent variable.
D) The independent variable.
E) The difference between the observed and predicted value of the dependent variable.
Answer: E Type: Concept Difficulty: Easy
60. What does a correlation matrix show?
A) Residuals.
B) Regression coefficients.
C) The correlation coefficients between all of the independent variables.
D) Both A and B, above.
Answer: C Type: Concept Difficulty: Easy
61. When independent variables are correlated, you have:
A) Multicollinearity.
B) Homoscedasticity.
C) Autocorrelation.
D) Residuals.
Answer: A Type: Concept Difficulty: Easy
62. When successive residuals are correlated you have:
A) Multicollinearity.
B) Homoscedasticity.
C) Autocorrelation.
D) Residuals.
Answer: C Type: Concept Difficulty: Easy
63. A multiple regression model with two independent variables exhibits a highly significant F-ratio, but each variable's individual t-statistic is insignificant. The most likely cause of such a situation is ____________
A) Heteroskedasticity
B) Homoskedasticity
C) Multicollinearity
D) Non-independence of residuals
E) Non-normality of residuals
Answer: C Type: Concept Difficulty: Easy
64. As more independent variables are added to a multiple regression model, ___________ will increase; this is not always so with ___________, which will only increase if the additional variables add substantial explanatory power to the model.
A) R^{2}; adjusted R^{2 }
B) adjusted R^{2}; R^{2 }
C) R^{2}; the coefficient of partial determination
D) adjusted R^{2}; the coefficient of multiple determination
E) the standard error of the estimate; R^{2 }
Answer: A Type: Concept Difficulty: Medium
65. In previous studies a market researcher has observed a significant positive relationship between advertising expenditures and sales for a firm's 40 territories. In a recent iteration of her study, however, she added to her model an additional independent variable, one that tends to be highly correlated with advertising expenditures. Which of the following is NOT a possible result of this addition?
A) Advertising expenditures will exhibit a negative relationship with sales
B) The relationship between advertising expenditures and sales will be insignificant
C) The model's overall Fratio will increase
D) The model's coefficient of multiple determination (R^{2}) will decrease
E) All of the above are possible results of this addition
Answer: D Type: Concept Difficulty: Medium
Use the following to answer questions 66-68:
In a multiple regression study based on 24 observations, the following results were observed:
66. If α is set at 0.05, what is the critical value for the test statistic in a test of H_{0}: b_{i} = 0 in this situation?
A) Z = 1.96
B) Z = 1.645
C) t = 2.086
D) t = 2.064
E) t = 1.725
Answer: C Type: Computation Difficulty: Medium
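The critical value in answer C is determined by the error degrees of freedom; with the three independent variables referenced in question 67, the computation is:

```python
# Question 66: the test of H0: b_i = 0 uses a t distribution with
# df = n - (k + 1); n = 24 observations, k = 3 predictors (per question 67).
n, k = 24, 3
df_error = n - (k + 1)
print(df_error)  # 20 -> from a t table, t(0.025, df = 20) = 2.086
```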
67. Assuming that there are no problems with multicollinearity, which of the independent variables appear to be useful or important in explaining the dependent variable?
A) Variable 1
B) Variable 2
C) Variable 3
D) Variable 1 and Variable 2
E) Variable 2 and Variable 3
Answer: E Type: Computation Difficulty: Medium
68. Assuming that all variables were determined to be useful, what would the predicted value of Y be if X_{1} = 3, X_{2} = 16 and X_{3} = 4.8?
A) 301.31
B) 266.10
C) 191.62
D) 142.57
E) 138.04
Answer: B Type: Computation Difficulty: Easy
Use the following to answer questions 69-70:
A multiple regression analysis has been done on 24 observations, with the following partial ANOVA table resulting:
69. How many independent variables were included in this regression analysis?
A) 2
B) 3
C) 4
D) 5
E) 6
Answer: B Type: Computation Difficulty: Medium
70. What is adjusted R^{2} for this regression analysis?
A) 0.963
B) 0.693
C) 0.529
D) 0.458
E) 0.412
Answer: D Type: Computation Difficulty: Hard
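The ANOVA table for this problem set is not reproduced above, but the adjustment formula can still be illustrated; assuming R^{2} = 0.529 (the value in distractor C) with n = 24 and k = 3, the adjusted value works out to answer D:

```python
# Question 70: adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - (k + 1)).
# R^2 = 0.529 is an assumed value (option C) consistent with the answer key.
n, k, r2 = 24, 3, 0.529
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - (k + 1))
print(round(adj_r2, 3))  # 0.458
```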
Short Answer Questions
71. A simple linear regression model, developed with 11 observations, exhibits an R^{2} of 0.965. Adding an additional variable to the model increases R^{2} to 0.966. Has the addition of the second independent variable been useful in this situation?
Answer: Not really, since adjusted R^{2} falls to 0.958. In essence, the additional variable adds little or no predictive power.
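The drop in adjusted R^{2} quoted in the answer can be reproduced with the adjustment formula; a minimal check:

```python
# Question 71: adjusted R^2 penalizes each added variable through the
# degrees of freedom: adj R^2 = 1 - (1 - R^2) * (n - 1) / (n - (k + 1)).
def adj_r2(r2, n, k):
    return 1 - (1 - r2) * (n - 1) / (n - (k + 1))

before = adj_r2(0.965, n=11, k=1)  # simple regression, about 0.961
after = adj_r2(0.966, n=11, k=2)   # with the added variable, about 0.958
print(before > after)  # True: the new variable adds no real explanatory power
```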
Use the following to answer questions 72-73:
A multiple regression analysis has been done on 24 observations, with the following partial ANOVA table resulting:
72. Do these results provide sufficient evidence (assume α = 0.05) of a regression relationship between the independent variables and the dependent variable?
Answer: Yes. With α = 0.05, the critical value for the F-ratio with 3 numerator d.f. and 20 denominator d.f. is 3.10. The observed F-ratio is 3.81142, which exceeds the critical value, so there appears to be a significant regression relationship between the independent variables and the dependent variable.
73. How effective is the regression model developed in this situation at predicting the dependent variable?
Answer: Despite its significant F-ratio, the regression model developed is not very effective at explaining variability in the dependent variable, since adjusted R^{2} is only 0.268.
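The adjusted R^{2} of 0.268 quoted in this answer can be cross-checked from the F-ratio alone, since F, k, and the error degrees of freedom determine R^{2}; a sketch:

```python
# Questions 72-73: from F = (R^2 / k) / ((1 - R^2) / (n - k - 1)),
# solve for R^2, then apply the adjustment. n = 24, k = 3, F = 3.81142.
n, k, f = 24, 3, 3.81142
ratio = f * k / (n - k - 1)    # equals R^2 / (1 - R^2)
r2 = ratio / (1 + ratio)       # about 0.364
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
print(round(adj_r2, 3))  # 0.268
```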
Aczel, Complete Business Statistics, Sixth Edition
