Analysis of Regression Models and Hypothesis Testing in Predicting Dependent Variables

Explores regression models and hypothesis testing for predictive analysis.

Amelia Ward
Contributor


Assignment-5 (Chs. 13 and 14) - solutions. Due by midnight of Sunday, December 2nd, 2012 (drop box 4): 70 points

True/False (one point each)

Chapter 13

1. The standard error of the estimate (standard error) is the estimated standard deviation of the distribution of the independent variable (X).
FALSE. It is the estimate of the standard deviation of the error term.

2. In a simple linear regression model, the coefficient of determination only indicates the strength of the relationship between the independent and dependent variable, but does not show whether the relationship is positive or negative.
TRUE. R² is greater than or equal to 0; it is never negative, so the sign of the relationship cannot be read from it.

3. When using simple regression analysis, if there is a strong correlation between the independent and dependent variable, then we can conclude that an increase in the value of the independent variable causes an increase in the value of the dependent variable.
FALSE. The strong correlation could be negative, and correlation alone does not establish causation.

4. The error term is the difference between an individual value of the dependent variable and the corresponding mean value of the dependent variable.
FALSE. It is the difference between an individual value of the dependent variable and the corresponding predicted value (not the mean value); this difference is the residual, the sample estimate of the error term.

5. In bivariate regression the coefficient of determination is always equal to the square of the correlation coefficient.
TRUE
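Two of the claims above (that R² equals the squared correlation coefficient in bivariate regression, and that residuals are computed against predicted values, not the mean) can be checked numerically. A minimal pure-Python sketch, using made-up data chosen only for illustration:

```python
# Pure-Python sketch (hypothetical data) checking two claims from the
# true/false questions above: in bivariate regression R^2 equals the
# squared correlation coefficient, and a residual is the actual value
# minus the PREDICTED value, not minus the mean.

def fit_simple_ols(x, y):
    """Least-squares slope and intercept for y = b0 + b1*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = (sum((a - mx) * (b - my) for a, b in zip(x, y))
          / sum((a - mx) ** 2 for a in x))
    return my - b1 * mx, b1

def r_squared(x, y):
    """Coefficient of determination: 1 - SSE/SST (residuals use predictions)."""
    b0, b1 = fit_simple_ols(x, y)
    my = sum(y) / len(y)
    sse = sum((b - (b0 + b1 * a)) ** 2 for a, b in zip(x, y))
    sst = sum((b - my) ** 2 for b in y)
    return 1 - sse / sst

def corr(x, y):
    """Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return sxy / (sum((a - mx) ** 2 for a in x)
                  * sum((b - my) ** 2 for b in y)) ** 0.5

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]   # made-up data, roughly y = 2x
print(abs(r_squared(x, y) - corr(x, y) ** 2) < 1e-9)   # True: R^2 == r^2
```

The identity R² = r² holds exactly (up to floating-point error) for any bivariate data set, which is why statement 5 is TRUE regardless of the numbers used.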



6. In regression analysis, if the variance of the error term is constant, we call it the heteroscedasticity property.
FALSE (instruction pages 10-11). Constant error variance is called homoscedasticity; heteroscedasticity is non-constant error variance.

Chapter 14

7. When the F test is used to test the overall significance of a multiple regression model, if the null hypothesis is rejected, it can be concluded that all of the independent variables X1, X2, ..., Xk are significantly related to the dependent variable Y.
FALSE. We can conclude that at least one (not all) of the independent variables is significantly related to Y.

8. An application of the multiple regression model generated the following results involving the F test of the overall regression model: p-value = .0012, R² = .67 and s = .076. Thus, the null hypothesis, which states that none of the independent variables are significantly related to the dependent variable, should be rejected even at the .01 level of significance.
TRUE, since the p-value (.0012) is less than 0.01.

9. The high multicollinearity problem occurs when the independent variables are highly correlated with the dependent variable.
FALSE. It occurs when there is a high linear relation among the independent variables themselves.

10. The assumption of independent error terms in regression analysis is often violated when using time series data, and this is called the problem of autocorrelation.
TRUE. See Instructions.

11. The homoscedasticity problem occurs when the assumption of constant error variance is violated.
FALSE. This problem is called heteroscedasticity, and it frequently occurs in cross-sectional data.

Multiple Choice (two points each)

Chapter 13
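For question 8, the overall F statistic behind that p-value can be recovered from R² alone via F = (R²/k) / ((1 - R²)/(n - k - 1)). The problem does not state the sample size n or the number of predictors k, so the values below are assumptions made purely for illustration:

```python
# Hedged sketch for question 8: computing the overall-significance F statistic
# from R^2. The sample size n and predictor count k are NOT given in the
# problem; the values below are hypothetical, for illustration only.

def overall_f(r2, n, k):
    """Overall F statistic: F = (R^2 / k) / ((1 - R^2) / (n - k - 1))."""
    return (r2 / k) / ((1 - r2) / (n - k - 1))

r2 = 0.67          # reported in the problem
n, k = 30, 3       # ASSUMED values, not from the problem
f_stat = overall_f(r2, n, k)
print(round(f_stat, 1))   # prints 17.6
```

With these assumed degrees of freedom the statistic is far above typical .01-level critical values of the F distribution, consistent with the reported p-value of .0012 and with rejecting the null hypothesis.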


1. All of the following are assumptions of the error terms in the simple linear regression model except:
A. Errors are normally distributed
B. Error terms have a mean of zero
C. Error terms have a constant variance
D. Error terms depend on the explanatory variable
Answer: D (instruction pages 10-11, book page 530)

2. The point estimate of the variance in a regression model is
A. SSE
B. MSE
C. se
D. b1
Answer: B

3. The least squares regression line minimizes the
A. Sum of differences between actual and predicted Y values
B. Sum of squared differences between actual and predicted X values
C. Sum of absolute deviations between actual and predicted X values
D. Sum of absolute deviations between actual and predicted Y values
E. Sum of squared differences between actual and predicted Y values
Answer: E

4. The ___________ the R² and the __________ the s (standard error), the stronger the relationship between the dependent variable and the independent variable.
A. Higher, lower
B. Lower, higher
C. Lower, lower
D. Higher, higher
Answer: A

5. In simple bivariate regression analysis, if the correlation coefficient is a positive value, then
A. The Y intercept must also be a positive value.
B. The coefficient of determination can be either positive or negative, depending on the value of the slope.
C. The least squares regression equation could either have a positive or a negative
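The least-squares property in question 3 (minimizing the sum of squared differences between actual and predicted Y values) can be verified numerically: perturbing the fitted slope in either direction can only increase the SSE. The data and perturbation sizes below are made up for illustration:

```python
# Sketch for question 3: the least squares line minimizes the sum of
# SQUARED differences between actual and predicted Y. Any change to the
# fitted slope increases the SSE. Data are hypothetical.

def fit_simple_ols(x, y):
    """Least-squares slope and intercept for y = b0 + b1*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = (sum((a - mx) * (b - my) for a, b in zip(x, y))
          / sum((a - mx) ** 2 for a in x))
    return my - b1 * mx, b1

def sse(x, y, b0, b1):
    """Sum of squared differences between actual and predicted Y values."""
    return sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))

x = [1, 2, 3, 4, 5]
y = [1.8, 4.2, 5.9, 8.1, 9.8]            # made-up data, roughly y = 2x
b0, b1 = fit_simple_ols(x, y)
best = sse(x, y, b0, b1)
for delta in (-0.5, -0.1, 0.1, 0.5):     # nudge the slope either way
    assert sse(x, y, b0, b1 + delta) > best   # SSE only gets worse
```

Note that the check would fail if SSE were replaced by the sum of (unsigned) differences, which is why options A and D are wrong: least squares is defined specifically by the squared-error criterion.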