Inference for Regression by David Spade, PhD


About the Lecture

The lecture Inference for Regression by David Spade, PhD is from the course Statistics Part 2. It contains the following chapters:

  • Inference for Regression
  • Example: Body Fat
  • Pitfalls to Avoid

Included Quiz Questions

  1. The residuals can be viewed as estimates of the mean value of the response variable for each value of the explanatory variable.
  2. The response variable is assumed to have a normal distribution for each value of the explanatory variable.
  3. By performing linear regression, we are estimating the mean value of the response variable for each value of the explanatory variable.
  4. The fitted values of the response variable are used as estimates of the mean value of the response for each value of the explanatory variable.
  5. The residuals are an estimate of deviation from the estimated mean.

  1. The test statistic follows a t-distribution with n−2 degrees of freedom.
  2. The test statistic follows a normal distribution with mean 0 and variance 1.
  3. The test statistic follows a t-distribution with n degrees of freedom.
  4. The test statistic follows a t-distribution with n−1 degrees of freedom.
  5. The test statistic follows a t-distribution with n+1 degrees of freedom.

  1. The scatterplot of the response variable against the explanatory variable shows random scatter about 0.
  2. The residuals are nearly normal.
  3. The plot does not thicken.
  4. The observations are independent of each other.
  5. A scatterplot of the response against the explanatory variable is nearly linear.

  1. We quantify the spread around the regression line by using the standard error of the slope.
  2. We quantify spread around the regression line by using the standard error of the intercept term.
  3. We quantify spread around the regression line by using the mean value of the explanatory variable.
  4. We quantify spread around the regression line by using only the sample standard deviation of the residuals.
  5. We quantify spread around the regression line by subtracting the standard deviation from the mean.

  1. Fitting a linear regression to data that are not linear will not have a negative effect on the inference procedures for a slope.
  2. Being careful of thickening plots will not have a negative effect on the inference procedures for a slope.
  3. Making sure that the residuals are nearly normal will not have a negative effect on the inference procedures for a slope.
  4. Being careful of outliers and influential points will not have a negative effect on the inference procedures for a slope.
  5. Verify whether a one-tail or two-tail test is appropriate.
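The quantities these questions revolve around (fitted values, residuals, the standard error of the slope, and the t statistic with n−2 degrees of freedom) can be computed directly. Below is a minimal sketch in Python using only the standard library; the data values are hypothetical toy numbers, not from the lecture's body-fat example.

```python
import math

# Hypothetical toy data: explanatory variable x and response variable y
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.1, 3.9, 6.2, 7.8, 9.9, 12.1]

n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n

# Least-squares slope b1 and intercept b0
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
b1 = sxy / sxx
b0 = ybar - b1 * xbar

# Fitted values estimate the mean response at each x; residuals estimate
# each observation's deviation from that estimated mean.
fitted = [b0 + b1 * xi for xi in x]
residuals = [yi - fi for yi, fi in zip(y, fitted)]

# Spread around the regression line: residual standard deviation s_e,
# computed with n - 2 degrees of freedom (two parameters estimated).
s_e = math.sqrt(sum(r ** 2 for r in residuals) / (n - 2))

# Standard error of the slope, and the t statistic for H0: beta1 = 0,
# which follows a t-distribution with n - 2 degrees of freedom.
se_b1 = s_e / math.sqrt(sxx)
t_stat = b1 / se_b1

print(f"slope = {b1:.4f}, SE(slope) = {se_b1:.4f}, t = {t_stat:.2f}")
```

Comparing `t_stat` to a t-distribution with n−2 (here 4) degrees of freedom, using one tail or two as appropriate, gives the P-value for the test of zero slope.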

Author of lecture Inference for Regression

David Spade, PhD

