Inference for Regression by David Spade, PhD


About the Lecture

The lecture "Inference for Regression" by David Spade, PhD is part of the course "Statistics Part 2". The lecture is divided into the following chapters:

  • Inference for Regression
  • Example: Body Fat
  • Pitfalls to Avoid

Quiz for the Lecture

  1. The residuals can be viewed as estimates of the mean value of the response variable for each value of the explanatory variable.
  2. The response variable is assumed to have a normal distribution for each value of the explanatory variable.
  3. By performing linear regression, we are estimating the mean value of the response variable for each value of the explanatory variable.
  4. The fitted values of the response variable are used as estimates of the mean value of the response for each value of the explanatory variable.
  5. The residuals are an estimate of the deviation from the estimated mean.

  1. The test statistic follows a t-distribution with n−2 degrees of freedom.
  2. The test statistic follows a normal distribution with mean 0 and variance 1.
  3. The test statistic follows a t-distribution with n degrees of freedom.
  4. The test statistic follows a t-distribution with n−1 degrees of freedom.
  5. The test statistic follows a t-distribution with n+1 degrees of freedom.

  1. The scatterplot of the response variable against the explanatory variable shows random scatter about 0.
  2. The residuals are nearly normal.
  3. The plot does not thicken.
  4. The observations are independent of each other.
  5. A scatterplot of the response against the explanatory variable is nearly linear.

  1. We quantify the spread around the regression line by using the standard error of the slope.
  2. We quantify spread around the regression line by using the standard error of the intercept term.
  3. We quantify spread around the regression line by using the mean value of the explanatory variable.
  4. We quantify spread around the regression line by using only the sample standard deviation of the residuals.
  5. We quantify spread around the regression line by subtracting the standard deviation from the mean.

  1. Fitting a linear regression to data that is not linear will not have a negative effect on the inference procedures for a slope.
  2. Being careful of thickening plots will not have a negative effect on the inference procedures for a slope.
  3. Making sure that the residuals are nearly normal will not have a negative effect on the inference procedures for a slope.
  4. Being careful of outliers and influential points will not have a negative effect on the inference procedures for a slope.
  5. Verify whether a one-tailed or two-tailed test is appropriate.
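
The quiz options above reference the t-test for the regression slope, whose test statistic follows a t-distribution with n−2 degrees of freedom, and the residual standard error as the measure of spread around the regression line. The following is a minimal Python sketch of that calculation, with invented data values for illustration; it is not drawn from the lecture's body-fat example.

```python
import numpy as np
from scipy import stats

# Hypothetical data (made up for illustration)
x = np.array([32.0, 36.0, 38.0, 33.0, 39.0, 40.0, 41.0, 35.0])  # explanatory
y = np.array([12.0, 21.0, 25.0, 16.0, 30.0, 28.0, 33.0, 18.0])  # response

n = len(x)

# Least-squares estimates of slope and intercept
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

# Fitted values estimate the mean response at each x; residuals are the
# deviations from that estimated mean
fitted = b0 + b1 * x
residuals = y - fitted

# Residual standard error quantifies spread around the regression line
s_e = np.sqrt(np.sum(residuals ** 2) / (n - 2))

# Standard error of the slope and t statistic for H0: slope = 0
se_b1 = s_e / np.sqrt(np.sum((x - x.mean()) ** 2))
t_stat = b1 / se_b1

# Two-sided p-value from a t distribution with n - 2 degrees of freedom
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)

print(f"slope = {b1:.3f}, SE = {se_b1:.3f}, t = {t_stat:.3f}, p = {p_value:.4f}")
```

The same result could be obtained with scipy.stats.linregress or a regression package, but the manual computation makes the roles of the residual standard error and the n−2 degrees of freedom explicit.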

Lecturer of the Lecture Inference for Regression

David Spade, PhD

Dr. David Spade is an Assistant Professor of Mathematical Sciences and Statistics at the University of Wisconsin-Milwaukee and holds a courtesy appointment as an Assistant Professor of Statistics at the University of Missouri-Kansas City, USA.
He obtained his MS in Statistics in 2010 and completed his PhD in Statistics at Ohio State University in 2013.
An experienced mathematics instructor, Dr. Spade has been teaching diverse statistics courses from the introductory to the graduate level since 2007.
At Lecturio, he teaches courses on statistics.


Customer Reviews (1)

5.0 out of 5 stars
  • 5 stars: 5
  • 4 stars: 0
  • 3 stars: 0
  • 2 stars: 0
  • 1 star: 0