Linear Regression by David Spade, PhD

About the Lecture

The lecture "Linear Regression" by David Spade, PhD is part of the course "Statistics Part 1". The lecture is divided into the following chapters:

  • Linear Regression
  • The Y-Intercept
  • Making Predictions
  • The R-squared Quantity
  • After Regression

Quiz for the Lecture

  1. The line represented by the linear model goes through every data point.
  2. The linear model is the equation of a straight line that goes through our data.
  3. The linear model can be used to model the relationship between two quantitative variables.
  4. A line with a good fit will have small residuals.
  5. The linear model cannot be used for qualitative variables.

  1. The slope of the regression line is equal to the correlation.
  2. Correlation can, in many cases, give insight into how well the linear model will fit our data.
  3. We must be careful in using correlation to describe how well the linear model will fit our data.
  4. Correlation tells us whether the slope of the regression line is positive or negative.
  5. The y-intercept does not provide meaningful information about correlation.

  1. For a one-unit increase in the value of X, we expect a decrease of 0.275 units in the value of Y.
  2. For a one-unit increase in the value of X, we expect an increase of 0.275 units in the value of Y.
  3. For a one-unit increase in the value of X, we expect a 12.5-unit increase in the value of Y.
  4. If Y = 0, then X = 12.5.
  5. For a one-unit decrease in the value of X, we expect a decrease of 0.275 units in the value of Y.

  1. High values of R² mean that the changes in the value of X cause the changes in the value of Y.
  2. The R² quantity can be used to assess how well the line fits the data in many cases.
  3. The R² quantity provides the percentage of variation in the response variable that is explained by the regression on the explanatory variable.
  4. The R² quantity is calculated by squaring the correlation.
  5. R² ranges from 0 to 1.

  1. The vertical distance from the observed value to the regression line
  2. The horizontal distance from the observed value to the regression line
  3. The vertical distance from the expected value to the regression line
  4. The horizontal distance from the expected value to the regression line
  5. The lateral distance from the expected value to the regression line

  1. 134.4
  2. 30
  3. -34.4
  4. 34.4
  5. 100

  1. 15.6
  2. 0
  3. -15.6
  4. 34.4
  5. 100

  1. 1.5
  2. 0
  3. 1
  4. 0.375
  5. 0.75

  1. 4
  2. 0
  3. 1
  4. 2
  5. 3

  1. GDP = 100 + 0.9 * Health expenditure
  2. GDP = 100 - 0.9 * Health expenditure
  3. GDP = 100 + 0.9 * Health expenditure + 0.23 * Health expenditure²
  4. Health expenditure = 100 - 0.9 * GDP
  5. Health expenditure = 100 + 0.9 * GDP
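
The quiz items above revolve around a handful of computations: the least-squares slope and intercept, residuals, R², and predictions from a fitted line such as GDP = 100 + 0.9 * Health expenditure. A minimal Python sketch of how these quantities are obtained, using made-up data rather than the figures behind the quiz, might look like this:

    import numpy as np

    # Hypothetical data for illustration only (not taken from the lecture or quiz)
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
    y = np.array([2.1, 2.9, 3.6, 4.2, 5.1, 5.8])

    # Least-squares fit: np.polyfit returns [slope, intercept] for degree 1
    slope, intercept = np.polyfit(x, y, deg=1)

    # Residuals: the vertical distance from each observed value to the regression line
    y_hat = intercept + slope * x
    residuals = y - y_hat

    # R²: the proportion of the variation in y explained by the regression on x;
    # for simple linear regression it equals the squared correlation
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r_squared = 1 - ss_res / ss_tot

    print(f"slope = {slope:.3f}, intercept = {intercept:.3f}")
    print(f"R² = {r_squared:.3f}")

    # Interpretation: a one-unit increase in x shifts the expected y by `slope` units,
    # and a prediction at a new x value simply plugs it into the fitted line
    print(f"prediction at x = 7: {intercept + slope * 7:.3f}")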

Lecturer of the Lecture Linear Regression

David Spade, PhD

Dr. David Spade is an Assistant Professor of Mathematical Sciences and Statistics at the University of Wisconsin-Milwaukee and holds a courtesy appointment as an Assistant Professor of Statistics at the University of Missouri-Kansas City, USA.
He obtained his MS in Statistics in 2010 and completed his PhD in Statistics at Ohio State University in 2013.
An experienced mathematics instructor, Dr. Spade has been teaching a wide range of statistics courses, from the introductory to the graduate level, since 2007.
On Lecturio, he teaches courses on Statistics.


Customer Reviews

(1)
5.0 out of 5 stars