• MBA 8350 Course Companion
  • Preface
    • About this book…
    • Acknowledgements
  • 1 Introduction
    • 1.1 The “Big Picture” of Statistics
    • 1.2 The Vocabulary of Statistics
    • 1.3 Descriptive Measures
      • 1.3.1 Central Tendency
      • 1.3.2 Variation
      • 1.3.3 Measures of Shape
      • 1.3.4 Covariance and Correlation
  • 2 Data Collection and Sampling
    • 2.1 Sampling Distributions
    • 2.2 Sampling Bias: Two Examples
      • 2.2.1 Dewey Defeats Truman?
      • 2.2.2 98.6?
    • 2.3 Sampling Methods
      • 2.3.1 Simple Random Sampling
      • 2.3.2 Systematic Sampling
      • 2.3.3 Stratified Sampling
      • 2.3.4 Cluster Sampling
    • 2.4 Sampling in Practice
    • 2.5 Sampling and Sampling Distributions
      • 2.5.1 An Application
  • 3 Getting Started with R
    • 3.1 The R Project for Statistical Computing
    • 3.2 Downloading and Installing R
      • 3.2.1 Choosing a Mirror
      • 3.2.2 Download and Install the Correct Version
      • 3.2.3 Downloading and Installing RStudio
      • 3.2.4 Taking Stock
      • 3.2.5 Installing Packages
    • 3.3 Coding Basics
      • 3.3.1 Assigning Objects
      • 3.3.2 Listing, Adding, and Removing
      • 3.3.3 Loading Data
      • 3.3.4 Manipulating Data
      • 3.3.5 Subsetting Data
    • 3.4 Data Visualization
      • 3.4.1 Histograms
      • 3.4.2 Line, Bar, and Scatter Plots
      • 3.4.3 Boxplots
      • 3.4.4 Much More Out There
  • 4 The Central Limit Theorem
    • 4.1 The CLT (Formally)
    • 4.2 Application 1: A Sampling Distribution with a Known Population
    • 4.3 Application 2: A Sampling Distribution with an Unknown Population
      • 4.3.1 The Sample
    • 4.4 The Punchline
  • 5 Confidence Intervals
    • 5.1 A Refresher on Probability
      • 5.1.1 Application 1
      • 5.1.2 Application 2
    • 5.2 Deriving a Confidence Interval
      • 5.2.1 Application 3
      • 5.2.2 What if we want to change confidence?
    • 5.3 What to do when we do not know \(\sigma\)
      • 5.3.1 t distribution versus Z distribution…
      • 5.3.2 Application 4
    • 5.4 Determining Sample Size
    • 5.5 Concluding Applications
      • 5.5.1 Light Bulbs (Last Time)
      • 5.5.2 Returning to the Philadelphia School Policy Application
  • 6 Hypothesis Tests
    • 6.1 Anatomy of a Hypothesis Test
    • 6.2 Two Methods for Conducting a Hypothesis Test (when \(\sigma\) is known)
      • 6.2.1 Rejection Region Method
      • 6.2.2 P-value Approach
    • 6.3 Two-Sided vs. One-Sided Test
    • 6.4 Conducting a Hypothesis Test (when \(\sigma\) is unknown)
    • 6.5 Appendix: A Note on Calculating P-values
      • 6.5.1 The Problem
      • 6.5.2 How to Calculate P-values
  • 7 Simple Linear Regression
    • 7.1 A Simple Linear Regression Model
      • 7.1.1 What does a regression model imply?
      • 7.1.2 The REAL Simple Linear Regression Model
    • 7.2 Application: Predicting House Price Based on House Size
    • 7.3 Ordinary Least Squares (OLS)
      • 7.3.1 B.L.U.E.
    • 7.4 Decomposition of Variance
      • 7.4.1 The \(R^2\)
      • 7.4.2 What is a good \(R^2\)?
      • 7.4.3 Standard Error of the Estimate
    • 7.5 Assumptions of the Linear Regression Model
      • 7.5.1 Linearity
      • 7.5.2 Independence of Errors
      • 7.5.3 Equal Variance
      • 7.5.4 Normality of Errors
    • 7.6 Statistical Inference
      • 7.6.1 Confidence Intervals (around population parameters)
      • 7.6.2 Hypothesis Tests
      • 7.6.3 Confidence Intervals (around forecasts)
    • 7.7 Up Next…
  • 8 Multiple Linear Regression
    • 8.1 Application: Explaining House Price in a Multiple Regression
      • 8.1.1 The Importance of “Controls”
    • 8.2 Adjusted \(R^2\)
      • 8.2.1 Abusing an \(R^2\)
      • 8.2.2 An Adjusted \(R^2\)
    • 8.3 Statistical Inference
      • 8.3.1 Hypothesis Tests
      • 8.3.2 Confidence Intervals (around population parameters)
      • 8.3.3 Confidence Intervals (around forecasts)
    • 8.4 Qualitative (Dummy) Variables
      • 8.4.1 Intercept dummy variable
      • 8.4.2 Slope dummy variable
      • 8.4.3 What if there are more than two categories?
      • 8.4.4 A Final Application
    • 8.5 Joint Hypothesis Tests
      • 8.5.1 Simple versus Joint Tests
      • 8.5.2 Applications
  • 9 Advanced Regression Topics
    • 9.1 Nonlinear Models
      • 9.1.1 Derivatives
      • 9.1.2 Why consider non-linear relationships?
      • 9.1.3 Functional Forms
      • 9.1.4 The Log Transformation
      • 9.1.5 The Quadratic Transformation
      • 9.1.6 The Reciprocal Transformation
      • 9.1.7 Conclusion
    • 9.2 Collinearity
      • 9.2.1 An Application
      • 9.2.2 What does Collinearity do to our regression?
      • 9.2.3 How to Test for Collinearity
      • 9.2.4 An Application
      • 9.2.5 How do we remove Collinearity?
    • 9.3 Heteroskedasticity
      • 9.3.1 Pure versus Impure Heteroskedasticity
      • 9.3.2 Consequences of Heteroskedasticity
      • 9.3.3 Detection
      • 9.3.4 Remedies

MBA 8350: Course Companion for Analyzing and Leveraging Data

Scott Dressler

2021-12-21