Detailed Table of Contents

John Fox, Applied Regression Analysis, Linear Models, and Related Methods (Sage, 1997)

      To Readers, Students, and Instructors  
 Part I: Preliminaries
  Chapter 1 Statistics and Social Science   
       1.1 Statistical Models and Social Reality  
       1.2 Observation and Experiment   
       1.3 Populations and Samples   
       1.4 Summary    
       1.5 Recommended Reading  
  Chapter 2 What Is Regression Analysis?         
       2.1 Preliminaries  
       2.2 Naive Nonparametric Regression   
       2.3 Local Averaging  
            2.3.1 Weighted Local Averages*    
       2.4 Summary    
  Chapter 3 Examining Data    
       3.1  Univariate Displays    
            3.1.1 Histograms  
            3.1.2 Density Estimation*   
            3.1.3 Quantile-Comparison Plots      
            3.1.4 Boxplots  
       3.2 Plotting Bivariate Data  
       3.3 Plotting Multivariate Data  
       3.4 Summary       
       3.5 Recommended Reading  
  Chapter 4 Transforming Data   
       4.1 The Family of Powers and Roots  
       4.2 Transforming Skewness            
       4.3 Transforming Nonlinearity     
       4.4 Transforming Non-Constant Spread   
       4.5 Transforming Proportions      
       4.6 Summary     
       4.7 Recommended Reading  
 Part II: Linear Models and Least Squares   
  Chapter 5 Linear Least-Squares Regression  
       5.1 Simple Regression  
            5.1.1 Least-Squares Fit      
            5.1.2 Simple Correlation  
       5.2 Multiple Regression    
            5.2.1 Two Independent Variables    
            5.2.2 Several Independent Variables   
            5.2.3 Multiple Correlation   
            5.2.4 Standardized Regression Coefficients    
       5.3 Summary     
  Chapter 6 Statistical Inference for Regression   
       6.1 Simple Regression  
            6.1.1 The Simple-Regression Model    
            6.1.2 Properties of the Least-Squares Estimator  
            6.1.3 Confidence Intervals and Hypothesis Tests  
       6.2 Multiple Regression   
            6.2.1 The Multiple-Regression Model    
            6.2.2 Confidence Intervals and Hypothesis Tests  
                Individual Slope Coefficients    
                All Slopes   
                A Subset of Slopes    
       6.3 Empirical versus Structural Relations   
       6.4 Measurement Error in Independent Variables*    
       6.5 Summary    
  Chapter 7 Dummy-Variable Regression  
       7.1 A Dichotomous Independent Variable   
       7.2 Polytomous Independent Variables   
       7.3 Modeling Interactions   
            7.3.1 Constructing Interaction Regressors    
            7.3.2 The Principle of Marginality  
            7.3.3 Interactions With Polytomous Independent Variables  
            7.3.4 Hypothesis Tests for Main Effects and Interactions     
       7.4 A Caution Concerning Standardized Coefficients  
       7.5 Summary   
  Chapter 8 Analysis of Variance   
       8.1 One-Way Analysis of Variance   
       8.2 Two-Way Analysis of Variance   
            8.2.1 Patterns of Means in the Two-Way Classification  
            8.2.2 The Two-Way ANOVA Model  
            8.2.3 Fitting the Two-Way ANOVA Model to Data    
            8.2.4 Testing Hypotheses in Two-Way ANOVA  
            8.2.5 Equal Cell Frequencies   
            8.2.6 Some Cautionary Remarks  
       8.3 Higher-Way Analysis of Variance*      
            8.3.1 The Three-Way Classification   
            8.3.2 Higher-Order Classifications    
            8.3.3 Empty Cells in ANOVA   
       8.4 Analysis of Covariance   
       8.5 Linear Contrasts of Means   
       8.6 Summary   
  Chapter 9 Statistical Theory for Linear Models*   
       9.1 Linear Models in Matrix Form   
            9.1.1 Dummy Regression and Analysis of Variance  
            9.1.2 Linear Contrasts  
       9.2 Least-Squares Fit   
       9.3 Properties of the Least-Squares Estimator  
            9.3.1 The Distribution of the Least-Squares Estimator  
            9.3.2 The Gauss-Markov Theorem  
            9.3.3 Maximum-Likelihood Estimation    
       9.4 Statistical Inference for Linear Models  
            9.4.1 Inference for Individual Coefficients  
            9.4.2 Inference for Several Coefficients  
            9.4.3 General Linear Hypotheses   
            9.4.4 Joint Confidence Regions  
       9.5 Random Regressors   
       9.6 Specification Error    
       9.7 Summary  
       9.8 Recommended Reading  
  Chapter 10 The Vector Geometry of Linear Models*     
       10.1 Simple Regression   
            10.1.1 Variables in Mean-Deviation Form   
            10.1.2 Degrees of Freedom 
       10.2 Multiple Regression  
       10.3 Estimating the Error Variance    
       10.4 Analysis-of-Variance Models  
       10.5 Summary   
       10.6 Recommended Reading    
 Part III: Linear-Model Diagnostics   
  Chapter 11 Unusual and Influential Data  
       11.1 Outliers, Leverage, and Influence   
       11.2 Assessing Leverage: Hat-Values  
       11.3 Detecting Outliers: Studentized Residuals   
            11.3.1 Testing for Outliers in Linear Models   
            11.3.2 Anscombe's Insurance Analogy   
       11.4 Measuring Influence  
            11.4.1 Influence on Standard Errors   
            11.4.2 Influence on Collinearity  
       11.5 Numerical Cutoffs for Diagnostic Statistics  
            11.5.1 Hat-Values  
            11.5.2 Studentized Residuals         
            11.5.3 Measures of Influence  
       11.6 Joint Influence and Partial-Regression Plots  
       11.7 Should Unusual Data Be Discarded?  
       11.8 Some Statistical Details*   
            11.8.1 Hat-Values and the Hat Matrix         
            11.8.2 The Distribution of the Least-Squares Residuals    
            11.8.3 Deletion Diagnostics   
            11.8.4 Partial-Regression Plots    
       11.9 Summary   
       11.10 Recommended Reading         
  Chapter 12 Nonlinearity and Other Ills         
       12.1 Non-Normally Distributed Errors   
            12.1.1 Confidence Envelopes by Simulated Sampling*  
       12.2 Non-Constant Error Variance    
            12.2.1 Residual Plots   
            12.2.2 Weighted-Least-Squares Estimation*  
            12.2.3 Correcting OLS Standard Errors for Non-Constant Variance*   
            12.2.4 How Non-Constant Error Variance Affects the OLS Estimator*   
       12.3 Nonlinearity   
            12.3.1 Partial-Residual Plots   
            12.3.2 When Do Partial-Residual Plots Work?    
                CERES Plots*  
       12.4 Discrete Data  
             12.4.1 Testing for Nonlinearity ('Lack of Fit')
            12.4.2 Testing for Non-Constant Error Variance  
       12.5 Maximum-Likelihood Methods*   
            12.5.1 Box-Cox Transformation of Y   
            12.5.2 Box-Tidwell Transformation of the X's   
            12.5.3 Non-Constant Error Variance Revisited   
       12.6 Structural Dimension*  
       12.7 Summary   
       12.8 Recommended Reading   
  Chapter 13 Collinearity  
       13.1 Detecting Collinearity  
            13.1.1 Principal Components*   
                Two Variables   
                The Data Ellipsoid  
                Diagnosing Collinearity    
            13.1.2 Generalized Variance Inflation*   
       13.2 Coping With Collinearity: No Quick Fix   
             13.2.1 Model Re-Specification
             13.2.2 Variable Selection
             13.2.3 Biased Estimation
                 Ridge Regression*
             13.2.4 Prior Information About the Regression Coefficients
             13.2.5 Some Comparisons
       13.3 Summary  
 Part IV: Beyond Linear Least Squares  
  Chapter 14 Extending Linear Least Squares*   
       14.1 Time-Series Regression   
            14.1.1 Generalized Least-Squares Estimation   
            14.1.2 Serially Correlated Errors  
                GLS Estimation With Autoregressive Errors   
                Empirical GLS Estimation  
            14.1.3 Diagnosing Serially Correlated Errors         
            14.1.4 Concluding Remarks    
       14.2 Nonlinear Regression  
            14.2.1 Polynomial Regression   
            14.2.2 Transformable Nonlinearity   
            14.2.3 Nonlinear Least Squares   
       14.3 Robust Regression  
            14.3.1 M-Estimation  
                Estimating Location   
                M-Estimation in Regression  
            14.3.2 Bounded-Influence Regression  
       14.4 Nonparametric Regression  
            14.4.1 Smoothing Scatterplots by Lowess    
                Selecting the Span    
                Statistical Inference  
            14.4.2 Additive Regression Models   
                Fitting the Additive Regression Model   
                Statistical Inference  
                Semi-Parametric Models    
       14.5 Summary   
           Time-Series Regression   
           Nonlinear Regression  
           Robust Regression     
           Nonparametric Regression   
       14.6 Recommended Reading  
  Chapter 15 Logit and Probit Models  
       15.1 Models for Dichotomous Data   
            15.1.1 The Linear-Probability Model   
             15.1.2 Transformations of π: Logit and Probit Models
            15.1.3 An Unobserved-Variable Formulation   
            15.1.4 Logit and Probit Models for Multiple Regression  
            15.1.5 Estimating the Linear Logit Model*   
            15.1.6 Diagnostics for Logit Models*  
                 Residuals in Logit Models
                Residual and Partial-Residual Plots   
                 Hat-Values and the Hat Matrix
                Studentized Residuals  
                Influence Diagnostics    
                Partial-Regression Plot   
                Constructed-Variable Plot for Transforming an X   
       15.2 Models for Polytomous Data   
            15.2.1 The Polytomous Logit Model   
                Details of Estimation*  
            15.2.2 Nested Dichotomies   
                 Why Nested Dichotomies Are Independent*
            15.2.3 Ordered Logit and Probit Models  
            15.2.4 Comparison of the Three Approaches    
       15.3 Discrete Independent Variables  
            15.3.1 The Binomial Logit Model*   
       15.4 Generalized Linear Models*    
       15.5 Summary    
       15.6 Recommended Reading   
  Chapter 16 Assessing Sampling Variation   
       16.1 Bootstrapping  
            16.1.1 Bootstrapping Basics  
            16.1.2 Bootstrap Confidence Intervals   
                Normal-Theory Intervals    
                Percentile Intervals     
                Improved Bootstrap Intervals*    
            16.1.3 Bootstrapping Regression Models   
            16.1.4 Bootstrap Hypothesis Tests*  
            16.1.5 Bootstrapping Complex Sampling Designs  
            16.1.6 Concluding Remarks  
       16.2 Cross-Validation   
            16.2.1 An Illustration     
            16.2.2 Concluding Remarks  
       16.3 Summary  
       16.4 Recommended Reading    
  Appendix A: Notation   
  Appendix B: Vector Geometry*    
       B.1 Basic Operations  
       B.2 Vector Spaces and Subspaces   
       B.3 Orthogonality and Orthogonal Projections  
       B.4 Recommended Reading  
  Appendix C: Multivariable Differential Calculus
       C.1 Partial Derivatives  
       C.2 Lagrange Multipliers   
       C.3 Matrix Calculus   
  Appendix D: Probability and Estimation
       D.1 Elementary Probability Theory  
            D.1.1 Basic Definitions  
            D.1.2 Random Variables   
                Vector Random Variables*         
            D.1.3 Transformations of Random Variables   
                Transformations of Vector Random Variables*  
       D.2 Discrete Distributions*  
            D.2.1 The Binomial Distribution  
            D.2.2 The Multinomial Distribution    
            D.2.3 The Poisson Distribution    
       D.3 Continuous Distributions  
            D.3.1 The Normal Distribution   
            D.3.2 The Chi-Square Distribution   
            D.3.3 The t-Distribution   
            D.3.4 The F-Distribution  
            D.3.5 The Multivariate-Normal Distribution*   
       D.4 Asymptotic Distribution Theory*   
            D.4.1 Probability Limits   
            D.4.2 Asymptotic Expectation and Variance   
            D.4.3 Asymptotic Distribution  
       D.5 Properties of Estimators      
            D.5.1 Bias  
                Asymptotic Bias*  
            D.5.2 Mean-Squared Error and Efficiency    
                Asymptotic Efficiency*   
            D.5.3 Consistency*  
            D.5.4 Sufficiency*   
       D.6 Maximum-Likelihood Estimation         
                Generalization of the Example*    
            D.6.1 Properties of Maximum-Likelihood Estimators*   
            D.6.2 Wald, Likelihood-Ratio, and Score Tests  
                An Illustration*         
            D.6.3 Several Parameters*     
       D.7 Recommended Reading    

Last Modified: 22 January 1997 by John Fox