
Regression Analysis (English Edition) [回歸分析(英文)]

Authors: 武萍 (Wu Ping), 吳賢毅 (Wu Xianyi)
Publisher: Tsinghua University Press (清華大學)
Publication date: August 1, 2016
ISBN: 9787302446309
Language: Traditional Chinese listing (the book itself is in English)
Price: 204元

The book is organized in three parts: (1) the matrix theory and the multivariate normal distribution needed for linear regression analysis; (2) the basic theory and methods of linear regression, including the general small-sample theory of linear estimation, the F-test for linear hypotheses, analysis-of-variance theory based on linear models, variable selection, collinearity, outliers, and the Box-Cox model, among other topics related to linear regression; (3) the basic theory and methods of the logistic regression model for categorical response variables.

The book assumes a solid grounding in advanced algebra (or linear algebra) and in probability and mathematical statistics. A distinguishing feature is that it develops the theory of linear regression analysis with as few prerequisites as possible; it also includes some SAS code to assist with data processing in practical applications. It can serve as an undergraduate textbook on regression analysis for statistics, mathematics, and other related majors, and as a reference on the foundations of regression analysis for researchers outside statistics.

About the author: 吳賢毅 (Wu Xianyi) is a professor and doctoral supervisor at the School of Finance and Statistics, East China Normal University. His research covers stochastic scheduling, probability and statistics, and non-life insurance actuarial science, areas in which he has published dozens of research papers in leading international journals. His work on stochastic scheduling has been funded three times by the National Natural Science Foundation of China, with results appearing in Operations Research, European Journal of Operational Research, and Journal of Scheduling.
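The least-squares core of part (2) can be illustrated with a short sketch. The book's own examples use SAS; the NumPy version below is my own illustration, not code from the book, and the data are simulated.

```python
import numpy as np

# Ordinary least squares for the linear model y = X beta + error.
rng = np.random.default_rng(0)
n, p = 100, 3
# Design matrix with an intercept column plus two simulated regressors.
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# LS estimate beta_hat = (X'X)^{-1} X'y, computed via a stable solver.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Unbiased estimate of the error variance: RSS / (n - p).
rss = np.sum((y - X @ beta_hat) ** 2)
sigma2_hat = rss / (n - p)
print(beta_hat, sigma2_hat)
```

With simulated data the estimates land close to `beta_true` and `sigma2_hat` close to the true error variance 0.01, matching the small-sample theory the book develops for the LSE.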

Table of Contents

Chapter 1 Preliminaries: Matrix Algebra and Random Vectors
  1.1 Preliminary matrix algebra
    1.1.1 Trace and eigenvalues
    1.1.2 Symmetric matrices
    1.1.3 Idempotent matrices and orthogonal projection
    1.1.4 Singular value decomposition
    1.1.5 Vector differentiation and generalized inverse
    1.1.6 Exercises
  1.2 Expectation and covariance
    1.2.1 Basic properties
    1.2.2 Mean and variance of quadratic forms
    1.2.3 Exercises
  1.3 Moment generating functions and independence
    1.3.1 Exercises
Chapter 2 Multivariate Normal Distributions
  2.1 Definitions and fundamental results
  2.2 Distribution of quadratic forms
  2.3 Exercises
Chapter 3 Linear Regression Models
  3.1 Introduction
  3.2 Regression interpreted as conditional mean
  3.3 Linear regression interpreted as linear prediction
  3.4 Some nonlinear regressions
  3.5 Typical data structure of linear regression models
  3.6 Exercises
Chapter 4 Estimation and Distribution Theory
  4.1 Least squares estimation (LSE)
    4.1.1 Motivation: why is LS reasonable
    4.1.2 The LS solution
    4.1.3 Exercises
  4.2 Properties of LSE
    4.2.1 Small sample distribution-free properties
    4.2.2 Properties under normally distributed errors
    4.2.3 Asymptotic properties
    4.2.4 Exercises
  4.3 Estimation under linear restrictions
  4.4 Introducing further explanatory variables and related topics
    4.4.1 Introducing further explanatory variables
    4.4.2 Centering and scaling the explanatory variables
    4.4.3 Estimation in terms of linear prediction
    4.4.4 Exercises
  4.5 Design matrices of less than full rank
    4.5.1 An example
    4.5.2 Estimability
    4.5.3 Identifiability under constraints
    4.5.4 Dropping variables to change the model
    4.5.5 Exercises
  4.6 Generalized least squares
    4.6.1 Basic theory
    4.6.2 Incorrect specification of variance matrix
    4.6.3 Exercises
  4.7 Bayesian estimation
    4.7.1 The basic idea
    4.7.2 Normal-noninformative structure
    4.7.3 Conjugate priors
  4.8 Numerical examples
  4.9 Exercises
Chapter 5 Testing Linear Hypotheses
  5.1 Linear hypotheses
  5.2 F-test
    5.2.1 F-test
    5.2.2 What are actually tested
    5.2.3 Examples
  5.3 Confidence ellipse
  5.4 Prediction and calibration
  5.5 Multiple correlation coefficient
    5.5.1 Variable selection
    5.5.2 Multiple correlation coefficient: straight line
    5.5.3 Multiple correlation coefficient: multiple regression
    5.5.4 Partial correlation coefficient
    5.5.5 Adjusted multiple correlation coefficient
  5.6 Testing linearity: goodness-of-fit test
  5.7 Multiple comparisons
    5.7.1 Simultaneous inference
    5.7.2 Some classical methods for simultaneous intervals
  5.8 Univariate analysis of variance
    5.8.1 ANOVA model
    5.8.2 ANCOVA model
    5.8.3 SAS procedures for ANOVA
  5.9 Exercises
Chapter 6 Variable Selection
  6.1 Impact of variable selection
  6.2 Mallows' Cp
  6.3 Akaike's information criterion (AIC)
    6.3.1 Preliminaries: asymptotic normality of MLE
    6.3.2 Kullback-Leibler distance
    6.3.3 Akaike's information criterion
    6.3.4 AIC for linear regression
  6.4 Bayesian information criterion (BIC)
  6.5 Stepwise variable selection procedures
  6.6 Some newly proposed methods
    6.6.1 Penalized RSS
    6.6.2 Nonnegative garrote
  6.7 Final remarks on variable selection
  6.8 Exercises
Chapter 7 Miscellaneous for Linear Regression
  7.1 Collinearity
    7.1.1 Introduction
    7.1.2 Examine collinearity
    7.1.3 Remedies
  *7.2 Some remedies for collinearity
    7.2.1 Ridge regression
    7.2.2 Principal component regression
    7.2.3 Partial least squares
    7.2.4 Exercises
  7.3 Outliers
    7.3.1 Introduction
    7.3.2 Single outlier
    7.3.3 Multiple outliers
    7.3.4 Relevant quantities
    7.3.5 Remarks
  7.4 Testing features of errors
    7.4.1 Serial correlation and Durbin-Watson test
    7.4.2 Testing heteroskedasticity and related topics
  7.5 Some extensions and variants
    7.5.1 Box-Cox model
    7.5.2 Modeling the variances
    7.5.3 A remark
Chapter 8 Logistic Regression: Modeling Categorical Responses
  8.1 Logistic regression
    8.1.1 Logistic regression for dichotomous responses
    8.1.2 Likelihood function for logistic regression
    8.1.3 Interpreting the logistic regression
  8.2 Multiple logistic regression
    8.2.1 Maximum likelihood estimation for multiple logistic regression
  8.3 Inference for logistic regression
    8.3.1 Inference for simple logistic regression
    8.3.2 Inference for multiple logistic regression
  8.4 Logistic regression for multinomial responses
    8.4.1 Nominal responses: baseline-category logistic regression
    8.4.2 Ordinal responses: cumulative logistic regression
  8.5 Exercises
Bibliography
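Chapter 8's logistic regression for dichotomous responses can likewise be sketched briefly. This Newton-Raphson maximum likelihood fit in NumPy is my own illustration, not the book's SAS code; the data and coefficients are simulated.

```python
import numpy as np

# Logistic regression for a 0/1 response: P(y=1|x) = 1 / (1 + exp(-x'beta)),
# fitted by Newton-Raphson iterations on the log-likelihood.
rng = np.random.default_rng(1)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one regressor
beta_true = np.array([-0.5, 1.5])
p_true = 1.0 / (1.0 + np.exp(-X @ beta_true))
y = rng.binomial(1, p_true)

beta = np.zeros(2)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (y - p)                  # score vector
    hess = X.T @ (X * (p * (1 - p))[:, None])  # Fisher information
    beta = beta + np.linalg.solve(hess, grad)
print(beta)  # maximum likelihood estimate, close to beta_true
```

For this model the observed and expected information coincide, so the Newton step above is also the iteratively reweighted least squares (IRLS) step that standard software uses.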

