
Linear models : an integrated approach / Debasis Sengupta, Sreenivasa Rao Jammalamadaka.

By: Sengupta, Debasis
Contributor(s): Jammalamadaka, Sreenivasa Rao
Material type: Text
Series: Series on multivariate analysis ; v. 6
Publication details: River Edge, N.J. : World Scientific, ©2003
Description: 1 online resource (xxi, 622 pages) : illustrations
Content type:
  • text
Media type:
  • computer
Carrier type:
  • online resource
ISBN:
  • 981256490X
  • 9789812564900
  • 9789810245924
  • 9810245920
Subject(s):
Genre/Form:
Additional physical formats: Print version: Linear models
DDC classification:
  • 519.5 22
LOC classification:
  • QA279 .S46 2003eb
Online resources:
Contents:
Ch. 1. Introduction. 1.1. The linear model. 1.2. Why a linear model? 1.3. Description of the linear model and notations. 1.4. Scope of the linear model. 1.5. Related models. 1.6. Uses of the linear model. 1.7. A tour through the rest of the book. 1.8. Exercises --
Ch. 2. Review of linear algebra. 2.1. Matrices and vectors. 2.2. Inverses and generalized inverses. 2.3. Vector space and projection. 2.4. Column space. 2.5. Matrix decompositions. 2.6. Löwner order. 2.7. Solution of linear equations. 2.8. Optimization of quadratic forms and functions. 2.9. Exercises --
Ch. 3. Review of statistical results. 3.1. Covariance adjustment. 3.2. Basic distributions. 3.3. Distribution of quadratic forms. 3.4. Regression. 3.5. Basic concepts of inference. 3.6. Point estimation. 3.7. Bayesian estimation. 3.8. Tests of hypotheses. 3.9. Confidence region. 3.10. Exercises --
Ch. 4. Estimation in the linear model. 4.1. Linear estimation: some basic facts. 4.2. Least squares estimation. 4.3. Best linear unbiased estimation. 4.4. Maximum likelihood estimation. 4.5. Fitted value, residual and leverage. 4.6. Dispersions. 4.7. Estimation of error variance and canonical decompositions. 4.8. Reparametrization. 4.9. Linear restrictions. 4.10. Nuisance parameters. 4.11. Information matrix and Cramér-Rao bound. 4.12. Collinearity in the linear model. 4.13. Exercises --
Ch. 5. Further inference in the linear model. 5.1. Distribution of the estimators. 5.2. Confidence regions. 5.3. Tests of linear hypotheses. 5.4. Prediction in the linear model. 5.5. Consequences of collinearity. 5.6. Exercises --
Ch. 6. Analysis of variance in basic designs. 6.1. Optimal design. 6.2. One-way classified data. 6.3. Two-way classified data. 6.4. Multiple treatment/block factors. 6.5. Nested models. 6.6. Analysis of covariance. 6.7. Exercises.
Ch. 7. General linear model. 7.1. Why study the singular model? 7.2. Special considerations with singular models. 7.3. Best linear unbiased estimation. 7.4. Estimation of error variance. 7.5. Maximum likelihood estimation. 7.6. Weighted least squares estimation. 7.7. Some recipes for obtaining the BLUE. 7.8. Information matrix and Cramér-Rao bound. 7.9. Effect of linear restrictions. 7.10. Model with nuisance parameters. 7.11. Tests of hypotheses. 7.12. Confidence regions. 7.13. Prediction. 7.14. Exercises --
Ch. 8. Misspecified or unknown dispersion. 8.1. Misspecified dispersion matrix. 8.2. Unknown dispersion: the general case. 8.3. Mixed effects and variance components. 8.4. Other special cases with correlated error. 8.5. Special cases with uncorrelated error. 8.6. Some problems of signal processing. 8.7. Exercises --
Ch. 9. Updates in the general linear model. 9.1. Inclusion of observations. 9.2. Exclusion of observations. 9.3. Exclusion of explanatory variables. 9.4. Inclusion of explanatory variables. 9.5. Data exclusion and variable inclusion. 9.6. Exercises --
Ch. 10. Multivariate linear model. 10.1. Description of the multivariate linear model. 10.2. Best linear unbiased estimation. 10.3. Unbiased estimation of error dispersion. 10.4. Maximum likelihood estimation. 10.5. Effect of linear restrictions. 10.6. Tests of linear hypotheses. 10.7. Linear prediction and confidence regions. 10.8. Applications. 10.9. Exercises --
Ch. 11. Linear inference -- other perspectives. 11.1. Foundations of linear inference. 11.2. Admissible, Bayes and minimax linear estimators. 11.3. Biased estimators with smaller dispersion. 11.4. Other linear estimators. 11.5. A geometric view of BLUE in the linear model. 11.6. Large sample properties of estimators. 11.7. Exercises.
Summary: Linear Models: An Integrated Approach aims to provide a clear and deep understanding of the general linear model using simple statistical ideas. Elegant geometric arguments are also invoked as needed, and a review of vector spaces and matrices is provided to make the treatment self-contained.
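As a small illustration of the book's central object (a hypothetical sketch, not taken from the text): for the linear model y = Xβ + ε with full-column-rank design matrix X, the least squares estimate is β̂ = (XᵀX)⁻¹Xᵀy. All data below are made up for the example.

```python
import numpy as np

# Hypothetical data for the linear model y = X @ beta + error.
rng = np.random.default_rng(0)
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])  # design matrix with intercept
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + 0.1 * rng.normal(size=n)

# Least squares: beta_hat minimizes ||y - X beta||^2. With full column
# rank this equals (X^T X)^{-1} X^T y; lstsq also covers the
# rank-deficient ("singular model") case via a generalized inverse.
beta_hat, _, rank, _ = np.linalg.lstsq(X, y, rcond=None)
print(rank, beta_hat)
```

With small noise the estimate lands close to the true coefficients; the singular (rank-deficient) case, where only certain linear functions of β are estimable, is the subject of the book's chapter 7.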
Holdings:
  • Item type: Electronic-Books; Home library: Electronic-Books, OPJGU Sonepat Campus; Collection: E-Books EBSCO; Status: Available

Includes bibliographical references (pages 587-606) and index.

Print version record.


eBooks on EBSCOhost EBSCO eBook Subscription Academic Collection - Worldwide


O.P. Jindal Global University, Sonepat-Narela Road, Sonepat, Haryana (India) - 131001

Send your feedback to glus@jgu.edu.in

Hosted, Implemented & Customized by: BestBookBuddies   |   Maintained by: Global Library