Linear Models: Least Squares and Alternatives (Springer Series in Statistics)

List Price: $89.95
Your Price: $89.95

Rating: 4 stars
Summary: linear models and much more by renowned experts
Review: C. R. Rao is one of the most famous statisticians living today. He has written many important books and produced fundamental results in mathematical statistics. Helge Toutenburg is well known for his numerous publications on linear models. As is Rao's style, this text is jam-packed with useful theoretical results that are sometimes difficult to digest because of the concise treatment; I found his classic text "Linear Statistical Inference and Its Applications" that way as well. Although Rao is famous for his fundamental research work in the 1940s and 1950s, this book is very modern. Rao has always kept abreast of new developments in statistics and related fields.

I bought the book based on dataguru's Amazon recommendation and a subsequent email correspondence, and I was not disappointed. The book starts out covering classical linear models and regression but then goes on to cover problems involving fixed and stochastic constraints. Although Chapter 3 starts out with least squares regression, it goes on to cover projection pursuit and censored regression and includes various alternative estimation procedures beyond least squares. For the case of collinearity, principal components regression, ridge regression and shrinkage estimators are offered. Nonparametric regression, logistic regression and neural networks are all covered in this amazing Chapter 3.
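To make the collinearity point concrete, here is a minimal sketch (my own illustration, not code from the book) comparing ordinary least squares with ridge regression on two nearly collinear predictors. The simulated data, the choice of lambda, and all variable names are invented for the example:

```python
import numpy as np

# Two predictors that are nearly collinear: x2 is x1 plus tiny noise.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)
X = np.column_stack([x1, x2])
y = X @ np.array([1.0, 1.0]) + rng.normal(scale=0.5, size=n)

# Ordinary least squares: solve (X'X) beta = X'y.
# With near-collinear columns, X'X is ill-conditioned and the
# individual coefficients become unstable.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge regression: add lambda to the diagonal of X'X, which
# shrinks the coefficients and stabilizes the solution.
lam = 1.0
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)

print("OLS:  ", beta_ols)
print("Ridge:", beta_ridge)
```

With lam > 0 the ridge solution always has a smaller coefficient norm than the least squares solution, which is exactly the shrinkage behavior the book discusses as a remedy for collinearity.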

The text provides a very current and thorough list of relevant references. Other nice features of this second edition include a completely revised and updated chapter on missing data; much of the unusual material in Chapter 3, including restricted regression and neural networks; Kalman filtering in Chapter 6; and the use of empirical Bayes methods in Chapter 4 for the simultaneous solution of parameter estimates in different linear models.

This book will be a treasured reference source. I may have to search through it carefully to discover hidden treasures; Rao's conciseness has that effect. I found that "Linear Statistical Inference and Its Applications" had a lot more to offer than I first thought. It was a required text for my mathematical statistics course at Stanford but served more as a reference than as a course text, and while taking the course I did not find time to use it much. Many years later, though, I looked through it and was amazed at all the deep and important theoretical results it contained. I expect the same from this book.

Rating: 5 stars
Summary: A thought-provoking book and a joy to read
Review: I recently got a copy of this book (first edition). Whenever I try to look up a result I need, I find it, in its most general and accurate form, which is a typical experience with Rao's books, such as his other classic on linear inference, and then I find myself digging deeper and deeper into other parts of the book. While linear model books and courses are typically boring and contain little new, I find new and deep results everywhere in this book, and it is a joy and a refreshing experience. For example, the discussion of the generalized linear model in the context of the heteroscedastic linear model is very natural. The chapter on linear and stochastic constraints is a must-read for anybody who deals with high-dimensional and complex data. The prediction theory is very novel and general.

After closing this book, I wondered what more could be said about linear models. Obviously they are useful, not obsolete or unrealistic as they are often misconceived to be. The moral is to use them in the proper context and to stay wary of violations of the model assumptions; this book offers plenty of tests and remedies for the latter. A modern view is that many nonlinear methods can be treated as extensions of linear models, such as nonparametric regression (linear smoothers and the local polynomial method), neural networks, etc., and the second edition of this book has added substantial material in this regard.

In all, I recommend this book as an excellent textbook for a second course on linear models, a must-read for researchers dealing with any aspect of linear models, and a must-have reference for anyone who needs to look up the most complete and updated results on linear models.
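The reviewer's remark that nonparametric regression fits the linear-model framework can be sketched as follows: a kernel smoother is "linear" in y because the fitted values are S @ y for a smoother matrix S built only from x, just as OLS fitted values are a fixed matrix times y. This is a toy illustration of my own, not code from the book; the Gaussian kernel and the bandwidth value are arbitrary choices:

```python
import numpy as np

def gaussian_smoother_matrix(x, bandwidth=0.3):
    # S[i, j] is proportional to a Gaussian kernel in (x_i - x_j),
    # with each row normalized to sum to 1 (Nadaraya-Watson weights).
    d = (x[:, None] - x[None, :]) / bandwidth
    K = np.exp(-0.5 * d**2)
    return K / K.sum(axis=1, keepdims=True)

x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x) + 0.1 * np.random.default_rng(1).normal(size=50)

S = gaussian_smoother_matrix(x)
y_hat = S @ y   # fitted values are a linear map of y, like in a linear model
```

Because S depends only on the design points x, all the linear-model machinery (degrees of freedom via the trace of S, variance formulas, and so on) carries over to smoothers of this form.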



© 2004, ReviewFocus or its affiliates