Rating: Summary: Has some value Review: The first step in common applications of probability is to figure out what distribution to use. That depends on the structure of your problem, and it is up to you. This book helps with the second step: figuring out what values of that distribution's parameters give the best fit to your data.

It was certainly interesting to see least squares regression derived from maximum likelihood (ML), instead of the usual geometric interpretation. It was quite worthwhile to see a mutual dependence metric that works where ordinary "correlation" doesn't. It also broadened my view, a little, to see standard linear and nonlinear solution techniques in a different notation than usual. As you can see from the breadth of topics in this slim (82-page) book, the author covers a good bit of territory tangential to ML; in a larger book, that could have turned into a serious organization problem.

About 10 of the book's pages give sample code in the Gauss language. That language isn't in the mainstream of engineering computing, but a Matlab or Mathematica user can read it easily enough.

I did learn a few useful things from this book, and I won't be giving it away. It probably won't suit either the beginner in statistics or the specialist, though. If you're in the middle, like me, it has modest value.
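(For readers unfamiliar with the least-squares point above: the book's own presentation isn't reproduced here, but the standard argument is a short one. Assuming a linear model with i.i.d. Gaussian errors, maximizing the likelihood is equivalent to minimizing the sum of squared residuals, which is ordinary least squares.)

```latex
% Minimal sketch, assuming the model y_i = x_i^T \beta + \epsilon_i
% with \epsilon_i \sim N(0, \sigma^2), independent across i.
\log L(\beta,\sigma^2)
  = \sum_{i=1}^{n} \log\!\left[
      \frac{1}{\sqrt{2\pi\sigma^2}}
      \exp\!\left(-\frac{(y_i - x_i^{\top}\beta)^2}{2\sigma^2}\right)
    \right]
  = -\frac{n}{2}\log(2\pi\sigma^2)
    - \frac{1}{2\sigma^2}\sum_{i=1}^{n}\,(y_i - x_i^{\top}\beta)^2 .
% For any fixed \sigma^2, maximizing \log L over \beta is the same as
% minimizing \sum_i (y_i - x_i^{\top}\beta)^2, i.e., the least-squares fit.
```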