R**N
Outstanding guide to Machine Learning using Python
I am updating my review of this book because apparently in my first review I didn't do a very good job, which made the review less than useful. I will try to do a better job this time. If it still isn't helpful, let me know and I will try again.

Like the title says, this book takes an algorithmic approach to teaching machine learning, as opposed to an applied or example-based approach. The expectation is that you get a tutorial on all the main algorithms, rather than on how to put various algorithms together to solve a particular problem in, say, fraud detection.

The contents reveal the algorithmic basis:

1. Introduction (types of machine learning, why you would want to do it in the first place, and a quick introduction to supervised learning).
2. Preliminaries (key ideas about the problem of overfitting and what I consider the most important topic: how to test and know when you have a program that has learned something other than the noise). Here the author also covers some ideas about the role of probability. Calling it "turning data into probabilities" is a bit odd, but that's really what we do. Early on he gets the key ideas of the ROC curve out of the way - something many texts just gloss over. I think the secret to understanding machine learning is understanding the idea behind the bias-variance trade-off (it is also handled very well in The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Second Edition (Springer Series in Statistics), which I used to teach a class and read before I read this book).
3. Artificial neural networks, starting with the perceptron and why you would want to go beyond linear discriminators.
4. The multilayer ANN.
5. Radial basis functions and splines - this is interesting because Andrew Ng presents linear regression as the most basic learning algorithm in his Coursera course, which means all of the fitting methods, even when not used for classification, are relevant.
6. A section on dimensionality reduction - feature selection and other methods like PCA and even factor analysis (most people stop with PCA, which I personally think is a mistake, because you can accidentally end up keeping the features with all the noise and throwing out the meaningful linear combinations).
7. A cool section not seen in basic books on probabilistic methods - sure, everyone teaches k-NN, but this one has a nice discussion of Gaussian mixture models.
8. The support vector machine. Most people don't get introduced to the idea that ANNs and SVMs are actually very similar - they are both large-margin classifiers, so knowing something about SVMs will help even if you end up with some other large-margin classifier (with or without kernels).
9. Search and optimization. The way Ng teaches machine learning, you always begin with the error surface, take the derivative, and then search for a minimum of the learning function. You quit training when you have minimized the error on the training set without driving the error too high on the validation set - so in a way all these approaches are optimization methods.
10. A whole section on genetic algorithms (which I jumped to first) - a very clear explanation and a good example that really ran, so I could see what was going on.
11. Reinforcement learning.
12. Learning with trees - CART trees end the chapter, something everyone working in this area should know about. He saves random forests for the next section (where I suppose they really belong).
13. Bagging and boosting, followed by a comparison showing how a collection of weak learners (like stubby random trees) becomes a powerful tool - the idea behind random forests.
14. Unsupervised learning. People tend to focus on supervised learning for a very good reason, but there are lots of examples where the cost of putting a label on a data example is too high, so an unsupervised method is a good call.
15. Coverage of Markov chain Monte Carlo (MCMC) methods - again, something that does not get covered in every applied book.
16. Graphical models - Bayesian networks and probabilistic network models, along with hidden Markov models.
17. Deep belief networks.
18. Gaussian process regression and classification.

The book concludes with an appendix on Python - getting started, etc. I don't think this is quite enough Python unless you are already pretty familiar with the language.

A critic of my first review suggested that I just bashed R and didn't talk about the book - not a completely unfair statement. R keeps data in data frames, while Python is much more list- and dictionary-based. Data frames and collections are related, there are ways to do list comprehensions in both languages, and Python has a data frame package (called Pandas) to make R-like constructs easier if you happen to be coming from R and like them. Both are good languages, but I will stand by my original statement that R is a statistical language at its core. Many of the packages are written in C, so they are fast (like the ones written for Python). It has been my experience that the open-source tools for R development are just what the commentator said: adequate. In my humble opinion, RStudio has a lot of catching up to do to be as good as professional tools like the JetBrains products (PyCharm). Look at MATLAB compared to Octave. At least the community version of PyCharm is free. RStudio is not fast, and the dirty secret of R is that everything you do has to fit in memory at once, so you have to be very careful with memory management or you will run out and R will crash - it happens to me every day. Almost all of the ML methods are statistically based, so R and all the books (like An Introduction to Statistical Learning: with Applications in R (Springer Texts in Statistics)) are totally brilliant.
But if you want to see what is under the hood, I suggest you look at Advanced R (Chapman & Hall/CRC The R Series). This will give you a deep dive into the internals. Compare it to Python Scripting for Computational Science (Texts in Computational Science and Engineering) and make the call for yourself. I have used both R and Python for prototyping advanced algorithms and for putting code live in production. What tipped the scale for me was productivity. Now that the data science community has started building state-of-the-art tools for Python (not to say anything negative about the statistics community, who put all of machine learning on a solid footing), I prefer a rapid development language with good tools, test-first frameworks, and solid software engineering practices as part of the culture. The book reviewed here lets you learn almost all of the algorithms used for machine learning, and in the end you will be able to produce fast, readable, testable code.
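To make the point about error surfaces and optimization (chapter 9 above) concrete, here is a minimal sketch of gradient descent on a squared-error surface in plain numpy. The toy data, learning rate, and loop count are my own choices for illustration, not taken from the book:

```python
import numpy as np

# Toy example: fit y = w * x by gradient descent on the mean squared error.
x = np.array([1.0, 2.0, 3.0])
y = 2.0 * x  # the true weight is 2.0

w = 0.0    # initial guess
lr = 0.05  # learning rate
for _ in range(200):
    error = w * x - y                 # residuals on the error surface
    grad = 2.0 * np.mean(error * x)   # derivative of the MSE with respect to w
    w -= lr * grad                    # step downhill

print(round(w, 3))  # converges to 2.0
```

Every algorithm in chapters 3-9 is some variation on this loop: define an error, take a derivative (or approximate one), and search for a minimum while watching the validation error.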
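And to illustrate the Pandas point above - that data frames and Python collections are close cousins - here is a small sketch; the column names and values are made up for the example:

```python
import pandas as pd

# An R-style data frame in Python via pandas (hypothetical example data)
df = pd.DataFrame({"feature": [1.0, 2.0, 3.0], "label": ["a", "b", "a"]})

# R-like filtering with a boolean mask, much like df[df$label == "a", ] in R
subset = df[df["label"] == "a"]

# An ordinary Python list comprehension over the same column
doubled = [v * 2 for v in df["feature"]]
print(doubled)  # [2.0, 4.0, 6.0]
```

Nothing here would surprise an R user, which is exactly why Pandas eases the transition.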
H**I
I don't think it is a good book. The author put a lot of effort into using ...
I don't think it is a good book. The author put a lot of effort into using many examples and analogies to make it easier to comprehend. However, he seems to fail to explain the actual theory clearly and completely.
R**Y
Great machine learning book for those with rusty math skills or who are less math inclined.
For those diving into machine learning who are rusty at math or not math experts, this is a solid, understandable book on the topic. It covers a wide variety of machine learning algorithms, and while it does include some math, the math isn't the primary and only focus like in other books on the topic. The math sections are less involved, giving the formulas and the basic information the various algorithms are based on, but most importantly they are accompanied by easier-to-understand explanations and pseudocode/actual code implementations. I certainly hope the author continues to update and maintain this book over time - I've shown this book to coworkers who are also less math inclined, and they liked the way the book was written and were interested in picking up their own copies.
M**T
Go-to ML book for my class.
I use this book in my machine learning class. It gives a really good implementation-focused view of the innards (algorithms) of machine learning. I like his style. While using Python is great, his use of numpy leads to the occasional confusion. But the 2nd edition is definitely better than the first, which was great. I am looking forward to teaching from this book for years to come.
M**D
Impressively speedy service and product in mint condition
This was the best transaction I've had! I ordered and got my book delivered to Kuwait in less than 2 days! Getting my parcel that early was exciting enough, and then I opened the package. The book was in perfect brand-new condition - as if I had just gotten it out of the oven! I am amazed by the level of service, and I would seek them out to buy more books in the future. Thank you!
K**O
An easy-reading book
Generally, it's a good book. But there are some errors, and some of the math notation is confusing.