Hey, I agree with a lot of what you're saying, but it seems unfair to cherry-pick a single method like this. There's a lot of confusing JavaScript out there, too.
In case this comment dissuades others from checking out Dr. Koller's class, I will add that I found her class challenging and well-constructed. I recommend it.
A year ago, I would have had no idea what this was about. I'm very thankful that I live in a world where I can take high-quality classes from Coursera that have given me the foundation to at least understand the abstract. :)
I struggle with the mathematics used in neural networks. I can understand code but as soon as I start to see calculus my brain freezes over. Does anyone know of a good online course that can give me a crash course in the mathematics required for neural networks?
My CS bachelor's covered this, but I was lazy and drank too much beer. Now, 20 years later, I want to understand it properly.
I've taken Andrew Ng's Machine Learning class, Daphne Koller's Probabilistic Graphical Models class, Dan Jurafsky and Christopher Manning's Natural Language Processing class, and currently Geoff Hinton's Neural Networks class.
I have spent a lot of time on Khan Academy learning the calculus. In my experience you can get by with a surprisingly small amount of calculus, but it happens to be a small amount drawn from a fairly high level.
For example, backpropagation is just repeated application of the chain rule. It did take a while to get a handle on the derivatives, but it's worth it.
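To make that concrete, here's a toy sketch (my own, not from any of the courses) of the chain rule doing the work of backprop for a single sigmoid neuron, checked against a numerical derivative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass for a one-neuron "network": y_hat = sigmoid(w*x + b)
# Loss: L = 0.5 * (y_hat - y)^2
# Backprop is just the chain rule: dL/dw = dL/dy_hat * dy_hat/dz * dz/dw
def grad_w(w, b, x, y):
    z = w * x + b
    y_hat = sigmoid(z)
    dL_dyhat = y_hat - y            # derivative of 0.5*(y_hat - y)^2
    dyhat_dz = y_hat * (1 - y_hat)  # derivative of the sigmoid
    dz_dw = x                       # derivative of w*x + b w.r.t. w
    return dL_dyhat * dyhat_dz * dz_dw

# Sanity check against a central finite difference
w, b, x, y = 0.5, 0.1, 2.0, 1.0
eps = 1e-6
def loss(w_):
    return 0.5 * (sigmoid(w_ * x + b) - y) ** 2
numeric = (loss(w + eps) - loss(w - eps)) / (2 * eps)
print(abs(grad_w(w, b, x, y) - numeric) < 1e-8)  # analytic matches numeric
```

A multi-layer net is the same idea applied layer by layer, multiplying the local derivatives together as you walk back through the graph.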
Do the theano deep learning tutorials. And keep hacking away at the math--you need it, but it eventually sinks in and becomes reasonably intuitive. Starting with code helped me grasp the math (I'm also much more comfortable reading code than math).
If I have learned anything from ml-class, pgm-class, nlp-class, and now neural-nets, it's that becoming a data scientist is one of the hardest things I'll ever eventually succeed in doing.
The good man was trying the opposite: make things as simple as possible for early students and get them to do the hardest thing they will ever do: "solve some problems using these tools".
Instead of chasing after some title ("data scientist ..."), find ways of solving some useful problems with whatever you've learnt. The article argues that becoming an expert is difficult, which may be true. But that doesn't mean you don't know enough to start digging into the problems at hand.
Yeah, I did a few trial runs on the cluster GPU instances maybe 4 months ago. I found that, while the GPUs themselves were really quite fast, moving data in and out of the GPU was not. Maybe Amazon will focus on increasing bandwidth to the GPU for the new boxes.
To be fair, there shouldn't be a lot of data transfer to and from the GPU. When you're running out of memory, moving larger chunks of data (instead of many smaller ones) and using asynchronous data/compute streams would increase performance.
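The "fewer, larger transfers" point can be sketched without any GPU at all. This is a toy host-side illustration (names like `copy_to_device` are hypothetical stand-ins for a real host-to-GPU copy, e.g. via PyCUDA or a theano shared variable): pack many small samples into a few large contiguous buffers and ship those instead of one copy per sample.

```python
import numpy as np

def batched_transfers(samples, buffer_bytes):
    """Pack small arrays into large contiguous buffers, one 'transfer' each."""
    per_sample = samples[0].nbytes
    per_buffer = max(1, buffer_bytes // per_sample)  # samples per transfer
    transfers = []
    for i in range(0, len(samples), per_buffer):
        chunk = np.concatenate(samples[i:i + per_buffer])  # one big buffer
        transfers.append(chunk)  # would be one copy_to_device(chunk) call
    return transfers

samples = [np.zeros(1024, dtype=np.float32) for _ in range(1000)]  # 4 KB each
chunks = batched_transfers(samples, buffer_bytes=1 << 20)  # 1 MB buffers
print(len(chunks))  # 1000 tiny copies collapse into 4 big ones
```

Each transfer over PCIe pays a fixed latency cost, so amortizing it over a megabyte instead of a few kilobytes is where the win comes from; overlapping those copies with compute streams hides it further.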
"The numbers are, of course, averages. So parents may spend much more time, say, caring for children, while people without children will spend no time at all."
I don't think that would be a classifier, or at least not reasonably. You could have "In Soviet Russia X Y you" for each X, Y as your classes, but that would be unreasonable.
Yakov Smirnoff is a structural joke. You would need to parse sentences, pattern match, transform it, and then do some kind of regression on the phrase to get its humor quotient.
The Stanford Parser for structural parsing, then some custom pattern matching and transforming code, might get you somewhere.
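Just to show how shallow the "transform" step can start out, here's a deliberately naive toy (mine, not using the Stanford Parser) that regex-matches a three-word "X verb Y" sentence and reverses it. A real pipeline would parse the sentence properly to find subject and object, then score the result; this is only the skeleton of the idea.

```python
import re

# Naive structural pattern: "<subject> <verb> <object>." -- no real parsing,
# so it only handles trivial three-word sentences.
PATTERN = re.compile(r"^(?P<subj>\w+) (?P<verb>\w+) (?P<obj>\w+)\.?$")

def russian_reversal(sentence):
    m = PATTERN.match(sentence)
    if m is None:
        return None  # structure not recognized; a parser would do better
    return "In Soviet Russia, {obj} {verb} {subj}!".format(
        obj=m.group("obj"), verb=m.group("verb"), subj=m.group("subj"))

print(russian_reversal("You watch television."))
# → In Soviet Russia, television watch You!
```

Note it doesn't even fix verb agreement, let alone judge whether the output is funny; the regression-on-humor-quotient part is the genuinely hard piece.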
It's at a very early stage (not production-ready), but some colleagues and I have been working on an automated-testing DSL built on top of RSpec that lets you generate API documentation from tests that validate the documentation is correct. It requires RSpec and a Rack-compatible service (incl. Rails). I think the idea has legs.