Direct Download:
Build with modern libraries like TensorFlow, Theano, Keras, PyTorch, CNTK, and MXNet. Train faster with a GPU on AWS.
- SnowFiles Link
http://snowfiles.com/l4cjrqg2nvts
What you'll learn
- Apply momentum to backpropagation to train neural networks
- Apply adaptive learning rate procedures like AdaGrad, RMSprop, and Adam to backpropagation to train neural networks
- Understand the basic building blocks of Theano
- Build a neural network in Theano
- Understand the basic building blocks of TensorFlow
- Build a neural network in TensorFlow
- Build a neural network that performs well on the MNIST dataset
- Understand the difference between full gradient descent, batch gradient descent, and stochastic gradient descent
- Understand and implement dropout regularization in Theano and TensorFlow
- Understand and implement batch normalization in Theano and TensorFlow
- Write a neural network using Keras
- Write a neural network using PyTorch
- Write a neural network using CNTK
- Write a neural network using MXNet
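To give a flavor of the first two bullets, here is a minimal NumPy sketch of the momentum and Adam update rules covered in the course. This is an illustrative toy (a simple quadratic objective), not code from the course itself; the function names and hyperparameter defaults are our own choices.

```python
import numpy as np

def sgd_momentum(grad, w0, lr=0.1, mu=0.9, steps=200):
    """Classical momentum: a velocity term accumulates past gradients."""
    w, v = w0.copy(), np.zeros_like(w0)
    for _ in range(steps):
        v = mu * v - lr * grad(w)   # decay old velocity, add new gradient step
        w = w + v
    return w

def adam(grad, w0, lr=0.1, b1=0.9, b2=0.999, eps=1e-8, steps=200):
    """Adam: per-parameter adaptive learning rates from running moment estimates."""
    w = w0.copy()
    m = np.zeros_like(w0)  # first moment (mean of gradients)
    s = np.zeros_like(w0)  # second moment (mean of squared gradients)
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g
        s = b2 * s + (1 - b2) * g * g
        mhat = m / (1 - b1 ** t)   # bias correction for early steps
        shat = s / (1 - b2 ** t)
        w = w - lr * mhat / (np.sqrt(shat) + eps)
    return w

# Toy objective f(w) = ||w||^2 / 2, so grad(w) = w; the minimum is at 0.
grad = lambda w: w
w0 = np.array([5.0, -3.0])
print(sgd_momentum(grad, w0))
print(adam(grad, w0))
```

Both optimizers drive the weights toward the minimum; in the course these same updates are applied to neural network parameters rather than a toy quadratic.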
Lazy Programmer Inc.
Artificial intelligence and machine learning engineer
Today, I spend most of my time as an artificial intelligence and machine learning engineer with a focus on deep learning, although I have also been known as a data scientist, big data engineer, and full stack software engineer.
I received my master's degree in computer engineering with a specialization in machine learning and pattern recognition.
Experience includes online advertising and digital media as both a data scientist (optimizing click and conversion rates) and big data engineer (building data processing pipelines). Some big data technologies I frequently use are Hadoop, Pig, Hive, MapReduce, and Spark.
I've created deep learning models to predict click-through rate and user behavior, as well as for image and signal processing and modeling text.
My work in recommendation systems has applied Reinforcement Learning and Collaborative Filtering, and we validated the results using A/B testing.
I have taught undergraduate and graduate students in data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics at universities such as Columbia University, NYU, Hunter College, and The New School.
Multiple businesses have benefited from my web programming expertise. I do all the backend (server), frontend (HTML/JS/CSS), and operations/deployment work. Some of the technologies I've used are: Python, Ruby/Rails, PHP, Bootstrap, jQuery (JavaScript), Backbone, and Angular. For storage/databases I've used MySQL, Postgres, Redis, MongoDB, and more.
Student feedback
Tan Bui
I had already studied deep learning on Coursera prior to this course.
Although the theory is similar, this course provided a more practical
implementation of the concepts, especially Grid Search and Random
Search. I neither understood these two topics well nor paid much
attention to them while studying on Coursera, but Lazy Prog has helped
me understand them very well.
hirai m
For theory, it was easy to understand thanks to his concise but
deliberate explanations. In terms of coding, the lectures were a little
faster paced than his previous prerequisite courses: somewhat packed
with review code from those prerequisites alongside the new optimization
algorithms. But in the end, through patient practice, I felt I finally
got used to these algorithms.
Caio Martins Ramos
I've just taken the Deep Learning part 1 course. So far so good. I
really like the fact that Lazy Programmer goes into the math. It makes
things much simpler for me. For instance, I was finally able to
understand backpropagation, which is much simpler than I expected
before seeing the math (it makes me wonder why people are so
eager to avoid the math!). As a suggestion, Lazy Programmer, you could
add a lecture on Einstein notation in the appendix of the part 1
course. Although not imperative, it made calculating the derivatives of
the cost functions a bit less cumbersome for me (you can easily get lost
with all those indices!).