Unsupervised Thoughts

Deep Learning - Week 4

Week 4 Material

Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization - Week 2

This week covered the following topics (a small sketch tying them together follows the list):

  • Apply optimization methods such as (Stochastic) Gradient Descent, Momentum, RMSProp and Adam
  • Use random minibatches to accelerate convergence and improve optimization
  • Describe the benefits of learning rate decay and apply it to your optimization

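To make those bullet points concrete for myself, here's a minimal sketch combining all three ideas: random minibatches, the Adam update with bias correction, and simple 1/(1 + decay_rate * epoch) learning rate decay. This is my own toy NumPy example on a linear-regression problem, not the course's assignment code, and all the variable names and hyperparameter values are arbitrary choices on my part.

```python
# A toy sketch of this week's ideas: random minibatches, Adam, and learning rate decay.
# Not the course's assignment code; hyperparameters here are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 3x + 1 plus a little noise.
X = rng.normal(size=(256, 1))
y = 3 * X[:, 0] + 1 + 0.1 * rng.normal(size=256)

# Parameters and Adam state (first and second moment estimates).
w, b = 0.0, 0.0
m = np.zeros(2)                      # first moment
v = np.zeros(2)                      # second moment
alpha0, beta1, beta2, eps = 0.1, 0.9, 0.999, 1e-8
decay_rate = 0.01
batch_size = 32
t = 0                                # Adam time step

for epoch in range(50):
    # Learning rate decay: shrink alpha as training progresses.
    alpha = alpha0 / (1 + decay_rate * epoch)

    # Random minibatches: shuffle the indices, then slice.
    perm = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        xb, yb = X[idx, 0], y[idx]

        # Gradient of mean squared error w.r.t. w and b on this minibatch.
        err = w * xb + b - yb
        grad = np.array([np.mean(2 * err * xb), np.mean(2 * err)])

        # Adam update with bias correction.
        t += 1
        m = beta1 * m + (1 - beta1) * grad
        v = beta2 * v + (1 - beta2) * grad ** 2
        m_hat = m / (1 - beta1 ** t)
        v_hat = v / (1 - beta2 ** t)
        w -= alpha * m_hat[0] / (np.sqrt(v_hat[0]) + eps)
        b -= alpha * m_hat[1] / (np.sqrt(v_hat[1]) + eps)

print(f"learned w ~ {w:.2f}, b ~ {b:.2f}")  # should approach w=3, b=1
```

Dropping the Adam-specific lines and using the raw gradient recovers plain minibatch SGD, which is a nice way to see how little extra machinery the fancier optimizers actually add.
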
The more progress I make on the basics of deep learning, the more excited I get about reading the research and running my own experiments. There's so much here that's ripe for progress and new ideas. It shouldn't be hard to find a question that interests me and contributes to the research.

My ChatGPT-generated syllabus recommended the fast.ai CNN lesson, which I didn't get to because I was working on the Coursera course. I may try to get back to it during the week, but at this point I'm really eager to just plough through all five Coursera courses so that I have enough foundational knowledge to start reading papers. Not enough hours in the day.