Unsupervised Thoughts

12 Week Deep Learning Syllabus


Research truly is a winding path. I started out working on interoperability of digital health records: studying protocols, surveying current systems, and considering ways to modify protocols and filesystems, with two very high-level goals:

  1. Put control of the full health record, including charts, results, diagnostic imaging, etc., into the hands of the patient, letting the patient govern the data and approve or revoke access across multiple healthcare systems and providers.
  2. Extend existing filesystems, protocols, and databases to support #1, so that forklifting in new systems wouldn't be required and the technology could be adopted with minimal cost and change.

As I read the literature and consume other media, AI is impossible to avoid right now since it's such a hot topic, so I'm going to deviate from my current path and spend some time getting up to speed on it. What better way to do that than to have ChatGPT recommend a syllabus? Here's the recommendation and my goals for the next 12 weeks.

12-Week Deep Learning Learning Path

Overview

Duration: 12 weeks (adjustable)
Goal: Understand core deep learning concepts, build basic projects, and prepare to apply DL in real-world research — especially in healthcare/security contexts.
Outcome: Be ready to build a thesis prototype or dive into applied research using deep learning tools.


Phase 1: Foundations (Weeks 1–4)

| Week | Focus | Resources | Milestones |
|------|-------|-----------|------------|
| 1 | What is Deep Learning? + Intro to ANNs | Coursera: Neural Networks (Andrew Ng); YouTube: 3Blue1Brown “Neural Networks” | Train a tiny neural net (e.g. XOR classifier) |
| 2 | Loss Functions, Optimizers, Backpropagation | Continue Ng’s course; Blog: Gradient Descent Visualized | Understand how a model “learns” |
| 3 | Deep Neural Networks + Overfitting | Tool: Google Colab | Try a basic image classifier (MNIST); use dropout & regularization |
| 4 | Convolutional Neural Networks (CNNs) | Fast.ai CNN lesson | Visualize filters from a CNN; train a CNN on CIFAR-10 or similar |
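To make the week-1 milestone concrete, here's a minimal NumPy sketch of a tiny neural net learning XOR via hand-written backpropagation (it also previews week 2's loss functions and gradient descent). The architecture, seed, learning rate, and iteration count are my own illustrative choices, not part of the syllabus:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR truth table: inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A 2-4-1 network: one hidden layer of 4 sigmoid units
W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)

lr = 0.5
for _ in range(10000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # backward pass: gradients of binary cross-entropy loss
    d_out = out - y                       # sigmoid + BCE simplifies nicely
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * h * (1.0 - h)  # chain rule through the hidden layer
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)

    # gradient-descent update
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

preds = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(preds.ravel())
```

A single linear layer can't learn XOR at all, which is exactly why the hidden layer (and the course's emphasis on depth) matters.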

Phase 2: Applied Techniques (Weeks 5–8)

| Week | Focus | Resources | Projects |
|------|-------|-----------|----------|
| 5 | Transfer Learning | Fine-tune a ResNet using PyTorch; Hugging Face “Transformers for Vision” | Classify a small medical image dataset (e.g. skin lesions) |
| 6 | Intro to NLP & Word Embeddings | Hugging Face NLP Course; Jay Alammar’s GPT visuals | Use BERT or DistilBERT to classify text |
| 7 | Recurrent Neural Networks (RNNs), LSTMs | TensorFlow tutorial: text generation; try GPT-2 in Colab | Generate medical-style notes |
| 8 | Transformers & Attention | Read “Attention is All You Need” (lite version); play with a transformer model (e.g. via Hugging Face) | Modify prompt-tuning for EMR-style text |
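The week-8 material largely boils down to one operation, scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V from the “Attention is All You Need” paper. Here's a small NumPy sketch; the toy shapes and random data are illustrative assumptions:

```python
import numpy as np

def softmax(x, axis=-1):
    # subtract the row max for numerical stability
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # similarity of each query to each key
    weights = softmax(scores, axis=-1)  # each query's mixing weights sum to 1
    return weights @ V, weights

# toy example: 3 query positions attending over 4 key/value positions
rng = np.random.default_rng(1)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each output row is a weighted average of the value vectors, with weights set by query/key similarity; that data-dependent mixing is the whole trick the paper builds on.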

Phase 3: Domain Application (Weeks 9–12)

| Week | Focus | Resources | Project Ideas |
|------|-------|-----------|---------------|
| 9 | Federated Learning | PySyft (OpenMined) tutorial; read: “Federated Learning in Healthcare” review | Simulate FL with 2 “hospitals” sharing EMR data |
| 10 | Anomaly Detection w/ DL | Blog: autoencoders for anomaly detection; paper: deep learning for IDS survey | Train an AE to flag unusual device traffic |
| 11 | Explainable AI (XAI) | SHAP, LIME tutorials; read: “Explainable DL in Healthcare” | Visualize which features influence model predictions |
| 12 | Mini Thesis Draft + Prep | Draft a short proposal | Write intro, lit review, methodology; submit a paper to arXiv or blog the findings |
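The week-10 pattern (train an autoencoder on normal traffic, flag anything it reconstructs badly) can be previewed without a deep learning framework: a linear autoencoder is equivalent to PCA, so this NumPy stand-in shows the reconstruction-error idea. The synthetic “device traffic” data and the 99th-percentile threshold are my own illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

# "normal" traffic: 200 points lying on a 2-D subspace of 5-D feature space
normal = rng.normal(0, 1, (200, 2)) @ rng.normal(0, 1, (2, 5))
# "anomalous" traffic: 5 points with no reason to lie on that subspace
anomaly = rng.normal(0, 5, (5, 5))

# fit the "autoencoder": top-2 principal directions of the normal data
mean = normal.mean(axis=0)
U, S, Vt = np.linalg.svd(normal - mean, full_matrices=False)
W = Vt[:2]  # rows span the learned 2-D code space

def recon_error(X):
    Z = (X - mean) @ W.T   # encode: project into the 2-D code
    Xhat = Z @ W + mean    # decode: map the code back to 5-D
    return np.linalg.norm(X - Xhat, axis=1)

# flag anything reconstructed worse than 99% of the normal data
threshold = np.percentile(recon_error(normal), 99)
flags = recon_error(anomaly) > threshold
```

The nonlinear autoencoder in the tutorials replaces the projection with learned encoder/decoder networks, but the detection logic (high reconstruction error = anomaly) is the same.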

Tools You’ll Use

  • Google Colab / Jupyter
  • PyTorch or TensorFlow
  • Hugging Face Transformers
  • Scikit-learn / Pandas / NumPy
  • Kaggle for datasets and practice

Optional Reading List

  • Deep Learning by Goodfellow et al.
  • Grokking Deep Learning by Andrew Trask
  • Key Papers:
      • “Attention is All You Need”
      • “Federated Learning in Healthcare”
      • “XAI in Medical AI”