T81-558: Applications of Deep Neural Networks

Washington University in St. Louis

Instructor: Jeff Heaton

The content of this course changes as technology evolves; to keep up to date with the changes, follow me on GitHub.

  • Section 1. Spring 2022, Monday, 2:30 PM, Brauer Hall / 012
  • Section 2. Spring 2022, Online
  • Section 3. Spring 2022, Online

Course Description

Deep learning is a group of exciting new technologies for neural networks. Through a combination of advanced training techniques and neural network architectural components, it is now possible to create neural networks that can handle tabular data, images, text, and audio as both input and output. Deep learning allows a neural network to learn hierarchies of information in a way that resembles the function of the human brain. This course introduces students to classic neural network structures, Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRU), Generative Adversarial Networks (GAN), and reinforcement learning. Applications of these architectures to computer vision, time series, security, natural language processing (NLP), and data generation will be covered. High-Performance Computing (HPC) aspects will demonstrate how deep learning can be leveraged both on graphics processing units (GPUs) and on grids. The focus is primarily on applying deep learning to problems, with some introduction to the mathematical foundations. Students will use the Python programming language to implement deep learning using Google TensorFlow and Keras. It is not necessary to know Python prior to this course; however, familiarity with at least one programming language is assumed. This course will be delivered in a hybrid format that includes both classroom and online instruction.
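
To give a sense of the style of code the course works toward, here is a minimal, illustrative sketch of a Keras workflow. The synthetic data, layer sizes, and training settings are placeholders and are not taken from the course materials.

import numpy as np
import tensorflow as tf

# Synthetic tabular data, purely for illustration.
x = np.random.rand(1000, 10).astype("float32")
y = (x.sum(axis=1) > 5.0).astype("float32")

# A small feedforward (dense) network for binary classification.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(25, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(10, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=5, verbose=0)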

Textbook

I am in the process of creating a textbook for this course. You can find a copy here. If you would like to cite material from this course/book, please use the following BibTeX citation:

@misc{heaton2020applications,
    title={Applications of Deep Neural Networks},
    author={Jeff Heaton},
    year={2020},
    eprint={2009.05673},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

Objectives

  1. Explain how neural networks (deep and otherwise) compare to other machine learning models.
  2. Determine when a deep neural network would be a good choice for a particular problem.
  3. Demonstrate your understanding of the material through a final project uploaded to GitHub.

Syllabus

This syllabus presents the expected class schedule, due dates, and reading assignments. Download the current syllabus.

Module 1 (meet on 01/24/2022): Python Preliminaries
  • Part 1.1: Course Overview
  • Part 1.2: Introduction to Python
  • Part 1.3: Python Lists, Dictionaries, Sets & JSON
  • Part 1.4: File Handling
  • Part 1.5: Functions, Lambdas, and Map/Reduce
  • We will meet on campus this week! (first meeting)
Module 2 (week of 01/31/2022): Python for Machine Learning
  • Part 2.1: Introduction to Pandas for Deep Learning
  • Part 2.2: Encoding Categorical Values in Pandas
  • Part 2.3: Grouping, Sorting, and Shuffling
  • Part 2.4: Using Apply and Map in Pandas
  • Part 2.5: Feature Engineering in Pandas
  • Module 1 Program due: 02/01/2022
  • Icebreaker due: 02/01/2022
Module 3 (week of 02/07/2022): TensorFlow and Keras for Neural Networks
  • Part 3.1: Deep Learning and Neural Network Introduction
  • Part 3.2: Introduction to TensorFlow & Keras
  • Part 3.3: Saving and Loading a Keras Neural Network
  • Part 3.4: Early Stopping in Keras to Prevent Overfitting (see the sketch after this module's list)
  • Part 3.5: Extracting Keras Weights and Manual Neural Network Calculation
  • Module 2 Program due: 02/08/2022
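
As referenced in Part 3.4 above, early stopping in Keras typically attaches an EarlyStopping callback that watches the validation loss. The sketch below is illustrative only; the data and hyperparameters are made up rather than taken from the course.

import numpy as np
import tensorflow as tf

# Made-up regression data, for illustration only.
x = np.random.rand(500, 8).astype("float32")
y = x.mean(axis=1)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Stop training once validation loss stops improving; keep the best weights seen.
monitor = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=5, restore_best_weights=True)
model.fit(x, y, validation_split=0.2, epochs=200, callbacks=[monitor], verbose=0)
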
Module 4 (week of 02/14/2022): Training for Tabular Data
  • Part 4.1: Encoding a Feature Vector for Keras Deep Learning
  • Part 4.2: Keras Multiclass Classification for Deep Neural Networks with ROC and AUC
  • Part 4.3: Keras Regression for Deep Neural Networks with RMSE
  • Part 4.4: Backpropagation, Nesterov Momentum, and ADAM Training
  • Part 4.5: Neural Network RMSE and Log Loss Error Calculation from Scratch
  • Module 3 Program due: 02/15/2022
Module 5 (meet on 02/21/2022): Regularization and Dropout
  • Part 5.1: Introduction to Regularization: Ridge and Lasso
  • Part 5.2: Using K-Fold Cross Validation with Keras
  • Part 5.3: Using L1 and L2 Regularization with Keras to Decrease Overfitting
  • Part 5.4: Dropout in Keras to Decrease Overfitting (see the sketch after this module's list)
  • Part 5.5: Bootstrapping and Benchmarking Hyperparameters
  • Module 4 Program due: 02/22/2022
  • We will meet on campus this week! (second meeting)
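
As referenced in Part 5.4 above, L1/L2 weight regularization and dropout are applied in Keras through layer arguments and a Dropout layer. The sketch below is illustrative only; the layer sizes, regularization strength, and dropout rate are arbitrary choices, not the course's.

import tensorflow as tf

# Illustrative only: a dense layer with L2 weight regularization, followed by dropout.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(
        50, activation="relu",
        kernel_regularizer=tf.keras.regularizers.l2(1e-4)),
    tf.keras.layers.Dropout(0.5),  # randomly zeroes 50% of activations during training
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
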
Module 6 (week of 02/28/2022): CNN for Vision
  • Part 6.1: Image Processing in Python
  • Part 6.2: Using Convolutional Networks with Keras
  • Part 6.3: Using Pretrained Neural Networks
  • Part 6.4: Looking at Keras Generators and Image Augmentation
  • Part 6.5: Recognizing Multiple Images with YOLOv5
  • Module 5 Program due: 03/01/2022
Module 7 (week of 03/07/2022): Generative Adversarial Networks (GANs)
  • Part 7.1: Introduction to GANs for Image and Data Generation
  • Part 7.2: Train StyleGAN3 with your Own Images
  • Part 7.3: Exploring the StyleGAN Latent Vector
  • Part 7.4: GANs to Enhance Old Photographs with DeOldify
  • Part 7.5: GANs for Tabular Synthetic Data Generation
  • Module 6 Assignment due: 03/08/2022
Module 8 (meet on 03/21/2022): Kaggle
  • Part 8.1: Introduction to Kaggle
  • Part 8.2: Building Ensembles with Scikit-Learn and Keras
  • Part 8.3: How Should you Architect Your Keras Neural Network: Hyperparameters
  • Part 8.4: Bayesian Hyperparameter Optimization for Keras
  • Part 8.5: Current Semester's Kaggle
  • Module 7 Assignment due: 03/22/2022
  • We will meet on campus this week! (third meeting)
Module 9 (week of 03/28/2022): Transfer Learning
  • Part 9.1: Introduction to Keras Transfer Learning
  • Part 9.2: Keras Transfer Learning for Computer Vision
  • Part 9.3: Transfer Learning for NLP with Keras
  • Part 9.4: Transfer Learning for Facial Feature Recognition
  • Part 9.5: Transfer Learning for Style Transfer
  • Module 8 Assignment due: 03/29/2022
Module 10 (week of 04/04/2022): Time Series in Keras
  • Part 10.1: Time Series Data Encoding for Deep Learning, TensorFlow and Keras
  • Part 10.2: Programming LSTM with Keras and TensorFlow
  • Part 10.3: Text Generation with Keras and TensorFlow
  • Part 10.4: Image Captioning with Keras and TensorFlow
  • Part 10.5: Temporal CNN in Keras and TensorFlow
  • Module 9 Assignment due: 04/05/2022
Module 11 (week of 04/11/2022): Natural Language Processing
  • Part 11.1: Getting Started with spaCy in Python
  • Part 11.2: Word2Vec and Text Classification
  • Part 11.3: Natural Language Processing with spaCy and Keras
  • Part 11.4: What are Embedding Layers in Keras
  • Part 11.5: Learning English from Scratch with Keras and TensorFlow
  • Module 10 Assignment due: 04/12/2022
Module 12 (week of 04/18/2022): Reinforcement Learning
  • Kaggle Assignment due: 04/18/2022 (approximately 4-6 PM, due to Kaggle's GMT time zone)
  • Part 12.1: Introduction to the OpenAI Gym
  • Part 12.2: Introduction to Q-Learning for Keras
  • Part 12.3: Keras Q-Learning in the OpenAI Gym
  • Part 12.4: Atari Games with Keras Neural Networks
  • Part 12.5: Application of Reinforcement Learning
Module 13 (week of 04/25/2022): Deployment and Monitoring
  • Part 13.1: Flask and Deep Learning Web Services
  • Part 13.2: Interrupting and Continuing Training
  • Part 13.3: Using a Keras Deep Neural Network with a Web Application
  • Part 13.4: When to Retrain Your Neural Network
  • Part 13.5: Tensor Processing Units (TPUs)
Module 14 (meet on 05/02/2022): Other Neural Network Techniques
  • Part 14.1: What is AutoML
  • Part 14.2: Using Denoising AutoEncoders in Keras
  • Part 14.3: Training an Intrusion Detection System with KDD99
  • Part 14.4: Anomaly Detection in Keras
  • Part 14.5: New Technology in Deep Learning
  • Final Project due: 05/06/2022
  • We will meet on campus this week! (fourth meeting)

Datasets
