Syllabus
- Introductory Concepts: perceptron, multilayer perceptron, deep learning as a composition of functions, loss functions, activation functions, backpropagation, deep learning frameworks (e.g., TensorFlow, PyTorch, Keras) (see the MLP training sketch below).
- Optimization Algorithms and Regularization Techniques.
- Convolutional Neural Networks (CNNs): architecture, pretrained models, transfer learning, backpropagation (see the transfer-learning sketch below).
- Sequence Modeling: Recurrent Neural Networks (RNNs), Long Short-Term Memory (LSTM) networks, Gated Recurrent Units (GRUs), attention mechanisms, and their backpropagation; encoder-decoder models (see the LSTM sketch below).
- Unsupervised Learning: autoencoders, including denoising autoencoders and sparse autoencoders (see the denoising-autoencoder sketch below).
- Generative Models: Generative Adversarial Networks (GANs), Variational Autoencoders (VAEs), deep generative models combining GANs and VAEs (see the VAE reparameterization sketch below).
- Attention Mechanism: soft vs. hard attention, global vs. local attention, self-attention, Transformers (key, value, query), multi-head attention (see the self-attention sketch below).
- Data-Efficient and Resource-Efficient Learning: few-shot learning, zero-shot learning, model pruning, model compression, neural architecture search (NAS).
- Fusion of Deep Learning with Graphical Models and Reinforcement Learning: Restricted Boltzmann Machines (RBMs), deep belief networks, deep reinforcement learning.
- Graph Neural Networks: basics, spectral and spatial graph convolutional networks (GCNs) (see the GCN layer sketch below).
- Advanced Topics: latest trends (e.g., self-supervised learning, applications of Transformers beyond NLP); ethical considerations: fairness in AI (bias, transparency, implications of AI decisions on society).
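
As a minimal illustration of the introductory unit, the sketch below trains a small multilayer perceptron in PyTorch on synthetic data; the layer sizes, loss function, and learning rate are arbitrary choices for demonstration, not prescribed by the syllabus.

```python
import torch
import torch.nn as nn

# Two-layer perceptron: 4 inputs -> 8 hidden units (ReLU) -> 3 class logits.
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 3),
)
loss_fn = nn.CrossEntropyLoss()                        # loss function
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Synthetic batch: 32 samples, 4 features each, labels in {0, 1, 2}.
x = torch.randn(32, 4)
y = torch.randint(0, 3, (32,))

for step in range(100):
    logits = model(x)               # forward pass (a composition of functions)
    loss = loss_fn(logits, y)
    optimizer.zero_grad()
    loss.backward()                 # backpropagation via autograd
    optimizer.step()                # gradient-descent update
```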
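For the CNN unit, a common transfer-learning recipe is to freeze a pretrained backbone and retrain only a new classification head. The sketch below assumes torchvision (version 0.13 or later for the weights API) and a hypothetical 10-class target task.

```python
import torch.nn as nn
from torchvision import models

# Load an ImageNet-pretrained ResNet-18.
weights = models.ResNet18_Weights.DEFAULT
model = models.resnet18(weights=weights)

# Freeze the convolutional backbone so only the new head receives gradients.
for param in model.parameters():
    param.requires_grad = False

# Replace the final fully connected layer for the (hypothetical) 10-class task.
model.fc = nn.Linear(model.fc.in_features, 10)
```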
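For the sequence-modeling unit, the sketch below classifies a sequence from an LSTM's final hidden state; the input size, hidden size, and class count are illustrative placeholders.

```python
import torch
import torch.nn as nn

class SequenceClassifier(nn.Module):
    """Encodes a sequence with an LSTM and classifies its final hidden state."""

    def __init__(self, input_size=16, hidden_size=32, num_classes=5):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, x):                  # x: (batch, seq_len, input_size)
        outputs, (h_n, c_n) = self.lstm(x)
        return self.head(h_n[-1])          # final hidden state of last layer

model = SequenceClassifier()
logits = model(torch.randn(8, 20, 16))     # batch of 8 sequences of length 20
```

Swapping `nn.LSTM` for `nn.GRU` (which returns only `h_n`, with no cell state) gives the GRU variant covered in the same unit.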
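For the attention unit, the sketch below implements single-head scaled dot-product self-attention from scratch, making the key/query/value projections explicit; multi-head attention runs several such projections in parallel and concatenates the results. Dimensions are arbitrary demo values.

```python
import math
import torch

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention.

    x: (batch, seq_len, d_model); w_q, w_k, w_v: (d_model, d_k) projections.
    """
    q = x @ w_q                                    # queries
    k = x @ w_k                                    # keys
    v = x @ w_v                                    # values
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    weights = torch.softmax(scores, dim=-1)        # attention distribution
    return weights @ v                             # weighted sum of values

d_model, d_k = 16, 8
x = torch.randn(2, 10, d_model)
out = self_attention(
    x,
    torch.randn(d_model, d_k),
    torch.randn(d_model, d_k),
    torch.randn(d_model, d_k),
)                                                  # out: (2, 10, 8)
```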
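For the unsupervised-learning unit, the sketch below shows one training step of a denoising autoencoder: corrupt the input with Gaussian noise, then reconstruct the clean input. The 784-dimensional input (a flattened 28x28 image), noise level, and layer widths are illustrative assumptions.

```python
import torch
import torch.nn as nn

# A small denoising autoencoder on flattened 28x28 inputs.
encoder = nn.Sequential(nn.Linear(784, 64), nn.ReLU())
decoder = nn.Sequential(nn.Linear(64, 784), nn.Sigmoid())
optimizer = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)

x = torch.rand(32, 784)                    # stand-in for a batch of images
noisy = x + 0.3 * torch.randn_like(x)      # corruption step
recon = decoder(encoder(noisy))
loss = nn.functional.mse_loss(recon, x)    # reconstruct the *clean* input
optimizer.zero_grad()
loss.backward()
optimizer.step()
```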
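For the generative-models unit, the sketch below isolates the reparameterization trick at the heart of a VAE, together with the KL-divergence term of its loss; the reconstruction term and the encoder/decoder networks are omitted for brevity.

```python
import torch

def reparameterize(mu, log_var):
    """Sample z = mu + sigma * eps so sampling stays differentiable."""
    std = torch.exp(0.5 * log_var)
    eps = torch.randn_like(std)
    return mu + eps * std

def kl_divergence(mu, log_var):
    """KL between N(mu, sigma^2) and the standard normal prior."""
    return -0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp())

mu, log_var = torch.zeros(4, 2), torch.zeros(4, 2)
z = reparameterize(mu, log_var)            # latent samples, shape (4, 2)
```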
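For the graph neural network unit, the sketch below implements one graph convolution layer in the spirit of Kipf and Welling's GCN, using a dense adjacency matrix for clarity; the random graph and feature sizes are demo assumptions.

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One graph convolution: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""

    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, h, adj):
        a_hat = adj + torch.eye(adj.size(0))           # add self-loops
        deg = a_hat.sum(dim=1)
        d_inv_sqrt = torch.diag(deg.pow(-0.5))
        a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt       # symmetric normalization
        return torch.relu(a_norm @ self.linear(h))     # aggregate, then transform

h = torch.randn(5, 8)                     # 5 nodes, 8 features each
adj = (torch.rand(5, 5) > 0.5).float()    # random adjacency (demo only)
adj = ((adj + adj.t()) > 0).float()       # symmetrize the graph
out = GCNLayer(8, 4)(h, adj)              # out: (5, 4)
```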
Textbooks and References
- Chollet, F. (2017). Deep Learning with Python. Manning Publications.
- Nielsen, M. (2015). Neural Networks and Deep Learning. Determination Press.
- Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep Learning. MIT Press. ISBN: 978-0262035613.
