Overview
Explore the historical development of neural networks from McCulloch-Pitts neurons to backpropagation, covering key models like perceptrons, ADALINE, Hopfield networks, Boltzmann machines, and multilayer perceptrons.
Syllabus
- Introduction to Neural Networks
-- Overview of Neural Networks and Their Importance
-- Brief History and Milestones in Neural Network Development
- The Birth of Neural Networks
-- McCulloch-Pitts Neurons
-- Initial Models and Their Limitations
- Rise of Supervised Learning Models
-- Perceptrons
--- Single-layer Perceptrons
--- Perceptron Learning Algorithm
-- ADALINE (Adaptive Linear Neuron)
--- Delta Rule
--- Differences Between Perceptron and ADALINE (contrasted in a short sketch after the syllabus)
- Hopfield Networks
-- Recurrent Neural Networks and Hopfield's Contribution
-- Hopfield Network Dynamics and Applications (update rule sketched after the syllabus)
- Boltzmann Machines
-- Introduction to Stochastic Models
-- Energy-Based Models in Neural Networks
-- Restricted Boltzmann Machines (a contrastive-divergence sketch follows the syllabus)
- From Shallow to Deep Learning
-- Multilayer Perceptrons (MLP)
--- Architecture of MLPs
--- Activation Functions
-- Introduction to Backpropagation
--- Training Multilayer Perceptrons (a backpropagation sketch follows the syllabus)
--- Challenges and Solutions in Training
- Advanced Topics in Neural Networks (Optional)
-- Deep Learning and Modern Innovations
-- Other Notable Models and Variations
- Conclusion and Future Directions
-- Recap of Major Developments
-- Emerging Trends in Neural Networks
- Course Wrap-up
-- Summary and Key Takeaways
-- Additional Resources and Further Reading
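Illustrative sketches
The short NumPy sketches below are not course materials; the toy datasets, names, and hyperparameters are assumptions chosen for brevity. First, a minimal contrast between the perceptron learning algorithm and ADALINE's delta rule: the perceptron updates its weights from the thresholded output, while ADALINE updates from the raw linear output.

```python
import numpy as np

# Toy linearly separable task: logical AND, with a constant bias input.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)

def train_perceptron(X, y, lr=0.1, epochs=20):
    """Perceptron rule: the error is computed after the hard threshold."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_i, t in zip(X, y):
            pred = 1.0 if x_i @ w > 0 else 0.0   # thresholded output
            w += lr * (t - pred) * x_i           # error is 0 or +/-1
    return w

def train_adaline(X, y, lr=0.1, epochs=50):
    """Delta rule (ADALINE): the error is computed on the raw linear output."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_i, t in zip(X, y):
            out = x_i @ w                        # no threshold here
            w += lr * (t - out) * x_i            # gradient step on squared error
    return w

print("perceptron weights:", train_perceptron(X, y))
print("ADALINE weights:   ", train_adaline(X, y))
```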
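Next, a sketch of Hopfield network dynamics: Hebbian storage of bipolar patterns, asynchronous threshold updates, and an energy function that does not increase as the state evolves. The stored patterns are arbitrary examples.

```python
import numpy as np

def store(patterns):
    """Hebbian weights for bipolar (+1/-1) patterns; no self-connections."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, s):
    """Hopfield energy E = -1/2 * s^T W s."""
    return -0.5 * s @ W @ s

def recall(W, s, steps=100, seed=0):
    """Asynchronous dynamics: update one randomly chosen unit at a time."""
    rng = np.random.default_rng(seed)
    s = s.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s

stored = np.array([[1, -1, 1, -1, 1, -1],
                   [1, 1, 1, -1, -1, -1]], dtype=float)
W = store(stored)
noisy = np.array([1, -1, -1, -1, 1, -1], dtype=float)  # pattern 0 with one bit flipped
print("energy before:", energy(W, noisy))
settled = recall(W, noisy)
print("energy after: ", energy(W, settled), "state:", settled)
```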
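A restricted Boltzmann machine can be sketched with one step of contrastive divergence (CD-1); the binary units, toy motifs, and learning rate below are again illustrative assumptions rather than course content.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Toy binary data: two repeating 6-bit motifs.
data = np.array([[1, 1, 1, 0, 0, 0], [0, 0, 0, 1, 1, 1]] * 10, dtype=float)

n_visible, n_hidden = 6, 3
W = rng.normal(0.0, 0.1, (n_visible, n_hidden))
b_v, b_h = np.zeros(n_visible), np.zeros(n_hidden)

lr = 0.1
for _ in range(500):
    v0 = data
    p_h0 = sigmoid(v0 @ W + b_h)                       # positive phase
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    p_v1 = sigmoid(h0 @ W.T + b_v)                     # one reconstruction step
    v1 = (rng.random(p_v1.shape) < p_v1).astype(float)
    p_h1 = sigmoid(v1 @ W + b_h)
    # CD-1 updates: data statistics minus reconstruction statistics
    W += lr * (v0.T @ p_h0 - v1.T @ p_h1) / len(data)
    b_v += lr * (v0 - v1).mean(axis=0)
    b_h += lr * (p_h0 - p_h1).mean(axis=0)

recon = sigmoid(sigmoid(data[:1] @ W + b_h) @ W.T + b_v)
print("reconstruction of [1,1,1,0,0,0]:", recon.round(2))
```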
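Finally, a compact multilayer perceptron trained with backpropagation on XOR, the classic task a single-layer perceptron cannot solve. The layer sizes, sigmoid activation, learning rate, and epoch count are assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)        # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 sigmoid units, one sigmoid output unit.
W1, b1 = rng.normal(0.0, 1.0, (2, 8)), np.zeros(8)
W2, b2 = rng.normal(0.0, 1.0, (8, 1)), np.zeros(1)

lr = 1.0
for _ in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)           # hidden activations, shape (4, 8)
    out = sigmoid(h @ W2 + b2)         # network output, shape (4, 1)
    # Backward pass: propagate squared-error gradients layer by layer
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

print("XOR predictions:", out.round(3).ravel())
```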