
Statistical Physics and Neural Computation

About: Physics, Machine and Intelligence


Statistical Mechanics of Neural Networks online course

We have published a book on the basic tools of statistical mechanics and their applications to understanding the inner workings of neural networks. The book contains 16 chapters, plus extra chapters giving a brief history and perspectives on promising frontiers. For details of the contents, please visit the Amazon page for the book. The book can also be bought from the Chinese website or 京东商城 (JD.com). If you want to learn more about the book, or have any comments or corrections, please email me: huanghp7@mail.sysu.edu.cn

The online course about the book runs from September 2022 to June 2023; it is better to have the book at hand when the course starts. We have 62 attendees.

Schedule:

  • SMNN05: 982 909 046, Oct 29, 14:00~

Target applicants: undergraduate/graduate students and post-docs.

Prerequisite background: advanced mathematics, probability, linear algebra, Python/C programming, and basics of statistical mechanics (for example, this book). Knowledge of neural networks or computational neuroscience is a plus.

Most important point: you show passion for the theory of neural networks and, more generally, for the black box of the brain.

Tentative course schedule: every Saturday 14:00-15:30 (except holidays), starting from the third week of September 2022.

Lecture format: details of the book + homework + paper reading/presentations.

///////////////Back Cover of the Book////////////////

This book covers the basic knowledge of statistical mechanics needed to understand the inner workings of neural networks. Important concepts and techniques, such as the cavity method, the mean-field approximation, the replica trick, the Nishimori condition, the variational method, dynamical mean-field theory, unsupervised learning, associative memory models, perceptron models, the chaos theory of recurrent neural networks, and eigen-spectra of neural networks, are introduced in detail, offering a pedagogical guideline for pedestrians who are interested in the theory of neural networks. The book focuses on quantitative frameworks of neural network models in which the underlying mechanisms can be precisely isolated through physics of mathematical beauty and theoretical predictions.
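
The kind of associative memory model analyzed in the book (see Chapter 8 below) can be played with in a few lines of code. The Python sketch below is written for this page and is not code taken from the book: it stores a handful of random binary patterns in a Hopfield network with the Hebbian rule and retrieves one of them from a noisy cue by zero-temperature asynchronous dynamics; the network size, pattern number, and noise level are arbitrary illustrative choices.

# Minimal Hopfield associative-memory sketch (illustration only, not code from the book).
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10                                  # N spins, P stored patterns
patterns = rng.choice([-1, 1], size=(P, N))     # xi^mu_i in {-1, +1}

# Hebbian couplings J_ij = (1/N) * sum_mu xi^mu_i xi^mu_j, no self-coupling
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

# Start from a corrupted version of pattern 0 (flip 20% of the spins)
s = patterns[0].copy()
flip = rng.random(N) < 0.2
s[flip] *= -1

# Asynchronous zero-temperature updates: s_i <- sign(sum_j J_ij s_j)
for sweep in range(10):
    for i in rng.permutation(N):
        h = J[i] @ s
        s[i] = 1 if h >= 0 else -1

overlap = (s @ patterns[0]) / N                 # m = (1/N) sum_i s_i xi^0_i
print(f"overlap with stored pattern: {overlap:.3f}")   # close to 1 means successful retrieval

Because the storage load P/N = 0.05 here is well below the Hopfield capacity, the final overlap should be close to 1; the statistical mechanics of why and when this retrieval works is what the book develops.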

///////////Contents////////////////

Chapter 1: Introduction

Chapter 2: Spin Glass Models and Cavity Method

Chapter 3: Variational Mean-Field Theory and Belief Propagation

Chapter 4: Monte-Carlo Simulation Methods

Chapter 5: High-Temperature Expansion Techniques

Chapter 6: Nishimori Model

Chapter 7: Random Energy Model

Chapter 8: Statistical Mechanics of Hopfield Model

Chapter 9: Replica Symmetry and Symmetry Breaking

Chapter 10: Statistical Mechanics of Restricted Boltzmann Machine

Chapter 11: Simplest Model of Unsupervised Learning with Binary Synapses

Chapter 12: Inherent-Symmetry Breaking in Unsupervised Learning

Chapter 13: Mean-Field Theory of Ising Perceptron

Chapter 14: Mean-Field Model of Multi-Layered Perceptron

Chapter 15: Mean-Field Theory of Dimension Reduction in Neural Networks

Chapter 16: Chaos Theory of Random Recurrent Networks

Chapter 17: Statistical Mechanics of Random Matrices

Chapter 18: Perspectives

//////////////Corrections (of typos) in the book///////////////////

1. 

Created: Aug 21, 2020 | 13:42