Batch normalisation (BatchNorm)

by Yee Wei Law - Saturday, 24 June 2023, 3:32 PM
Watch a high-level explanation of BatchNorm:

Watch a more detailed explanation of BatchNorm by Prof Ng:

Watch the coverage of BatchNorm in the Stanford 2016 course CS231n, Lecture 5 Part 2:
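The training-time transform introduced by Ioffe and Szegedy [IS15] normalises each feature over the mini-batch to approximately zero mean and unit variance, then applies a learnable scale and shift. A minimal NumPy sketch (function and variable names are illustrative, not from any particular library):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Training-time batch normalisation for a mini-batch x of shape
    (batch_size, num_features), following the transform in [IS15]."""
    mu = x.mean(axis=0)                    # per-feature mini-batch mean
    var = x.var(axis=0)                    # per-feature mini-batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalise; eps avoids division by zero
    return gamma * x_hat + beta            # learnable scale (gamma) and shift (beta)

# Example: a batch far from zero mean / unit variance gets re-centred and re-scaled.
rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(64, 4))
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
```

At inference time, implementations replace the mini-batch statistics with running averages accumulated during training, so that a single sample can be normalised deterministically.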

References

[IS15] S. Ioffe and C. Szegedy, Batch normalization: Accelerating deep network training by reducing internal covariate shift, in Proceedings of the 32nd International Conference on Machine Learning (F. Bach and D. Blei, eds.), Proceedings of Machine Learning Research 37, PMLR, Lille, France, 07–09 Jul 2015, pp. 448–456.
[LWS+17] Y. Li, N. Wang, J. Shi, J. Liu, and X. Hou, Revisiting batch normalization for practical domain adaptation, in ICLR workshop, 2017. Available at https://openreview.net/pdf?id=Hk6dkJQFx.
[STIM18] S. Santurkar, D. Tsipras, A. Ilyas, and A. Madry, How does batch normalization help optimization?, in Advances in Neural Information Processing Systems (S. Bengio, H. Wallach, H. Larochelle, K. Grauman, N. Cesa-Bianchi, and R. Garnett, eds.), 31, Curran Associates, Inc., 2018. Available at https://proceedings.neurips.cc/paper_files/paper/2018/file/905056c1ac1dad141560467e0a99e1cf-Paper.pdf.
[Zha20] X.-D. Zhang, A Matrix Algebra Approach to Artificial Intelligence, Springer, 2020. Available at https://doi.org/10.1007/978-981-15-2770-8.