In this post we put together all the building blocks covered in previous posts to create a convolutional neural network in pure `numpy`, and test it on the MNIST handwritten-digit classification task.
This post shares some knowledge of 2D and 3D convolutions in a
convolutional neural network (CNN), along with 3 implementations written entirely in `numpy` and `scipy`.
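
As a quick reminder of the basic operation before diving in, here is a minimal sketch of a plain 2D convolution of a single-channel image with a small kernel using `scipy.signal.convolve2d`; the image and kernel values are made up purely for illustration.

```python
import numpy as np
from scipy.signal import convolve2d

# Toy single-channel "image" and a 3x3 kernel (values are arbitrary).
image = np.arange(25, dtype=float).reshape(5, 5)
kernel = np.array([[1., 0., -1.],
                   [2., 0., -2.],
                   [1., 0., -1.]])

# mode='same' keeps the output the same size as the input,
# zero-padding the borders, as is common in CNN layers.
out = convolve2d(image, kernel, mode='same', boundary='fill', fillvalue=0)
print(out.shape)  # (5, 5)
```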
The convolution functions in `scipy` do not handle missing data well, so we create a 2D convolution function that allows a controllable tolerance to missing values. It is implemented first in Fortran, and then in an FFT-based approach using `scipy`.
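
To give a flavour of the FFT route, below is a minimal sketch of one possible way to tolerate missing values: zero-fill the NaNs, convolve with `scipy.signal.fftconvolve`, and mask output points whose kernel footprint contains too many gaps. The function name and the `max_missing` parameter are illustrative assumptions, not the implementation developed later in the post.

```python
import numpy as np
from scipy.signal import fftconvolve

def conv2d_with_missing(data, kernel, max_missing=0.5):
    """FFT-based 2D convolution tolerating NaNs up to a fraction `max_missing`.

    Missing points are zero-filled before the convolution; output points
    whose kernel footprint contains more than `max_missing` missing values
    are themselves set to NaN. (Sketch only -- not the post's implementation.)
    """
    valid = ~np.isnan(data)
    filled = np.where(valid, data, 0.0)

    # Convolve the zero-filled data with the kernel via FFT.
    conv = fftconvolve(filled, kernel, mode='same')

    # Fraction of valid points under the kernel footprint at each location.
    coverage = fftconvolve(valid.astype(float), np.ones(kernel.shape),
                           mode='same') / kernel.size

    # Mask out locations with too much missing data.
    conv[coverage < (1.0 - max_missing)] = np.nan
    return conv
```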