Neural networks have been an active field of research for years, but relatively little is understood about how they work. Certain types of neural networks have a layer structure with decreasing width, which acts like coarse-graining, reminiscent of the renormalization group (RG). We examine the Restricted Boltzmann Machine (RBM) and discuss its possible relation to RG. The RBM is trained on the 1D and 2D Ising models, as well as on the MNIST dataset. In particular, for the 2D Ising model we find a flow towards the critical point Tc ≈ 2.27, opposite to the RG flow. Examining the behaviour of the RBM on the MNIST dataset shows that sparse datasets can admit multiple fixed points, which can be removed by artificially creating new samples. We conclude that this RBM flow exists due to the multiple relevant length scales at the critical point, and we briefly discuss why.
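As a rough illustration of the setup described above, the following is a minimal sketch (not the authors' code) of a binary Restricted Boltzmann Machine trained with one-step contrastive divergence (CD-1) on synthetic mostly-aligned spin configurations standing in for Ising samples; the lattice size, hidden-layer width, learning rate, and data generation are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Minimal binary RBM trained with CD-1 (illustrative sketch only)."""

    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0.0, 0.01, (n_visible, n_hidden))  # couplings
        self.b = np.zeros(n_visible)  # visible biases
        self.c = np.zeros(n_hidden)   # hidden biases
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.c)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b)

    def cd1_step(self, v0):
        # Positive phase: hidden activations driven by the data.
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: one step of Gibbs sampling back to the visibles.
        pv1 = self.visible_probs(h0)
        v1 = (rng.random(pv1.shape) < pv1).astype(float)
        ph1 = self.hidden_probs(v1)
        # CD-1 gradient estimates (data term minus model term).
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ ph0 - v1.T @ ph1) / n
        self.b += self.lr * (v0 - v1).mean(axis=0)
        self.c += self.lr * (ph0 - ph1).mean(axis=0)
        # Reconstruction error as a simple training diagnostic.
        return float(np.mean((v0 - pv1) ** 2))

# Toy stand-in for low-temperature Ising data: 8x8 lattices, spins mapped
# from {-1, +1} to {0, 1}, with ~90% of spins aligned "up".
data = (rng.random((200, 64)) < 0.9).astype(float)

rbm = RBM(n_visible=64, n_hidden=16)
errs = [rbm.cd1_step(data) for _ in range(100)]
```

A narrower hidden layer than the visible one (here 16 vs. 64 units) is what makes the coarse-graining analogy plausible: the hidden units must summarize the configuration at a reduced resolution, loosely mirroring an RG blocking step.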