The field of deep learning has seen dramatic developments over the last decade. An improved fundamental understanding of deep learning can enhance our ability to train deeper and more advanced networks. We explore random neural networks, which exhibit two phases, an ordered and a chaotic one, depending on the weight variances used to initialise them. Through numerical simulations, we confirm the existence of these two phases and the associated emergent depth scales. Furthermore, we characterise the edge of chaos that separates the two phases, and find significant differences between the activation distributions of layers at different points along it. This raises the question of whether some points on the edge of chaos offer better training performance than others.
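As a minimal illustration of the two phases described above (a sketch under standard mean-field assumptions, not the paper's own code), the snippet below pushes two nearby inputs through a deep fully-connected tanh network with weights drawn i.i.d. from N(0, σ_w²/N) and biases from N(0, σ_b²), and tracks their cosine similarity layer by layer. The function name `propagate_pair` and the specific widths, depths, and variance values are illustrative choices, not taken from the paper.

```python
import numpy as np

def propagate_pair(sigma_w, sigma_b=0.05, width=1000, depth=50, seed=0):
    """Propagate two nearby inputs through a random tanh network and
    record their cosine similarity after each layer."""
    rng = np.random.default_rng(seed)
    x1 = rng.standard_normal(width)
    x2 = x1 + 0.1 * rng.standard_normal(width)  # small input perturbation
    h1, h2 = x1, x2
    sims = []
    for _ in range(depth):
        # Weights ~ N(0, sigma_w^2 / width), biases ~ N(0, sigma_b^2)
        W = rng.standard_normal((width, width)) * sigma_w / np.sqrt(width)
        b = rng.standard_normal(width) * sigma_b
        h1 = np.tanh(W @ h1 + b)
        h2 = np.tanh(W @ h2 + b)
        sims.append(np.dot(h1, h2) / (np.linalg.norm(h1) * np.linalg.norm(h2)))
    return sims

# In the ordered phase (small sigma_w) the similarity converges towards 1:
# distinct inputs collapse together. In the chaotic phase (large sigma_w)
# it settles well below 1: nearby inputs decorrelate with depth.
for sigma_w in (0.5, 1.0, 2.0):
    print(f"sigma_w = {sigma_w}: final similarity = {propagate_pair(sigma_w)[-1]:.3f}")
```

The depth at which the similarity saturates gives a rough empirical handle on the depth scales mentioned above; sweeping σ_w through the transition region shows the qualitative change in behaviour at the edge of chaos.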