I am a PhD student at the University of Melbourne, supervised primarily by Susan Wei. My research focuses on developing methods for training probabilistic or Bayesian models, and on studying neural networks through the lens of Bayesian statistics. I am a statistician by training, having completed my undergraduate and MPhil studies at the University of Western Australia with Berwin Turlach and Kevin Murray.
Prior to my PhD, I worked as a statistician/data scientist (a role I still hold part-time) at the Department of Primary Industries and Regional Development (DPIRD) in Western Australia.
Singular learning theory (SLT) combines Bayesian statistics and algebraic geometry to study the properties of singular models, such as neural networks and mixture models; see here and here for an introduction. Unfortunately, my lack of background in algebraic geometry prevents me from contributing to its theoretical development. Instead, I focus on finding useful tools in SLT and incorporating them into the training workflow of modern deep learning. This, too, proves quite challenging, especially for modern networks operating in the interpolating regime (number of parameters > data size), because most of the key results in SLT hold only in the asymptotic regime (data size ≫ number of parameters).
I am always searching for tools that can reliably compute or approximate posteriors, particularly for multimodal or very high-dimensional distributions. Lately, I have been studying Langevin-based Markov chain Monte Carlo (MCMC) and variational Bayes methods.
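To make the Langevin-based MCMC idea concrete, here is a minimal sketch of the unadjusted Langevin algorithm (ULA) targeting a standard normal posterior. The function names, step size, and chain length are illustrative assumptions, not taken from any particular codebase.

```python
import numpy as np

def grad_log_p(theta):
    """Gradient of the log-density of N(0, 1) at theta (illustrative target)."""
    return -theta

def ula(grad_log_p, theta0, step=0.1, n_steps=50_000, seed=0):
    """Unadjusted Langevin algorithm:
    theta <- theta + (step / 2) * grad log p(theta) + sqrt(step) * noise.
    """
    rng = np.random.default_rng(seed)
    theta = theta0
    samples = np.empty(n_steps)
    for t in range(n_steps):
        noise = rng.standard_normal()
        theta = theta + 0.5 * step * grad_log_p(theta) + np.sqrt(step) * noise
        samples[t] = theta
    return samples

samples = ula(grad_log_p, theta0=0.0)
burned = samples[5_000:]  # discard burn-in
# The empirical mean and standard deviation should be close to 0 and 1.
print(burned.mean(), burned.std())
```

Note that ULA carries a discretization bias that depends on the step size; adding a Metropolis accept/reject correction (giving MALA) removes it.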
Please reach out at kenyon.ng at dpird.wa.gov.au (for DPIRD matters) or kenyon.ng at student.unimelb.edu.au.