While reparametrizations and symmetries cause a lot of confusion in the investigation of neural networks’ behaviour, they had not yet been studied from the perspective of differential geometry. The NeurIPS 2023 paper “The Geometry of Neural Nets’ Parameter Spaces Under Reparametrization”, presented by Agustinus Kristiadi, fills this gap and shows how proper Riemannian geometry helps avoid confusion under reparametrization in aspects such as gradient computation, flatness measures, and density fitting.
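To make the gradient issue concrete, here is a minimal sketch in JAX (the toy loss, the reparametrization g = exp, and the learning rate are illustrative choices, not taken from the paper): a plain gradient step depends on the chosen coordinates, while the Riemannian gradient, corrected by the pullback metric M(psi) = J^T J (assuming a Euclidean metric in the original theta-coordinates), transforms consistently.

```python
import jax
import jax.numpy as jnp

def loss(theta):
    # Toy loss in the original parametrization theta (illustrative).
    return jnp.sum(theta ** 4)

# Hypothetical reparametrization theta = g(psi) = exp(psi).
g = jnp.exp

theta0 = jnp.array([1.5, 0.5])
psi0 = jnp.log(theta0)  # the same point in psi-coordinates

lr = 0.1

# Plain gradient step in theta-coordinates ...
step_theta = theta0 - lr * jax.grad(loss)(theta0)

# ... and in psi-coordinates, then mapped back to theta:
loss_psi = lambda psi: loss(g(psi))
step_psi = psi0 - lr * jax.grad(loss_psi)(psi0)
print(step_theta)   # update computed in theta
print(g(step_psi))  # update computed in psi -- a different point!

# Riemannian fix: equip psi-space with the pullback metric
# M(psi) = J(psi)^T J(psi), where J is the Jacobian of g, and use
# the Riemannian gradient M(psi)^{-1} grad loss(psi).
J = jax.jacobian(g)(psi0)  # diagonal here, since g acts elementwise
M = J.T @ J
riem_grad = jnp.linalg.solve(M, jax.grad(loss_psi)(psi0))
step_psi_riem = psi0 - lr * riem_grad

# To first order in lr, this now agrees with step_theta.
print(g(step_psi_riem))
```

The correction works because the gradient in psi-coordinates is J^T times the gradient in theta-coordinates, so M^{-1} = (J^T J)^{-1} exactly cancels this Jacobian factor and the update direction becomes coordinate-independent.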

You can find the presentation here.

For additional information, see Agustinus’ blog.

Further related work on the topic: