Neural networks and artificial intelligence are part of our daily lives, yet how a neural network performs a specific computation remains poorly understood. In particular, how a set of weights allows a network to transform a given input into a specific output is an important open question. My research combines complex-valued neural networks and network theory to develop new mathematical approaches to the physics of neural networks. This line of research focuses on the spatiotemporal dynamics of neural networks, which have recently been found to play a key role in computation. It may also offer a new path toward modern neural networks that avoid the enormous cost of training these systems, as well as a unified framework for studying artificial and biological neural networks.
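As an illustration of the kind of model involved, the sketch below is a minimal, assumed example (Python/NumPy), not the published model: a linear recurrent network with complex-valued weights. Because the update is linear, the state at any time step admits a closed-form expression through the eigendecomposition of the weight matrix, which is what makes the network's input-output transformation mathematically tractable.

```python
import numpy as np

# Minimal sketch (illustrative only, not the published model): a linear
# recurrent network with complex-valued weights. The closed-form solution
# below follows directly from the eigendecomposition of W.

rng = np.random.default_rng(0)
n = 8                                                        # number of units (illustrative size)
W = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2 * n)
x0 = rng.standard_normal(n) + 1j * rng.standard_normal(n)    # initial state

def iterate(W, x0, T):
    """Run the recurrence x_{t+1} = W x_t for T steps."""
    x = x0
    for _ in range(T):
        x = W @ x
    return x

def closed_form(W, x0, T):
    """Same state in closed form: x_T = V diag(lambda)^T V^{-1} x0."""
    lam, V = np.linalg.eig(W)
    return V @ (lam**T * np.linalg.solve(V, x0))

T = 25
print(np.allclose(iterate(W, x0, T), closed_form(W, x0, T)))  # True
```

The point of the sketch is only that linear complex-valued dynamics can be written down exactly for any time step, rather than simulated step by step; the nonlinear, biologically motivated models studied in the papers below are considerably richer.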
Main results:
An exact mathematical description of computation in neural networks
R. C. Budzinski*, A. N. Busch* et al., Communications Physics 7 (1), 239, 2024. (link)
Traveling waves enable short-term predictions of naturalistic visual inputs
G. Benigno, R. C. Budzinski et al., Nature Communications 14 (1), 3409, 2023. (link)
Image segmentation with spatiotemporal dynamics
L. H. B. Liboni*, R. C. Budzinski*, A. N. Busch* et al., PNAS 122 (1), e2321319121, 2025. (link)