benjamin at rain dot ai
I am a principal research scientist at Rain.
I did a PhD in computer science at Mila (University of Montreal) under the supervision of Yoshua Bengio.
My current research interests lie at the interface of deep learning and physics. I am particularly interested in physics-based computation and learning.
Much of my current research revolves around equilibrium propagation (EP), a gradient-descent-based optimization framework grounded in physical principles. Unlike the more conventional framework based on automatic differentiation ("backpropagation"), EP performs inference and gradient computation using the same physical laws. This makes EP a potentially useful framework for designing energy-efficient ("neuromorphic") deep learning hardware, leveraging physics to perform neural network inference and learning more efficiently. For more information, Chapter 2 of my PhD thesis provides a (relatively recent) overview of EP. See also these notes on EP.
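To make the idea concrete, here is a minimal sketch of EP training in NumPy on a toy Hopfield-style network. Everything here (layer sizes, the tanh activation, the quadratic energy, all hyperparameters) is an illustrative choice of mine, not code from any particular paper or from the codebase mentioned below: the point is just the two-phase structure, where a free relaxation and a slightly "nudged" relaxation of the same energy yield a contrastive weight update that approximates gradient descent on the loss.

```python
import numpy as np

# Illustrative EP sketch: 2 inputs -> 4 hidden -> 1 output, with a
# Hopfield-style energy. All sizes and hyperparameters are assumptions.
rng = np.random.default_rng(0)
n_x, n_h, n_y = 2, 4, 1
W1 = rng.normal(0.0, 0.5, (n_x, n_h))   # input-to-hidden coupling
W2 = rng.normal(0.0, 0.5, (n_h, n_y))   # hidden-to-output coupling
rho = np.tanh                           # smooth activation

def relax(x, h, y, W1, W2, beta=0.0, target=None, steps=300, dt=0.1):
    """Gradient descent on the total energy F = E + beta * C, where
    E = 0.5*(||h||^2 + ||y||^2) - rho(x)@W1@rho(h) - rho(h)@W2@rho(y)
    and C = 0.5*||y - target||^2 is the nudging cost (active if beta > 0).
    The input x is clamped; h and y are the free units."""
    h, y = h.copy(), y.copy()
    for _ in range(steps):
        dh = h - (1 - rho(h) ** 2) * (rho(x) @ W1 + W2 @ rho(y))
        dy = y - (1 - rho(y) ** 2) * (rho(h) @ W2)
        if beta != 0.0:
            dy = dy + beta * (y - target)   # nudge output toward target
        h -= dt * dh
        y -= dt * dy
    return h, y

def train_step(x, target, W1, W2, beta=0.5, lr=0.1):
    # Free phase: relax the energy with beta = 0.
    h_f, y_f = relax(x, np.zeros(n_h), np.zeros(n_y), W1, W2)
    # Nudged phase: restart from the free fixed point with beta > 0.
    h_n, y_n = relax(x, h_f, y_f, W1, W2, beta=beta, target=target)
    # EP update: contrast dE/dW between the two equilibria
    # (here dE/dW1 = -outer(rho(x), rho(h)), and similarly for W2).
    rx = rho(x)
    W1 += (lr / beta) * (np.outer(rx, rho(h_n)) - np.outer(rx, rho(h_f)))
    W2 += (lr / beta) * (np.outer(rho(h_n), rho(y_n)) - np.outer(rho(h_f), rho(y_f)))
    return y_f   # free-phase prediction (before this step's update)

# Fit a single toy input/target pair; the free-phase error should shrink.
x, target = np.array([0.5, -1.0]), np.array([0.3])
errs = []
for _ in range(100):
    y_pred = train_step(x, target, W1, W2)
    errs.append(float(abs(y_pred[0] - target[0])))
print(errs[0], errs[-1])
```

Note that the same `relax` routine drives both phases: this is the feature described above, inference and gradient estimation running on one set of physical (here, simulated) dynamics, with the weight update depending only on locally available quantities at the two equilibria.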
Various neuromorphic platforms compatible with EP training have been proposed, including nonlinear resistor networks, Ising machines, coupled phase oscillators, and flow and elastic networks. To speed up research and assess the scalability of these approaches, one mission for the field is to develop efficient simulators, both for these neuromorphic platforms and for the EP training process. This is the ambition of this early-stage codebase. If you are interested, or have comments or ideas, feel free to reach out.
Ultimately, I am also interested in understanding the learning mechanisms of the brain. (Needless to say, more complex frameworks will be needed for that.)