Modes of Homogeneous Gradient Flows

Ido Cohen, Omri Azencot, Pavel Lifshits, and Guy Gilboa.
SIAM Journal on Imaging Sciences, 2021.
Abstract
Finding latent structures in data is drawing increasing attention in diverse fields such as image and signal processing, fluid dynamics, and machine learning. In this work, we examine the problem of finding the main modes of gradient flows. Gradient descent is a fundamental process in optimization, and its stochastic version is prominent in the training of neural networks. Here our aim is to establish a consistent theory for gradient flows ψ_t = P(ψ), where P is a nonlinear homogeneous operator. Our proposed framework stems from analytic solutions of homogeneous flows, previously formalized by Cohen and Gilboa, where the initial condition ψ(0) admits the nonlinear eigenvalue problem P(ψ(0)) = λψ(0). We first present an analytic solution for dynamic mode decomposition (DMD) in such cases. We expose an inherent flaw of DMD: it is unable to recover the essential dynamics of the flow, and it is best suited for homogeneous flows of degree one. We propose an adaptive time sampling scheme and show that its dynamics are analogous to those of a homogeneous flow of degree one with a fixed step size. Moreover, we adapt DMD to yield a real spectrum by using symmetric matrices. Our analytic solution of the proposed scheme recovers the dynamics exactly, with zero reconstruction error. We then show that, in the general case, the orthogonal modes {ϕ_i} are approximately nonlinear eigenfunctions, P(ϕ_i) ≈ λ_i ϕ_i. We formulate Orthogonal Nonlinear Spectral decomposition (OrthoNS), which recovers the essential latent structures of the gradient descent process. Definitions of spectrum and filtering are given, and a Parseval-type identity is shown.
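As a rough illustration of the flaw the abstract describes (this sketch is not the authors' code, and it does not reproduce the paper's adaptive sampling or OrthoNS methods), the following NumPy example applies standard exact DMD to snapshots of a toy homogeneous gradient flow ψ_t = P(ψ). The operator P(ψ) = -‖ψ‖^(α-1) ψ is a hypothetical stand-in, chosen because it is α-homogeneous and makes every initial condition a nonlinear eigenfunction (P(ψ(0)) = λψ(0) with λ = -‖ψ(0)‖^(α-1)); the dimension, step size, and values of α are arbitrary choices for illustration.

```python
import numpy as np

def flow_snapshots(psi0, alpha, dt, n_steps):
    """Forward-Euler snapshots of psi_t = P(psi), P(psi) = -||psi||^(alpha-1) psi."""
    snaps = [psi0.copy()]
    psi = psi0.copy()
    for _ in range(n_steps):
        psi = psi - dt * np.linalg.norm(psi) ** (alpha - 1.0) * psi
        snaps.append(psi.copy())
    return np.stack(snaps, axis=1)  # column k holds psi at time k*dt

def exact_dmd(X, rank):
    """Exact DMD (Tu et al.): best-fit linear map with X[:, 1:] ~ A X[:, :-1]."""
    X1, X2 = X[:, :-1], X[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank, :]
    Atil = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)  # reduced operator
    evals, W = np.linalg.eig(Atil)
    modes = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W          # exact DMD modes
    return evals, modes

rng = np.random.default_rng(0)
psi0 = rng.standard_normal(64)
for alpha in (1.0, 2.0):  # degree-one flow vs. a higher-degree flow
    X = flow_snapshots(psi0, alpha, dt=1e-2, n_steps=200)
    # Here psi(t) = a(t) * psi0, so the snapshot matrix has rank one.
    evals, modes = exact_dmd(X, rank=1)
    amps = np.linalg.lstsq(modes, X[:, 0].astype(complex), rcond=None)[0]
    powers = np.vander(evals, X.shape[1], increasing=True)   # evals[j] ** k
    X_dmd = (modes * amps) @ powers
    err = np.linalg.norm(X - X_dmd.real) / np.linalg.norm(X)
    print(f"alpha={alpha}: relative DMD reconstruction error = {err:.2e}")
```

Under these assumptions, the α = 1 flow decays geometrically per step, so a single DMD mode reconstructs it to machine precision, whereas for α = 2 the decay is not geometric and the fixed-step DMD reconstruction error is visibly larger. This is consistent with the abstract's claim that plain DMD suits homogeneous flows of degree one; the paper's remedy (adaptive time sampling) is not shown here.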
[PDF] [arXiv]