Different applications of the Jacobian matrix for Machine Learning models, Part 3

Monodeep Mukherjee
2 min readMar 29, 2024


Photo by Kelly Sikkema on Unsplash
1. Training Implicit Networks for Image Deblurring using Jacobian-Free Backpropagation (arXiv)

Authors: Linghai Liu, Shuaicheng Tong, Lisa Zhao

Abstract: Recent efforts in applying implicit networks to solve inverse problems in imaging have achieved competitive or even superior results when compared to feedforward networks. These implicit networks require only constant memory during backpropagation, regardless of the number of layers. However, they are not necessarily easy to train. Gradient calculations are computationally expensive because they require backpropagating through a fixed point. In particular, this process requires solving a large linear system whose size is determined by the number of features in the fixed-point iteration. This paper explores a recently proposed method, Jacobian-free Backpropagation (JFB), a backpropagation scheme that circumvents this calculation, in the context of image deblurring problems. Our results show that JFB is comparable to fine-tuned optimization schemes, state-of-the-art (SOTA) feedforward networks, and existing implicit networks, at a reduced computational cost.
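To make the abstract's point concrete, here is a minimal sketch of the idea on a toy one-dimensional fixed-point model. The contraction `f` and all names are illustrative assumptions, not the paper's actual network: exact implicit differentiation must invert the factor (1 − ∂f/∂z), which for a real network means solving a large linear system, while JFB drops that inverse and backpropagates through a single application of `f` at the fixed point.

```python
def f(z, theta):
    # Toy layer: a contraction in z (|df/dz| = 0.5 < 1),
    # so the fixed-point iteration converges.
    return 0.5 * z + theta

def fixed_point(theta, iters=100):
    # Forward pass of an implicit network: iterate f to (near) convergence.
    z = 0.0
    for _ in range(iters):
        z = f(z, theta)
    return z  # converges to theta / (1 - 0.5) = 2 * theta

def implicit_grad(theta):
    # Exact implicit differentiation:
    # dz*/dtheta = (1 - df/dz)^(-1) * df/dtheta.
    # In high dimensions, the inverse becomes a large linear solve.
    dfdz, dfdtheta = 0.5, 1.0
    return dfdtheta / (1.0 - dfdz)

def jfb_grad(theta):
    # JFB: skip the (1 - df/dz)^(-1) factor and backpropagate
    # through one application of f at the fixed point.
    dfdtheta = 1.0
    return dfdtheta

z_star = fixed_point(0.3)   # ~0.6
g_exact = implicit_grad(0.3)  # 2.0
g_jfb = jfb_grad(0.3)         # 1.0
```

In this scalar case the JFB gradient (1.0) differs from the exact gradient (2.0) only by the positive factor (1 − ∂f/∂z)⁻¹, so it still points in a descent direction while avoiding the linear solve entirely.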

2. Inverse limits of automorphisms of truncated polynomials and applications related to Jacobian conjecture (arXiv)

Authors: Hao Chang, Bin Shu, Yu-Feng Yao

Abstract: In this note, we investigate the Jacobian conjecture through the study of automorphisms of polynomial rings in characteristic p. Making use of the technique of inverse limits, we show that, under the Jacobian condition, if a given homomorphism φ of the polynomial ring k[x1,…,xn] preserves the maximal ideals, then φ is an automorphism.
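For reference, the Jacobian condition mentioned in the abstract is, in its classical formulation (a standard statement, not necessarily the paper's exact hypothesis): for an endomorphism φ of k[x1,…,xn] given by polynomials f1,…,fn,

\[
\det\left(\frac{\partial f_i}{\partial x_j}\right)_{1 \le i, j \le n} \in k^{\times},
\]

that is, the Jacobian determinant of φ is a nonzero constant. The Jacobian conjecture (in characteristic 0) asserts that this condition alone forces φ to be an automorphism.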


Monodeep Mukherjee

Universe Enthusiast. Writes about Computer Science, AI, Physics, Neuroscience and Technology, Front-End and Back-End Development