pNyodFNPhv@OpenReview

#1 Preconditioned Riemannian Gradient Descent Algorithm for Low-Multilinear-Rank Tensor Completion

Authors: Yuanwei Zhang, Fengmiao Bian, Xiaoqun Zhang, Jian-Feng Cai

Tensors play a crucial role in numerous scientific and engineering fields. This paper addresses the low-multilinear-rank tensor completion problem, a fundamental task in tensor-related applications. By exploiting the manifold structure inherent to the fixed-multilinear-rank tensor set, we introduce a simple yet highly effective preconditioned Riemannian metric and propose the Preconditioned Riemannian Gradient Descent (PRGD) algorithm. Compared to the standard Riemannian Gradient Descent (RGD), PRGD achieves faster convergence while maintaining the same order of per-iteration computational complexity. Theoretically, we establish a recovery guarantee for PRGD under near-optimal sampling complexity. Numerical experiments highlight the efficiency of PRGD, which outperforms state-of-the-art methods on both synthetic data and real-world video inpainting tasks.
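The abstract does not spell out the PRGD update or its preconditioned metric. As a rough illustration of the problem setting only, the following is a minimal numpy sketch of the simpler baseline idea: plain gradient descent on the observed entries, followed by retraction onto the fixed-multilinear-rank set via truncated HOSVD. All function names, the step size, and the retraction choice are assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def unfold(T, mode):
    """Mode-m matricization of a tensor (assumed helper)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def trunc_hosvd(T, ranks):
    """Retract T onto (approximately) the set of tensors with the given
    multilinear rank, via truncated higher-order SVD (a common choice;
    the paper's retraction may differ)."""
    U = []
    for m, r in enumerate(ranks):
        u, _, _ = np.linalg.svd(unfold(T, m), full_matrices=False)
        U.append(u[:, :r])
    # core = T x_1 U1^T x_2 U2^T x_3 U3^T
    core = T
    for m, u in enumerate(U):
        core = np.moveaxis(np.tensordot(u.T, np.moveaxis(core, m, 0), axes=1), 0, m)
    # expand back: core x_1 U1 x_2 U2 x_3 U3
    X = core
    for m, u in enumerate(U):
        X = np.moveaxis(np.tensordot(u, np.moveaxis(X, m, 0), axes=1), 0, m)
    return X

def tensor_completion(M_obs, mask, ranks, steps=500, lr=1.0):
    """Gradient step on 0.5 * ||P_Omega(X - M)||^2, then retract.
    This is the unpreconditioned baseline, not PRGD itself."""
    X = np.zeros_like(M_obs)
    for _ in range(steps):
        G = mask * (M_obs - X)          # negative Euclidean gradient on Omega
        X = trunc_hosvd(X + lr * G, ranks)
    return X
```

The paper's contribution, per the abstract, is replacing the Euclidean metric implicit in the step above with a preconditioned Riemannian metric that accelerates convergence at the same per-iteration cost.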

Subject: ICML.2025 - Poster