TY - JOUR
T1 - An Iterative Reweighted Method for Tucker Decomposition of Incomplete Tensors
AU - Yang, Linxiao
AU - Fang, Jun
AU - Li, Hongbin
AU - Zeng, Bing
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2016/9/15
Y1 - 2016/9/15
N2 - We consider the problem of low-rank decomposition of incomplete tensors. Since many real-world data lie on an intrinsically low-dimensional subspace, tensor low-rank decomposition with missing entries has applications in many data analysis problems such as recommender systems and image inpainting. In this paper, we focus on Tucker decomposition, which represents an $N$th-order tensor in terms of $N$ factor matrices and a core tensor via multilinear operations. To exploit the underlying multilinear low-rank structure in high-dimensional datasets, we propose a group-based log-sum penalty functional to place structural sparsity over the core tensor, which leads to a compact representation with the smallest core tensor. The proposed method is developed by iteratively minimizing a surrogate function that majorizes the original objective function. This iterative optimization leads to an iteratively reweighted least squares algorithm. In addition, to reduce the computational complexity, an over-relaxed monotone fast iterative shrinkage-thresholding technique is adapted and embedded in the iterative reweighted process. The proposed method is able to determine the model complexity (i.e., the multilinear rank) in an automatic way. Simulation results show that the proposed algorithm offers competitive performance compared with existing algorithms.
AB - We consider the problem of low-rank decomposition of incomplete tensors. Since many real-world data lie on an intrinsically low-dimensional subspace, tensor low-rank decomposition with missing entries has applications in many data analysis problems such as recommender systems and image inpainting. In this paper, we focus on Tucker decomposition, which represents an $N$th-order tensor in terms of $N$ factor matrices and a core tensor via multilinear operations. To exploit the underlying multilinear low-rank structure in high-dimensional datasets, we propose a group-based log-sum penalty functional to place structural sparsity over the core tensor, which leads to a compact representation with the smallest core tensor. The proposed method is developed by iteratively minimizing a surrogate function that majorizes the original objective function. This iterative optimization leads to an iteratively reweighted least squares algorithm. In addition, to reduce the computational complexity, an over-relaxed monotone fast iterative shrinkage-thresholding technique is adapted and embedded in the iterative reweighted process. The proposed method is able to determine the model complexity (i.e., the multilinear rank) in an automatic way. Simulation results show that the proposed algorithm offers competitive performance compared with existing algorithms.
KW - Iterative reweighted method
KW - Tucker decomposition
KW - low-rank tensor decomposition
KW - tensor completion
UR - http://www.scopus.com/inward/record.url?scp=84986321437&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84986321437&partnerID=8YFLogxK
U2 - 10.1109/TSP.2016.2572047
DO - 10.1109/TSP.2016.2572047
M3 - Article
AN - SCOPUS:84986321437
SN - 1053-587X
VL - 64
SP - 4817
EP - 4829
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
IS - 18
M1 - 7478159
ER -