An Entropy Weighted Nonnegative Matrix Factorization Algorithm for Feature Representation

Jiao Wei, Can Tong, Bingxue Wu, Qiang He, Shouliang Qi, Yudong Yao, Yueyang Teng

Research output: Contribution to journal › Article › peer-review

13 Scopus citations

Abstract

Nonnegative matrix factorization (NMF) has been widely used to learn low-dimensional representations of data. However, NMF pays equal attention to all attributes of a data point, which inevitably leads to inaccurate representations. For example, in a human-face dataset, if an image shows a person wearing a hat, the hat should be removed, or the importance of its corresponding attributes should be decreased, during matrix factorization. This article proposes a new type of NMF called entropy weighted NMF (EWNMF), which uses an optimizable weight for each attribute of each data point to quantify its importance. This is achieved by adding an entropy regularizer to the cost function and then solving the resulting problem with the Lagrange multiplier method. Experimental results on several datasets demonstrate the feasibility and effectiveness of the proposed method. The code developed in this study is available at https://github.com/Poisson-EM/Entropy-weighted-NMF.
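The abstract describes an alternating scheme: attribute weights follow in closed form from the entropy regularizer (via Lagrange multipliers), and the factors are then updated under those weights. The sketch below illustrates one plausible form of such a loop, assuming a weighted Frobenius cost plus an entropy term with per-sample weight normalization; the variable names (U, V, W), the regularization parameter gamma, and the update order are illustrative assumptions, not the authors' exact formulation, which is given in the paper and the linked repository.

```python
# Minimal sketch of an entropy-weighted NMF loop (assumed objective):
#   sum_ij W_ij * (X - U V)_ij^2  +  gamma * sum_ij W_ij * ln(W_ij),
# with the weights of each sample's attributes summing to 1.
import numpy as np

def ewnmf_sketch(X, rank, gamma=0.1, n_iter=200, eps=1e-12, seed=0):
    """Factor X (attributes x samples) as U @ V with per-entry weights W."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, rank))
    V = rng.random((rank, n))
    for _ in range(n_iter):
        # Per-entry squared residuals of the current reconstruction.
        E = (X - U @ V) ** 2
        # Closed-form weight update from the entropy regularizer:
        # W_ij proportional to exp(-E_ij / gamma), normalized per sample (column).
        # Shifting by the column minimum keeps the exponentials numerically stable.
        W = np.exp(-(E - E.min(axis=0, keepdims=True)) / gamma)
        W /= W.sum(axis=0, keepdims=True)
        # Standard weighted multiplicative updates for the nonnegative factors.
        WX = W * X
        WUV = W * (U @ V)
        U *= (WX @ V.T) / (WUV @ V.T + eps)
        WUV = W * (U @ V)
        V *= (U.T @ WX) / (U.T @ WUV + eps)
    return U, V, W

if __name__ == "__main__":
    X = np.random.default_rng(1).random((50, 30))
    U, V, W = ewnmf_sketch(X, rank=5)
    print("weighted reconstruction error:",
          float((W * (X - U @ V) ** 2).sum()))
```

Here gamma controls how sharply the weights concentrate on well-reconstructed attributes: a small gamma pushes each sample's weight mass toward its best-fit entries, while a large gamma approaches uniform weighting, recovering ordinary weighted NMF behavior.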

Original language: English
Pages (from-to): 5381-5391
Number of pages: 11
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 34
Issue number: 9
DOIs
State: Published - 1 Sep 2023

Keywords

  • Clustering
  • entropy regularizer
  • low-dimensional representation
  • nonnegative matrix factorization (NMF)
