TY - GEN
T1 - Learning Hierarchical Features with Joint Latent Space Energy-Based Prior
AU - Cui, Jiali
AU - Wu, Ying Nian
AU - Han, Tian
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
AB - This paper studies the fundamental problem of multi-layer generator models in learning hierarchical representations. The multi-layer generator model, which consists of multiple layers of latent variables organized in a top-down architecture, tends to learn multiple levels of data abstraction. However, such multi-layer latent variables are typically parameterized as Gaussian, which can be less informative in capturing complex abstractions, resulting in limited success in hierarchical representation learning. On the other hand, the energy-based model (EBM) prior is known to be expressive in capturing data regularities, but it often lacks the hierarchical structure needed to capture different levels of representation. In this paper, we propose a joint latent space EBM prior model with multi-layer latent variables for effective hierarchical representation learning. We develop a variational joint learning scheme that seamlessly integrates an inference model for efficient inference. Our experiments demonstrate that the proposed joint EBM prior is effective and expressive in capturing hierarchical representations and modelling the data distribution.
UR - http://www.scopus.com/inward/record.url?scp=85177454637&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85177454637&partnerID=8YFLogxK
DO - 10.1109/ICCV51070.2023.00211
M3 - Conference contribution
AN - SCOPUS:85177454637
T3 - Proceedings of the IEEE International Conference on Computer Vision
SP - 2218
EP - 2227
BT - Proceedings - 2023 IEEE/CVF International Conference on Computer Vision, ICCV 2023
T2 - 2023 IEEE/CVF International Conference on Computer Vision, ICCV 2023
Y2 - 2 October 2023 through 6 October 2023
ER -