Erik Lennart Nijkamp

About. I’m a Research Scientist at Salesforce Research. Formerly, I was a Ph.D. candidate in the Center for Vision, Cognition, Learning and Autonomy (VCLA) at UCLA, advised by Prof. Song-Chun Zhu and Prof. Ying Nian Wu. I have also spent time at IBM Research, Google Research, and Salesforce Einstein Research. My research is generously supported by the UCLA DYF Fellowship, an XSEDE (Extreme Science and Engineering Discovery Environment) grant, and an NVIDIA GPU grant.

Research interests. Representation Learning, Generative Models, Unsupervised Learning, Energy-Based Models, Variational Approximation, Computer Vision, Natural Language Processing.

Research themes. The governing themes of our research are to (i) advance and establish energy-based models (EBMs) and (ii) increase the sample efficiency of learning large language models (LLMs):

(1) Latent space modelling and sampling.

(2) Variations of MCMC-based learning (a minimal sampling sketch follows this list).

(3) Joint training of EBMs without resorting to MCMC.

(4) Sample-efficient learning of large language models.
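
To illustrate theme (2), here is a minimal, self-contained sketch of short-run (noise-initialized, non-persistent) Langevin MCMC toward a toy energy-based model, in the spirit of the NeurIPS 2019 paper listed below. The toy network, step size, and number of steps are illustrative assumptions, not settings from the papers.

  # Sketch of short-run Langevin MCMC toward an EBM (hypothetical toy settings).
  import torch
  import torch.nn as nn

  class ToyEnergy(nn.Module):
      # Placeholder energy network; the papers use deep ConvNets on images.
      def __init__(self, dim=2):
          super().__init__()
          self.net = nn.Sequential(nn.Linear(dim, 128), nn.SiLU(), nn.Linear(128, 1))

      def forward(self, x):
          return self.net(x).squeeze(-1)  # scalar energy per sample

  def short_run_langevin(energy, x, n_steps=25, step_size=0.1):
      """K-step Langevin sampler, re-initialized from noise at every call."""
      x = x.clone().detach().requires_grad_(True)
      for _ in range(n_steps):
          e = energy(x).sum()
          grad = torch.autograd.grad(e, x)[0]
          # Langevin update: gradient step on the energy plus Gaussian noise.
          x = x - 0.5 * step_size**2 * grad + step_size * torch.randn_like(x)
          x = x.detach().requires_grad_(True)
      return x.detach()

  energy = ToyEnergy()
  x0 = torch.randn(64, 2)            # fresh noise initialization each time
  samples = short_run_langevin(energy, x0)

  # Contrastive maximum-likelihood-style update for p(x) proportional to exp(-E(x)):
  # lower the energy of observed data, raise the energy of the short-run samples.
  data = torch.randn(64, 2)          # stand-in for real training data
  loss = energy(data).mean() - energy(samples).mean()
  loss.backward()

The last two lines sketch the learning gradient that pairs with the sampler; in the papers the same short-run chains also double as the model's synthesis mechanism.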

Selected publications.

  1. arXiv
    Learning Energy-based Model with Flow-based Backbone by Neural Transport MCMC
    Nijkamp, Erik, Gao, Ruiqi, Sountsov, Pavel, Vasudevan, Srinivas, Pang, Bo, Zhu, Song-Chun, and Wu, Ying Nian
    arXiv preprint arXiv:2006.06897 2020
  2. ECCV Spotlight
    Learning Multi-layer Latent Variable Model via Variational Optimization of Short Run MCMC for Approximate Inference
    Nijkamp, Erik, Pang, Bo, Han, Tian, Zhou, Linqi, Zhu, Song-Chun, and Wu, Ying Nian
    European Conference on Computer Vision (ECCV) 2020
  3. AAAI Oral
    On the Anatomy of MCMC-based Maximum Likelihood Learning of Energy-Based Models
    Nijkamp, Erik, Hill, Mitch, Han, Tian, Zhu, Song-Chun, and Wu, Ying Nian
    Association for the Advancement of Artificial Intelligence (AAAI) 2020
  4. NeurIPS
    Learning Latent Space Energy-Based Prior Model
    Pang, Bo*, Han, Tian*, Nijkamp, Erik*, Zhu, Song-Chun, and Wu, Ying Nian
    Advances in Neural Information Processing Systems (NeurIPS) 2020
  5. CVPR Oral
    Flow Contrastive Estimation of Energy-Based Models
    Gao, Ruiqi, Nijkamp, Erik, Kingma, Diederik P, Xu, Zhen, Dai, Andrew M, and Wu, Ying Nian
    Conference on Computer Vision and Pattern Recognition (CVPR) 2020
  6. CVPR
    Joint Training of Variational Auto-Encoder and Latent Energy-Based Model
    Han, Tian, Nijkamp, Erik, Zhou, Linqi, Pang, Bo, Zhu, Song-Chun, and Wu, Ying Nian
    Conference on Computer Vision and Pattern Recognition (CVPR) 2020
  7. CVPR Oral
    Divergence Triangle for Joint Training of Generator model, Energy-based model, and Inferential model
    Han, Tian*, Nijkamp, Erik*, Fang, Xiaolin, Hill, Mitch, Zhu, Song-Chun, and Wu, Ying Nian
    Conference on Computer Vision and Pattern Recognition (CVPR) 2019
  8. NeurIPS
    Learning Non-convergent Non-persistent Short-run MCMC toward Energy-Based Model
    Nijkamp, Erik, Hill, Mitch, Zhu, Song-Chun, and Wu, Ying Nian
    Advances in Neural Information Processing Systems (NeurIPS) 2019