Erik Lennart Nijkamp

Research interests: generative models, representation learning, unsupervised learning, computer vision, and natural language processing.

Research experience: UCLA, IBM Research, Google AI & Brain, and Salesforce Research (Einstein).

Reach out: Twitter · GitHub · LinkedIn · Mail

Preprints

Learning Energy-Based Model with Flow-Based Backbone by Neural Transport MCMC.
Erik Nijkamp*, Ruiqi Gao*, Pavel Sountsov, Srinivas Vasudevan, Bo Pang, Song-Chun Zhu, and Ying Nian Wu.
Under Review, 2020.

Generative Text Modeling through Short Run Inference.
Bo Pang*, Erik Nijkamp*, Tian Han, and Ying Nian Wu.
Under Review, 2020.

A Generative Model for Sampling High-Performance and Diverse Weights for Neural Networks.
Lior Deutsch*, Erik Nijkamp*, Song-Chun Zhu, and Ying Nian Wu.
Under Review, 2019.

Selected Publications

Learning Latent Space Energy-Based Prior Model.
Bo Pang*, Tian Han*, Erik Nijkamp*, Song-Chun Zhu, and Ying Nian Wu.
NeurIPS, 2020.

Learning Multi-layer Latent Variable Model via Variational Optimization of Short Run MCMC for Approximate Inference.
Erik Nijkamp*, Bo Pang*, Tian Han, and Ying Nian Wu.
ECCV, 2020.

Towards Holistic and Automatic Evaluation of Open-Domain Dialogue Generation.
Bo Pang, Erik Nijkamp, Tian Han, and Ying Nian Wu.
ACL, 2020.

Flow Contrastive Estimation of Energy-Based Models.
Ruiqi Gao, Erik Nijkamp, Diederik P. Kingma, Zhen Xu, Andrew M. Dai, and Ying Nian Wu.
CVPR, 2020.

Representation Learning: A Statistical Perspective.
Jianwen Xie, Ruiqi Gao, Erik Nijkamp, Song-Chun Zhu, and Ying Nian Wu.
Annual Review of Statistics and Its Application, 2019.

On the Anatomy of MCMC-based Maximum Likelihood Learning of Energy-Based Models.
Erik Nijkamp*, Mitch Hill*, Tian Han, Song-Chun Zhu, and Ying Nian Wu.
AAAI, 2020.

Flow Contrastive Estimation of Energy-Based Models.
Ruiqi Gao, Erik Nijkamp, Zhen Xu, Andrew M. Dai, Diederik P. Kingma, and Ying Nian Wu.
NeurIPS Workshop on Bayesian Deep Learning, 2019.

Learning Non-Convergent Non-Persistent Short-Run MCMC Toward Energy-Based Model.
Erik Nijkamp, Mitch Hill, Song-Chun Zhu, and Ying Nian Wu.
NeurIPS, 2019.

Divergence Triangle for Joint Training of Generator Model, Energy-based Model, and Inference Model.
Tian Han*, Erik Nijkamp*, Xiaolin Fang, Mitch Hill, Song-Chun Zhu, and Ying Nian Wu.
CVPR, 2019.

Building a Telescope to Look Into High-Dimensional Image Spaces.
Mitch Hill, Erik Nijkamp, and Song-Chun Zhu.
Quarterly of Applied Mathematics (QAM), 2018.

Generative Hierarchical Learning of Sparse FRAME Models.
Jianwen Xie, Yifei Xu, Erik Nijkamp, Ying Nian Wu, and Song-Chun Zhu.
CVPR, 2017.

Large-Scale Matrix Factorization with Distributed Stochastic Gradient Descent.
Rainer Gemulla, Erik Nijkamp, Peter J. Haas, and Yannis Sismanis.
ACM KDD: 69-77, 2011.

MapReduce and PACT - Comparing Data Parallel Programming Models.
Alexander Alexandrov, Stephan Ewen, Max Heimel, Fabian Hueske, Odej Kao, Volker Markl, Erik Nijkamp, and Daniel Warneke.
BTW: 25-44, 2011.

Massively Parallel Data Analysis with PACTs on Nephele.
Alexander Alexandrov, Dominic Battré, Stephan Ewen, Max Heimel, Fabian Hueske, Odej Kao, Volker Markl, Erik Nijkamp, and Daniel Warneke.
PVLDB 3(2): 1625-1628, 2010.

BinRank: Scaling Dynamic Authority-Based Search Using Materialized SubGraphs.
Heasoo Hwang, Andrey Balmin, Berthold Reinwald, and Erik Nijkamp.
ICDE: 66-77, 2009.

DBPubs: Multidimensional exploration of database publications.
Akanksha Baid, Andrey Balmin, Heasoo Hwang, Erik Nijkamp, Jun Rao, Berthold Reinwald, Alkis Simitsis, Yannis Sismanis, and Frank van Ham.
PVLDB 1(2): 1456-1459, 2008.

Awards

Most Promising Statistician Award, 2020.
Dissertation Year Fellowship, 2020.
Extreme Science and Engineering Discovery Environment (XSEDE) Grant for Supercomputing, 2019.
IKT Innovativ Award, Federal Ministry of Economics and Technology, 2012.
Pat Goldberg Memorial Best Paper Award, IBM Research, 2011.

Patents

Method for Testing an Application, DE102012223587, 2014.
Scaling Dynamic Authority-Based Search Using Materialized Subgraphs, US20100223266, 2010.