How Noise Contrastive Estimation Works, Part 6 (Machine Learning future)

Monodeep Mukherjee
2 min read · Apr 23, 2024
1. Noise-Contrastive Estimation for Multivariate Point Processes (arXiv)

Authors: Hongyuan Mei, Tom Wan, Jason Eisner

Abstract: The log-likelihood of a generative model often involves both positive and negative terms. For a temporal multivariate point process, the negative term sums over all the possible event types at each time and also integrates over all the possible times. As a result, maximum likelihood estimation is expensive. We show how to instead apply a version of noise-contrastive estimation, a general parameter estimation method with a less expensive stochastic objective. Our specific instantiation of this general idea works out in an interestingly non-trivial way and has provable guarantees for its optimality, consistency and efficiency. On several synthetic and real-world datasets, our method shows benefits: for the model to achieve the same level of log-likelihood on held-out data, our method needs considerably fewer function evaluations and less wall-clock time.
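To make the core trick concrete, here is a minimal sketch in Python of the generic binary-NCE objective the abstract builds on: each sample (real event or noise event) is classified as model-generated or noise-generated, so no sum over all event types and no integral over time is ever computed. The function name, toy data, and the use of precomputed per-sample log-intensities are illustrative assumptions on my part; the paper's actual instantiation for multivariate point processes is more involved.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def nce_loss(log_model_intensity, log_noise_intensity, is_real, noise_ratio):
    """Binary NCE loss: classify whether each (event type, time) sample
    came from the model's event stream or from the noise process.

    log_model_intensity : log lambda_theta(k, t) at each sampled event
    log_noise_intensity : log q(k, t) under the noise distribution
    is_real             : boolean mask, True for observed events
    noise_ratio         : number of noise samples per observed event
    """
    # Posterior log-odds that a sample is real rather than noise.
    logits = log_model_intensity - log_noise_intensity - np.log(noise_ratio)
    p_real = sigmoid(logits)
    # Cross-entropy on the real/noise labels; the expensive negative
    # term of the log-likelihood never appears.
    return -np.mean(np.where(is_real, np.log(p_real), np.log(1.0 - p_real)))

# Toy usage: 10 observed events plus 5 noise samples per event.
n_real, ratio = 10, 5
n_total = n_real * (1 + ratio)
log_lam = rng.normal(size=n_total)       # stand-in model log-intensities
log_q = rng.normal(size=n_total)         # stand-in noise log-intensities
labels = np.arange(n_total) < n_real     # first n_real samples are real
print(nce_loss(log_lam, log_q, labels, ratio))
```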

2. Noise Contrastive Estimation for Autoencoding-based One-Class Collaborative Filtering (arXiv)

Authors: Jin Peng Zhou, Ga Wu, Zheda Mai, Scott Sanner

Abstract: One-class collaborative filtering (OC-CF) is a common class of recommendation problem where only the positive class is explicitly observed (e.g., purchases, clicks). Autoencoder-based recommenders such as AutoRec and its variants demonstrate strong performance on many OC-CF benchmarks, but also empirically suffer from a strong popularity bias. While a careful choice of negative samples in the OC-CF setting can mitigate popularity bias, Negative Sampling (NS) is often better for training embeddings than for the end task itself. To address this, we propose a two-headed AutoRec that first trains an embedding layer via one head using Negative Sampling and then trains for the final task via the second head. While this NS-AutoRec improves results for AutoRec and outperforms many state-of-the-art baselines on OC-CF problems, we notice that Negative Sampling can still take a large amount of time to train. Since Negative Sampling is known to be a special case of Noise Contrastive Estimation (NCE), we adapt a recently proposed closed-form NCE solution for collaborative filtering to AutoRec, yielding NCE-AutoRec. Overall, we show that our novel two-headed AutoRec models (NCE-AutoRec and NS-AutoRec) successfully mitigate the popularity bias issue and maintain competitive performance in comparison to state-of-the-art recommenders on multiple real-world datasets.
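To visualize the two-headed architecture the abstract describes, here is a minimal PyTorch sketch. The class name, layer sizes, and head wiring are assumptions for illustration, not the authors' released code: the first head would be trained with an NCE or negative-sampling objective to shape the embedding layer, and the second head is then trained on top of the learned encoder for the final recommendation task.

```python
import torch
import torch.nn as nn

class TwoHeadedAutoRec(nn.Module):
    """Illustrative two-headed autoencoder for one-class collaborative
    filtering: one head for embedding training (NCE / negative sampling),
    one head for the final reconstruction task."""

    def __init__(self, n_items: int, hidden_dim: int = 256):
        super().__init__()
        # Shared encoder maps a user's item-interaction vector to an embedding.
        self.encoder = nn.Sequential(nn.Linear(n_items, hidden_dim), nn.Tanh())
        self.nce_head = nn.Linear(hidden_dim, n_items)   # embedding-training head
        self.task_head = nn.Linear(hidden_dim, n_items)  # final recommendation head

    def forward(self, interactions: torch.Tensor, head: str = "task") -> torch.Tensor:
        z = self.encoder(interactions)
        return self.nce_head(z) if head == "nce" else self.task_head(z)

# Toy usage: score a 1000-item catalog for a batch of 4 users.
model = TwoHeadedAutoRec(n_items=1000)
batch = torch.bernoulli(torch.full((4, 1000), 0.02))  # sparse implicit feedback
scores = model(batch, head="task")
print(scores.shape)  # torch.Size([4, 1000])
```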
