Revisiting Influence Functions for Latent Variable Models using Variational Bayes

Published: 19 Mar 2025, Last Modified: 25 Apr 2025 · AABI 2025 Workshop Track · CC BY 4.0
Keywords: bayes, influence, sensitivity
TL;DR: We study the sensitivity of latent variable models to perturbations of their training data.
Abstract: Quantifying a model's sensitivity to data is a key tool for model criticism and interpretability. Influence functions are the de facto method for estimating such quantities. Latent variable models are ubiquitous in modern ML (e.g. mixtures of experts, deep generative models), but estimating the influence of individual data points in them can be challenging due to the rigid dependence structure between observed and latent variables. In previous work, Zhu and Lee (2001) proposed taking a Newton step on a surrogate function inspired by the Expectation-Maximization (EM) algorithm. This exploits the model's structure to decouple the effect of data perturbations, so that the influence on different parameters can be measured separately. We present a generalization of this approach through the lens of Variational Bayes that does not have the restrictions of EM and can be used in a wide variety of settings.
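To make the influence-function recipe referenced in the abstract concrete, here is a minimal sketch of the classic approximation: the parameter change from removing one training point is estimated as an inverse-Hessian-times-gradient (i.e. one Newton step), illustrated on ridge-regularized linear regression. This is only the generic recipe, not the paper's EM surrogate or its variational-Bayes generalization; all names and the toy data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, lam = 200, 3, 1e-2
X = rng.normal(size=(n, d))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=n)

# Fit theta minimizing 0.5*||y - X theta||^2 + 0.5*lam*||theta||^2.
H = X.T @ X + lam * np.eye(d)            # Hessian of the total loss
theta = np.linalg.solve(H, X.T @ y)

# Influence approximation: removing point i shifts the optimum by
# roughly H^{-1} g_i, where g_i is the gradient of point i's loss.
i = 0
g_i = -(y[i] - X[i] @ theta) * X[i]      # grad of 0.5*(y_i - x_i^T theta)^2
dtheta = np.linalg.solve(H, g_i)         # one Newton step

# Compare against the exact leave-one-out refit.
mask = np.ones(n, dtype=bool)
mask[i] = False
H_loo = X[mask].T @ X[mask] + lam * np.eye(d)
theta_loo = np.linalg.solve(H_loo, X[mask].T @ y[mask])
approx_ok = np.allclose(theta + dtheta, theta_loo, atol=1e-2)
```

In latent variable models the Hessian of the marginal likelihood is expensive and entangles all parameters; the paper's point is that a surrogate (EM-style, or more generally variational) restores a decoupled structure so such steps can be taken per parameter block.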
Submission Number: 30