Author
Tatiana Shpakova
Publication date
2019/2/21
Institution
Université Paris sciences et lettres
Description
Probabilistic graphical models encode hidden dependencies between random variables for data modelling. Parameter estimation is a crucial part of handling such probabilistic models. These very general models have been used in many fields, such as computer vision, signal processing, and natural language processing. We mostly focused on log-supermodular models, a specific subclass of exponential-family distributions in which the potential function is assumed to be the negative of a submodular function. This property is convenient for maximum a posteriori estimation and parameter learning. Despite this apparent restriction, the model covers a broad part of exponential families, since many functions are submodular, e.g., graph cuts, entropy and others. Probabilistic treatment is challenging for most models; however, we were able to tackle some of the challenges, at least approximately. In this manuscript, we exploit perturb-and-MAP ideas for partition function approximation and efficient parameter learning. Moreover, the problem can also be interpreted as a structure learning task, where each estimated parameter or weight represents the importance of the corresponding term. We propose a way of performing approximate parameter estimation and inference for models where exact learning and inference are intractable in the general case due to the complexity of computing the partition function. The first part of the thesis is dedicated to theoretical guarantees. Given the log-supermodular models, we take advantage of the efficient minimization property related to submodularity. Introducing and comparing two existing upper bounds of the partition …
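As a rough illustration of the perturb-and-MAP idea mentioned above: for a log-supermodular model p(x) ∝ exp(-f(x)) with f submodular, the Gumbel-perturbation approach gives an upper bound on the log-partition function, log Z ≤ E_γ[max_x {-f(x) + Σ_i γ_i(x_i)}], with i.i.d. zero-mean Gumbel noise γ_i(x_i). The sketch below is a minimal, hypothetical example (not the thesis code): it uses a tiny graph-cut function as f and a brute-force MAP step, whereas in practice the perturbed-MAP problems would be solved by efficient submodular minimization (e.g., min-cut).

```python
# Minimal sketch of a Gumbel perturb-and-MAP estimate of an upper bound
# on log Z for a toy log-supermodular model p(x) ∝ exp(-f(x)), x ∈ {0,1}^n.
# The cut function, brute-force MAP, and all names here are illustrative.

import itertools
import numpy as np

def cut_function(x, weights):
    """Submodular graph-cut value of a binary labelling x."""
    n = len(x)
    return sum(weights[i, j] for i in range(n) for j in range(i + 1, n)
               if x[i] != x[j])

def exact_log_partition(f, n):
    """Brute-force log Z = log sum_x exp(-f(x)), feasible only for small n."""
    vals = [-f(np.array(x)) for x in itertools.product([0, 1], repeat=n)]
    return float(np.log(np.sum(np.exp(vals))))

def perturb_and_map_bound(f, n, num_samples=200, seed=0):
    """Monte-Carlo estimate of E_gamma[ max_x { -f(x) + sum_i gamma_i(x_i) } ],
    which upper-bounds log Z when gamma_i(x_i) are i.i.d. zero-mean Gumbel."""
    rng = np.random.default_rng(seed)
    configs = [np.array(x) for x in itertools.product([0, 1], repeat=n)]
    estimates = []
    for _ in range(num_samples):
        # zero-mean Gumbel noise, one value per variable and per state
        gamma = rng.gumbel(loc=-np.euler_gamma, scale=1.0, size=(n, 2))
        # brute-force perturbed MAP; a real implementation would use
        # submodular minimization (e.g., a min-cut solver) here
        best = max(-f(x) + gamma[np.arange(n), x].sum() for x in configs)
        estimates.append(best)
    return float(np.mean(estimates))

if __name__ == "__main__":
    n = 4
    weights = np.triu(np.random.default_rng(0).uniform(0.0, 1.0, (n, n)), k=1)
    f = lambda x: cut_function(x, weights)
    print("exact log Z    :", exact_log_partition(f, n))
    print("perturb-and-MAP:", perturb_and_map_bound(f, n))
```

On a problem this small the two printed values should be close, with the perturb-and-MAP estimate sitting slightly above the exact log Z, consistent with its role as an upper bound.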
Total citations