To join via Zoom: To attend this seminar virtually, please register here.
Abstract: Probabilistic circuits are a recent development in machine learning that sit at the intersection of neural networks, traditional probabilistic models such as mixture models, and propositional logic. In contrast to traditional representations of probabilistic models, circuits use a low-level representation built from simple arithmetic operations (sums and products) and allow us to guarantee the tractability (exact and efficient computation) of certain inference queries based on structural properties of the circuit itself. As a result, they have become a valuable tool for reasoning about classes of distributions over which inference can be performed tractably. In this talk, I will briefly review probabilistic circuits and showcase recent advancements. First, I will introduce probabilistic circuits as a representational tool for tractable probabilistic inference, with a focus on the broader picture. Second, I will discuss recent endeavours to extend probabilistic circuits to model more complex probability distributions. Here, I will focus on work aimed at representing distributions that do not admit a closed-form density, and on recent work on circuits that can subtract density without violating non-negativity constraints, i.e., mixtures with negative weights.
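To make the idea of "simple arithmetic operations with tractability guarantees" concrete, here is a minimal illustrative sketch (not from the talk; all names and parameters are hypothetical): a circuit with one sum node over two product nodes, each a product of univariate Gaussian leaves. Because the sum node mixes distributions over the same variables (smoothness) and the product nodes factorize over disjoint variables (decomposability), both density evaluation and marginalization are single feed-forward passes through the circuit.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Univariate Gaussian leaf density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Illustrative circuit over (x0, x1):
#   0.3 * N(x0; 0, 1) * N(x1; 0, 1)  +  0.7 * N(x0; 2, 1) * N(x1; 2, 1)
WEIGHTS = [0.3, 0.7]                      # sum-node weights (non-negative, sum to 1)
PARAMS = [((0.0, 1.0), (0.0, 1.0)),       # (mu, sigma) per leaf, per component
          ((2.0, 1.0), (2.0, 1.0))]

def density(x0, x1):
    # Feed-forward pass: leaves -> product nodes -> sum node.
    return sum(w * gaussian_pdf(x0, *p0) * gaussian_pdf(x1, *p1)
               for w, (p0, p1) in zip(WEIGHTS, PARAMS))

def marginal_x0(x0):
    # Marginalizing x1 replaces its leaf by 1 (it integrates to one);
    # decomposability and smoothness make this exact and linear-time.
    return sum(w * gaussian_pdf(x0, *p0)
               for w, (p0, _p1) in zip(WEIGHTS, PARAMS))
```

Note that replacing a leaf by its integral is all that marginalization requires here; in an unstructured model the same query would need an explicit (and generally intractable) integration. The "negative weights" mentioned above would relax the non-negativity of `WEIGHTS`, which is exactly where maintaining a valid density becomes non-trivial.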