580@2021@IJCAI

Total: 1

#1 On the Parameterized Complexity of Polytree Learning

Authors: Niels Grüttemeier; Christian Komusiewicz; Nils Morawietz

A Bayesian network is a directed acyclic graph that represents statistical dependencies between the variables of a joint probability distribution. A fundamental task in data science is to learn a Bayesian network from observed data. Polytree Learning is the problem of learning an optimal Bayesian network that fulfills the additional property that its underlying undirected graph is a forest. In this work, we revisit the complexity of Polytree Learning. We show that Polytree Learning can be solved in single-exponential FPT time when parameterized by the number of variables. Moreover, we consider the influence of d, the number of variables that may receive a nonempty parent set in the final DAG, on the complexity of Polytree Learning. We show that Polytree Learning is presumably not fixed-parameter tractable when parameterized by d, in contrast to Bayesian network learning, which is fixed-parameter tractable for this parameter. Finally, we show that if both d and the maximum parent set size are bounded, then efficient algorithms can be obtained.
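
The polytree property mentioned in the abstract can be stated concretely: a candidate DAG is a polytree exactly when its underlying undirected graph (its skeleton) contains no cycle, i.e., is a forest. The following minimal Python sketch, not taken from the paper, illustrates this check for a DAG given as a hypothetical parent-set mapping, using union-find to detect an undirected cycle.

```python
# Minimal sketch (not from the paper): a DAG is a polytree iff its skeleton
# (the underlying undirected graph) is a forest. The DAG is given as a dict
# mapping each variable to its parent set; variable names are hypothetical.

def is_polytree(parents):
    """Return True iff the DAG defined by `parents` has a forest skeleton."""
    # Union-find over the variables; adding a skeleton edge inside an
    # existing component would close an undirected cycle.
    root = {v: v for v in parents}

    def find(v):
        while root[v] != v:
            root[v] = root[root[v]]   # path halving
            v = root[v]
        return v

    for child, parent_set in parents.items():
        for parent in parent_set:
            a, b = find(child), find(parent)
            if a == b:
                return False          # undirected cycle -> not a polytree
            root[a] = b
    return True


# Example: X -> Z <- Y is a polytree; additionally adding X -> Y closes a
# cycle in the skeleton, so the second DAG is not a polytree.
print(is_polytree({"X": set(), "Y": set(), "Z": {"X", "Y"}}))   # True
print(is_polytree({"X": set(), "Y": {"X"}, "Z": {"X", "Y"}}))   # False
```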