Mind change optimal learning of Bayes net structure

Schulte, Oliver, Luo, Wei and Greiner, Russell (2007). Mind change optimal learning of Bayes net structure. In: J. G. Carbonell and J. Siekmann (Eds.), Learning Theory: Proceedings of the 20th Annual Conference on Learning Theory, COLT 2007. The Twentieth Annual Conference on Learning Theory (COLT 2007) as part of the 2007 Federated Computing Research Conference (FCRC), San Diego, California, USA, (pp. 187-202). 13-15 June, 2007. doi:10.1007/978-3-540-72927-3_15


Author Schulte, Oliver
Luo, Wei
Greiner, Russell
Title of paper Mind change optimal learning of Bayes net structure
Conference name The Twentieth Annual Conference on Learning Theory (COLT 2007) as part of the 2007 Federated Computing Research Conference (FCRC)
Conference location San Diego, California, USA
Conference dates 13-15 June, 2007
Proceedings title Learning Theory: Proceedings of the 20th Annual Conference on Learning Theory, COLT 2007
Journal name Lecture Notes in Computer Science
Place of Publication Berlin / Heidelberg, Germany
Publisher Springer
Publication Year 2007
Sub-type Fully published paper
DOI 10.1007/978-3-540-72927-3_15
ISBN 9783540729259
ISSN 0302-9743
1611-3349
Editor J. G. Carbonell
J. Siekmann
Volume 4539
Start page 187
End page 202
Total pages 16
Language eng
Abstract/Summary This paper analyzes the problem of learning the structure of a Bayes net (BN) in the theoretical framework of Gold’s learning paradigm. Bayes nets are one of the most prominent formalisms for knowledge representation and probabilistic and causal reasoning. We follow constraint-based approaches to learning Bayes net structure, where learning is based on observed conditional dependencies between variables of interest (e.g., “X is dependent on Y given any assignment to variable Z”). Applying learning criteria in this model leads to the following results. (1) The mind change complexity of identifying a Bayes net graph over variables V from dependency data is (|V| choose 2), the maximum number of edges. (2) There is a unique fastest mind-change optimal Bayes net learner; convergence speed is evaluated using Gold’s dominance notion of “uniformly faster convergence”. This learner conjectures a graph if it is the unique Bayes net pattern that satisfies the observed dependencies with a minimum number of edges, and outputs “no guess” otherwise. Therefore we are using standard learning criteria to define a natural and novel Bayes net learning algorithm. We investigate the complexity of computing the output of the fastest mind-change optimal learner, and show that this problem is NP-hard (assuming P ≠ NP). To our knowledge this is the first NP-hardness result concerning the existence of a uniquely optimal Bayes net structure.
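The decision rule sketched in the abstract — conjecture a graph only when it is the unique minimum-edge structure consistent with the observed dependencies, and output “no guess” otherwise — can be illustrated with a small toy sketch. This is not the authors' algorithm: it brute-forces undirected graphs and models a dependency statement (X, Y, Z) as "X and Y are connected by a path avoiding the conditioning set Z", a simplified stand-in for d-connection in a Bayes net pattern. All names below are hypothetical.

```python
# Toy sketch of the "unique minimum-edge pattern" rule from the abstract.
# Assumption: a dependency (x, y, blocked) is treated as requiring a path
# between x and y that avoids the nodes in `blocked` (a crude stand-in
# for d-connection; the paper works with Bayes net patterns).
from itertools import combinations


def connected_avoiding(edges, x, y, blocked):
    """True if x and y are joined by a path that avoids `blocked`."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    if x in blocked or y in blocked:
        return False
    seen, stack = {x}, [x]
    while stack:
        u = stack.pop()
        if u == y:
            return True
        for w in adj.get(u, ()):
            if w not in seen and w not in blocked:
                seen.add(w)
                stack.append(w)
    return False


def fastest_learner(variables, dependencies):
    """Conjecture the unique minimum-edge graph satisfying all observed
    dependencies; output 'no guess' if it is not unique (or none exists)."""
    all_edges = list(combinations(sorted(variables), 2))
    for k in range(len(all_edges) + 1):  # smallest edge count first
        fits = [frozenset(es)
                for es in combinations(all_edges, k)
                if all(connected_avoiding(es, x, y, z)
                       for (x, y, z) in dependencies)]
        if fits:
            return fits[0] if len(fits) == 1 else "no guess"
    return "no guess"
```

For example, the single dependency ("a", "b", ∅) has the one-edge graph {a-b} as its unique minimal explanation, whereas the three pairwise dependencies among a, b, c are each explained by any of three two-edge paths, so the learner refuses to guess. The exhaustive search also reflects why computing this learner's output is hard in general, as the NP-hardness result in the paper indicates.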
Subjects 0899 Other Information and Computing Sciences
Q-Index Code E1
Q-Index Status Provisional Code
Institutional Status Non-UQ

Created: Tue, 09 Mar 2010, 08:43:17 EST by Ms Lynette Adams on behalf of Faculty Of Engineering, Architecture & Info Tech