Months ago, I wrote a post about Bromberg and Margaritis's work on graphical model structure learning. Here is another paper of theirs, "Efficient and Robust Independence-Based Markov Network Structure Discovery" [IJCAI-07], which I find even more interesting.
The paper makes two main contributions. The first is making constraint-based structure learning more efficient, by always choosing the most informative and least expensive test to perform. The second is making the structure learning robust to unreliable test results. This second goal is the same as in their other work discussed in my previous post, but here they use a totally different method: they maintain a posterior over structures given the test results, instead of simply accepting or rejecting structures as traditional constraint-based methods do. The catch is that there are exponentially many structures, so you can't keep track of all of them. Their solution: a particle filter!
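To make the idea concrete, here is a minimal sketch of a particle filter over structures. It is not the paper's actual algorithm: I'm assuming a toy likelihood in which each independence test has a flat error rate `eps`, whereas the paper derives test likelihoods more carefully, and the function name and parameters are my own. Particles are edge sets of a Markov network; each test result reweights them, and we then resample.

```python
import random
from itertools import combinations

def particle_filter_structures(n_vars, test_results, n_particles=200,
                               eps=0.1, seed=0):
    """Toy sequential importance resampling over Markov network structures.

    Particles are frozensets of undirected edges. Each element of
    test_results is ((i, j), independent): a pairwise independence test
    on variables i and j and its (possibly unreliable) outcome.
    """
    rng = random.Random(seed)
    all_edges = list(combinations(range(n_vars), 2))
    # Initialize particles from a uniform prior over edge sets.
    particles = [frozenset(e for e in all_edges if rng.random() < 0.5)
                 for _ in range(n_particles)]
    weights = [1.0] * n_particles
    for (i, j), independent in test_results:
        edge = (min(i, j), max(i, j))
        for k, s in enumerate(particles):
            # A structure containing the edge predicts dependence,
            # up to the assumed test error rate eps.
            p_dep = 1.0 - eps if edge in s else eps
            weights[k] *= (1.0 - p_dep) if independent else p_dep
        # Multinomial resampling proportional to weight.
        total = sum(weights)
        particles = rng.choices(particles,
                                weights=[w / total for w in weights],
                                k=n_particles)
        weights = [1.0] * n_particles
    return particles

# Posterior probability that edge (0, 1) exists after two tests
# that both report "dependent":
post = particle_filter_structures(3, [((0, 1), False), ((0, 1), False)])
frac = sum(1 for s in post if (0, 1) in s) / len(post)
```

After two "dependent" outcomes the particles containing edge (0, 1) should dominate, since each test multiplies their weight by (1 − eps)/eps relative to the rest. Note that a naive filter like this degenerates quickly on larger problems, which is exactly why the paper's construction is more involved.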
Recall that there are two types of methods for structure learning: score-based and constraint-based (a.k.a. independence-based). Although this method is still constraint-based (it relies on independence tests), I feel it also resembles score-based methods: by using a posterior instead of hard constraints, it turns the problem into something closer to an optimization problem than a constraint satisfaction problem.
I'm currently refreshing and supplementing my knowledge of particle filters, and perhaps my next post will be about them.
Wednesday, December 10, 2008