Formalizing Learning & Decision

Author: Ysgc | Published 2020-01-17 12:21
naive Bayes vs. fully connected Bayes net

Area(full Bayes) > Area(naive Bayes) — the fully connected net's hypothesis space strictly contains the naive Bayes one, since naive Bayes assumes the features are conditionally independent given the class
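One way to see the size difference is to count free parameters. The sketch below is my own illustration (not from the notes), for a class variable with k values and n binary features: naive Bayes grows linearly in n, while the full joint grows exponentially.

```python
# Hypothetical parameter-count comparison (my illustration, not the notes').
def naive_bayes_params(n, k):
    # P(y): k-1 free parameters; P(x_i | y): one Bernoulli parameter
    # per binary feature per class value.
    return (k - 1) + n * k

def full_joint_params(n, k):
    # Full joint P(y, x_1..x_n): one probability per configuration, minus 1.
    return k * 2 ** n - 1

print(naive_bayes_params(10, 2))  # 21
print(full_joint_params(10, 2))   # 2047
```

The exponential blow-up of the full joint is why the larger hypothesis space comes at the cost of needing far more data to estimate.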


best -> the attribute we should split on (according to the heuristic)
tree -> recursively build a subtree on that attribute best
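The two-line loop above can be sketched as a recursive function. This is a minimal sketch under my own naming (`id3`, `heuristic` are assumptions, not from the notes); the heuristic that scores attributes is passed in as a parameter.

```python
from collections import Counter

def id3(rows, labels, attrs, heuristic):
    """Sketch of the loop in the notes: pick the best attribute by the
    heuristic, then build one subtree per value of that attribute."""
    if len(set(labels)) <= 1 or not attrs:           # pure or unexpandable node
        return Counter(labels).most_common(1)[0][0]  # return the majority label
    # best -> the attribute the heuristic scores highest
    best = max(attrs, key=lambda a: heuristic(rows, labels, a))
    remaining = [a for a in attrs if a != best]
    # tree -> one recursively built subtree per observed value of `best`
    tree = {}
    for value in {row[best] for row in rows}:
        idx = [i for i, row in enumerate(rows) if row[best] == value]
        tree[value] = id3([rows[i] for i in idx],
                          [labels[i] for i in idx], remaining, heuristic)
    return (best, tree)
```

With an entropy-based heuristic plugged in, this is the standard ID3 recursion.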

the left split is better! -> less uncertainty -> lower entropy!

entropy -> the amount of uncertainty of our distribution

equivalently, the most efficient way to encode samples from it (the minimum expected code length)
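The definition behind both readings is H(p) = -Σᵢ pᵢ log₂ pᵢ, measured in bits. A minimal implementation:

```python
import math

def entropy(p):
    """Shannon entropy of a distribution p, in bits.
    Zero-probability outcomes contribute nothing (0 * log 0 := 0)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(entropy([0.5, 0.5]))  # 1.0 bit: a fair coin, maximal uncertainty
print(entropy([0.9, 0.1]))  # ~0.47 bits: a skewed coin, less uncertainty
```

Lower entropy means a more predictable distribution, which is exactly why the less uncertain split above is preferred.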

should split on cylinder (the attribute whose split reduces entropy the most)
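The reduction in entropy is the information gain. The dataset behind the notes' figure is not shown, so the toy rows below are hypothetical; they only illustrate how one attribute can win the comparison.

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy (bits) of an empirical label distribution.
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    # Entropy before the split minus the weighted entropy after it.
    n = len(labels)
    remainder = 0.0
    for v in {r[attr] for r in rows}:
        sub = [l for r, l in zip(rows, labels) if r[attr] == v]
        remainder += len(sub) / n * entropy(sub)
    return entropy(labels) - remainder

# Hypothetical toy data (not the notes' actual example):
rows = [{'cylinders': 4, 'maker': 'asia'},
        {'cylinders': 4, 'maker': 'europe'},
        {'cylinders': 8, 'maker': 'asia'},
        {'cylinders': 8, 'maker': 'europe'}]
labels = ['good', 'good', 'bad', 'bad']

print(info_gain(rows, labels, 'cylinders'))  # 1.0: splits the labels perfectly
print(info_gain(rows, labels, 'maker'))      # 0.0: tells us nothing here
```

The heuristic picks the attribute with the highest gain, here `cylinders`.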

an unexpandable node (no attributes left to split on) may still contain samples with mixed labels -> the outcome there is stochastic

go all the way to the bottom -> overfit
-> walk back and prune

why go all the way to the bottom?
because sometimes we need multiple features jointly to determine the result
e.g. XOR
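The XOR point can be checked directly: each single feature has zero information gain on its own, yet the two features together determine the label exactly, so a greedy tree must grow deeper before pruning back. A small self-contained check:

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

rows = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [a ^ b for a, b in rows]  # XOR of the two features

def info_gain(attr_index):
    # Entropy before splitting on one feature minus the weighted entropy after.
    n = len(labels)
    remainder = 0.0
    for v in (0, 1):
        sub = [l for r, l in zip(rows, labels) if r[attr_index] == v]
        remainder += len(sub) / n * entropy(sub)
    return entropy(labels) - remainder

print(info_gain(0))  # 0.0 -> the first feature alone looks useless
print(info_gain(1))  # 0.0 -> so does the second
# ...but after splitting on feature 0, feature 1 becomes fully informative,
# which is only discovered one level deeper in the tree.
```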

test-set error reduced by pruning: 21% -> 16%


Original link: https://www.haomeiwen.com/subject/vqeczctx.html