Foundations of Machine Learning [Part 06]

Foundations of Machine Learning
Lecture 6: Boosting
Mehryar Mohri
Courant Institute and Google Research
mohri@cims.nyu.edu

Weak Learning (Kearns and Valiant, 1994)

Definition: a concept class C is weakly PAC-learnable if there exist a (weak) learning algorithm L and γ > 0 such that:
• for all c ∈ C, all δ > 0, and all distributions D,
      Pr_{S∼D}[ R(h_S) ≤ 1/2 − γ ] ≥ 1 − δ,
• for samples S of size m = poly(1/δ) for a fixed polynomial.

Boosting Ideas

Main idea: use a weak learner to create a strong learner.
Ensemble method: combine base classifiers returned by the weak learner.
Finding simple, relatively accurate base classifiers is often not hard. But how should the base classifiers be combined?

AdaBoost (Freund and Schapire, 1997)

Base classifier set H ⊆ {−1, +1}^X.

AdaBoost(S = ((x_1, y_1), …, (x_m, y_m)))
    for i ← 1 to m do
        D_1(i) ← 1/m
    for t ← 1 to T do
        h_t ← base classifier in H with small error ε_t = Pr_{i∼D_t}[h_t(x_i) ≠ y_i]
        α_t ← (1/2) log((1 − ε_t)/ε_t)
        Z_t ← 2[ε_t(1 − ε_t)]^{1/2}    (normalization factor)
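The extracted pseudocode stops at the normalization factor Z_t. The following is a minimal Python sketch of the loop described above, assuming the standard remaining AdaBoost steps that the slide presumably continues with: the distribution update D_{t+1}(i) = D_t(i) exp(−α_t y_i h_t(x_i)) / Z_t and the final classifier sgn(Σ_t α_t h_t). The decision-stump base learner and the helper names (stump_predict, fit_stump, adaboost, predict) are illustrative choices, not part of the lecture.

import numpy as np

def stump_predict(X, feature, threshold, polarity):
    """Decision stump: h(x) = polarity if x[feature] > threshold, else -polarity."""
    return np.where(X[:, feature] > threshold, polarity, -polarity)

def fit_stump(X, y, D):
    """Return the stump with smallest weighted error under the distribution D (the weak learner)."""
    best, best_err = None, np.inf
    for feature in range(X.shape[1]):
        for threshold in np.unique(X[:, feature]):
            for polarity in (+1, -1):
                pred = stump_predict(X, feature, threshold, polarity)
                err = np.sum(D[pred != y])      # eps_t = Pr_{i~D_t}[h_t(x_i) != y_i]
                if err < best_err:
                    best_err, best = err, (feature, threshold, polarity)
    return best, best_err

def adaboost(X, y, T=50):
    """Labels y in {-1, +1}; returns the list of (alpha_t, stump_t) pairs."""
    m = X.shape[0]
    D = np.full(m, 1.0 / m)                     # D_1(i) = 1/m
    ensemble = []
    for _ in range(T):
        stump, eps = fit_stump(X, y, D)
        eps = np.clip(eps, 1e-10, 1 - 1e-10)    # guard against division by zero
        alpha = 0.5 * np.log((1 - eps) / eps)   # alpha_t = (1/2) log((1 - eps_t)/eps_t)
        pred = stump_predict(X, *stump)
        # Update assumed from the standard algorithm (the slide is cut off here):
        # D_{t+1}(i) = D_t(i) exp(-alpha_t y_i h_t(x_i)) / Z_t,
        # where Z_t = 2[eps_t(1 - eps_t)]^(1/2) is exactly the sum below.
        D = D * np.exp(-alpha * y * pred)
        D /= D.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    """Final classifier: sign of the weighted vote sum_t alpha_t h_t(x)."""
    scores = sum(alpha * stump_predict(X, *stump) for alpha, stump in ensemble)
    return np.sign(scores)

As T grows, the weighted vote typically drives the training error down: the training error is bounded by the product of the normalizers, ∏_t Z_t = ∏_t 2[ε_t(1 − ε_t)]^{1/2}, which decreases exponentially whenever each ε_t stays bounded away from 1/2.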
