Overview
Ensemble Model
Director of TEAMLAB Sungchul Choi
Ensemble Model
- Predicts Y not with a single model but through a vote over several models
- For regression problems, it predicts the average of the models' outputs
- meta-classifier
- Has developed further into stacking (meta-ensemble) and related methods
- Training takes long, but performance is very good
- The dominant technique on Kaggle (for structured datasets)
Keywords
- Vanilla ensemble
- Boosting
- Bagging
- Adaptive boosting (AdaBoost)
- XGBoost
- LightGBM
Voting classifier
Voting classifier
- The most basic ensemble classifier
- Makes the final decision through a vote among several models
- Also called majority voting or the vanilla ensemble model
Test instance example: Classifier 1 → True, Classifier 2 → True, Classifier 3 → True, Classifier 4 → True ⇒ ensemble prediction: True
sklearn.ensemble.VotingClassifier
- Hard voting: sum of the predicted class labels (votes)
- Soft voting: sum of the predicted class probabilities
http://slideplayer.com/slide/9261331/
Template

from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import VotingClassifier

clf1 = LogisticRegression(random_state=1)
clf2 = DecisionTreeClassifier(random_state=1)
clf3 = GaussianNB()
eclf = VotingClassifier(
    estimators=[('lr', clf1), ('rf', clf2), ('gnb', clf3)],
    voting='hard')
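A minimal usage sketch of the template above; the iris dataset, the train/test split, and the switch to soft voting are illustrative assumptions, not part of the slides:

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import VotingClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

clf1 = LogisticRegression(random_state=1)
clf2 = DecisionTreeClassifier(random_state=1)
clf3 = GaussianNB()

# 'hard' counts the class votes; 'soft' sums the predicted class probabilities
eclf = VotingClassifier(
    estimators=[('lr', clf1), ('dt', clf2), ('gnb', clf3)],
    voting='soft')
eclf.fit(X_train, y_train)
print(eclf.score(X_test, y_test))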
Bagging
Sampling?
- Classifiers simply built from the same dataset?
- Several trees built from the same dataset?
- A dataset is only a sample of the underlying population!
- Build diverse classifiers from diverse sampled datasets
- Sampling → a robust classifier that holds up on diverse data
Bootstrapping
- Random sampling with replacement from the training data → drawing n subsets of the training data
https://goo.gl/u7j95z
Bootstrap?
- The strap on a boot that lets you pull it on by yourself
- Generally refers to getting something started without any external input
- e.g., loading a computer into memory when it first starts up
- Here: drawing new datasets from the data itself, without adding anything from outside
.632 bootstrap
- Generally, when drawing n data points (with replacement) from the full dataset S of size d,
- the name comes from the fact that each data point appears in the sample with probability of about 0.632:

\prod_{i=1}^{d}\left(1 - \frac{1}{d}\right) = \left(1 - \frac{1}{d}\right)^{d} \approx e^{-1}

1 - \left(1 - \frac{1}{d}\right)^{d} \approx 1 - e^{-1} \approx 0.632

https://stats.stackexchange.com/questions/96739/what-is-the-632-rule-in-bootstrapping
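A quick numerical check of the 0.632 rule, as a sketch; the dataset size d and the number of trials below are arbitrary choices:

import numpy as np

rng = np.random.default_rng(0)
d = 1000      # size of the dataset
trials = 200  # number of bootstrap samples to draw

fractions = []
for _ in range(trials):
    sample = rng.integers(0, d, size=d)           # draw d indices with replacement
    fractions.append(np.unique(sample).size / d)  # fraction of distinct points included

print(np.mean(fractions))       # empirical value, close to 0.632
print(1 - (1 - 1 / d) ** d)     # analytic value, close to 1 - 1/e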
Bagging
- Bootstrap Aggregation
- Train n models on bootstrap subsamples → combine them into an ensemble
- Unlike the plain ensemble above, it feeds varied data into one kind of model
- Well suited to high-variance (heavily overfitting) models
- Comes in both a regressor (mean or median of predictions) and a classifier
http://manish-m.com/wpcontent/uploads/2012/11/baggingcropped.png
Out-of-bag error
- OOB error estimation
- When running bagging, measure performance on the data left out of each bag
- Similar in spirit to holding out a validation set
- A good indicator for measuring bagging performance
sklearn.ensemble.BaggingClassifier
base_estimator : object or None, optional (default=None)
n_estimators : int, optional (default=10)
max_samples : int or float, optional (default=1.0)
max_features : int or float, optional (default=1.0)
bootstrap : boolean, optional (default=True)
bootstrap_features : boolean, optional (default=False)
oob_score : bool
warm_start : bool, optional (default=False)
n_jobs : int, optional (default=1)

sklearn.ensemble.BaggingRegressor
(same parameter list as BaggingClassifier above)
Template

from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import GridSearchCV

clf = DecisionTreeClassifier(random_state=1)
eclf = BaggingClassifier(clf, oob_score=True)

params = {
    "n_estimators": [10, 20, 30, 40, 50, 55],
    "max_samples": [0.5, 0.6, 0.7, 0.8, 0.9, 1],
}
grid = GridSearchCV(estimator=eclf, param_grid=params, cv=5)
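A hedged usage sketch of the grid search above; the synthetic dataset is an assumption added so the example runs end to end:

from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=500, n_features=10, random_state=1)

clf = DecisionTreeClassifier(random_state=1)
eclf = BaggingClassifier(clf, oob_score=True)
params = {"n_estimators": [10, 30, 50], "max_samples": [0.5, 0.8, 1.0]}

grid = GridSearchCV(estimator=eclf, param_grid=params, cv=5)
grid.fit(X, y)

print(grid.best_params_)                # best n_estimators / max_samples found
print(grid.best_estimator_.oob_score_)  # OOB accuracy of the refit best model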
Random Forest
Random Forest
- Bagging + randomized decision trees
- An ensemble of high-variance decision trees
- Many trees → a forest
- A representative model: simple, yet delivers high performance
- Supports both a regressor and a classifier
Random Forest https://www.researchgate.net/figure/classification-process-based-on-the-random-forest-algorithm-2_fig1_324517994
Random Forest
- Trains on m subsets of the data with low correlation between them
- Trees are built as binary trees
- At each split, n candidate features are chosen at random for consideration
- With p total features, n = p reduces to a bagged tree
- Features can be reused; n is typically √p or p/3
- High-variance trees → terminal nodes with about 1 to 5 samples
sklearn.ensemble.RandomForestClassifier
sklearn.ensemble.RandomForestRegressor
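A short sketch of how the split-time feature randomness appears in scikit-learn; the synthetic data and parameter values are illustrative assumptions:

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=1000, n_features=25, random_state=1)

# max_features plays the role of n above: how many features each split considers.
# 'sqrt' corresponds to n ≈ √p; setting it to the full feature count gives bagged trees.
rf = RandomForestClassifier(n_estimators=100, max_features='sqrt',
                            oob_score=True, random_state=1)
rf.fit(X, y)

print(rf.oob_score_)            # out-of-bag accuracy
print(rf.feature_importances_)  # per-feature importance scores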
Adaptive Boosting
Boosting
- A simple yet very high-performing ensemble technique
- Builds a model each training round → the model updates the weight of each instance
- The next model focuses on the instances with high weights
- Those models are combined into an ensemble model (meta-classifier)
- The idea: classify the misclassified data better
Adaboost
- Adaptive Boosting
- Recomputes each instance's weight every round
- Misclassified instances get their weight increased → resampling based on the weights
- The larger the sum of the misclassified instance weights, the smaller that model's weight
- New classifiers are created by changing the inputs fed to the base classifier
- Not well suited to deep trees or neural networks
Adaboost
- Initialize each sample's weight to 1/N (M is the number of models, N the number of data points)
- For each round m = 1, ..., M:
  - Build a classifier G_m based on the current weights (resampling)
  - Compute the classifier's error:
    I(y_i, G_m(x_i)) = \begin{cases} 0 & \text{if } y_i = G_m(x_i) \\ 1 & \text{if } y_i \neq G_m(x_i) \end{cases}
    err_m = \frac{\sum_{i=1}^{N} w_i \, I(y_i \neq G_m(x_i))}{\sum_{i=1}^{N} w_i}
  - Compute the classifier's weight: \alpha_m = \log \frac{1 - err_m}{err_m}
  - Update the instance weights: w_i \leftarrow w_i \cdot \exp\big(\alpha_m \, I(y_i \neq G_m(x_i))\big)
Adaboost (variant formulation)
- Initialize each sample's weight to 1/D (k is the number of models, D the number of data points)
- Sample the data according to the weights
- If a model's error is 0.5 or higher, that model is discarded
- The weight-update rule differs from the formulation above
- The alpha value serves as the model's weight
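One AdaBoost round written out in NumPy, following the steps above; the labels and the pretend classifier output are made up just to show the weight arithmetic:

import numpy as np

rng = np.random.default_rng(0)
N = 10
y = rng.choice([-1, 1], size=N)     # true labels
w = np.full(N, 1.0 / N)             # initialize every weight to 1/N

pred = y.copy()                     # pretend G_m misclassifies the first 3 instances
pred[:3] *= -1
miss = (pred != y).astype(float)    # indicator I(y_i != G_m(x_i))

err = np.sum(w * miss) / np.sum(w)  # weighted error of the classifier
alpha = np.log((1 - err) / err)     # classifier weight (larger when err is small)

w = w * np.exp(alpha * miss)        # increase weights of misclassified instances
w = w / w.sum()                     # renormalize before resampling the next round

print(err, alpha)
print(w)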
Adaboost https://infinitescript.com/wordpress/wpcontent/uploads/2016/09/adaboost.jpg
Adaboost with stumps
- A technique where AdaBoost uses depth-1 trees (decision stumps) as its classifiers
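A hedged scikit-learn sketch of AdaBoost with stumps; the dataset and hyperparameter values are assumptions for illustration:

from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import AdaBoostClassifier

X, y = make_classification(n_samples=1000, random_state=1)

stump = DecisionTreeClassifier(max_depth=1)   # a depth-1 tree, i.e. a decision stump
ada = AdaBoostClassifier(stump, n_estimators=100, learning_rate=0.5,
                         random_state=1)
ada.fit(X, y)
print(ada.score(X, y))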
Bagging vs. Boosting
Bagging vs. Boosting
- They differ in whether training can be parallelized
- Bagging uses high-variance base estimators
- Boosting uses high-bias base estimators
- Boosting is a very expensive algorithm
- Bagging fundamentally works on subsets of the data → it is hard for it to outperform boosting
sklearn.ensemble.AdaBoostClassifier
sklearn.ensemble.AdaBoostRegressor
Adaboost
- The most popular boosting algorithm, with high performance
- With base estimators that all produce similar results, it behaves much like bagging
- Dramatically improves weak classifiers → the moment a model gets the persistently misclassified instances right, its importance goes up
- For a given problem, the data and the base learner matter
Gradient boosting
Gradient Boosting
- A boosting technique, like AdaBoost
- Used for both regression and classification
- Sequential + additive model
- Strengthens the weak learner using the previous model's residuals
- i.e., a model that predicts residuals
- For classification, it uses K-L divergence (as the loss)
http://blog.kaggle.com/2017/01/23/a-kaggle-master-explains-gradient-boosting/
- Initialize the model's prediction to 0; r starts as the original values of y
- Fit a tree model (with subsampling or other constraints) to r and X
- new model = previous model + learning_rate × the tree fitted above
- new residual r = current residual − learning_rate × the fitted tree's predictions
from sklearn.tree import DecisionTreeRegressor

# fit the first tree to y, then fit each subsequent tree to the previous residual
tree_reg1 = DecisionTreeRegressor(max_depth=2)
tree_reg1.fit(X, y)
r1 = y - tree_reg1.predict(X)

tree_reg2 = DecisionTreeRegressor(max_depth=2)
tree_reg2.fit(X, r1)
r2 = r1 - tree_reg2.predict(X)

tree_reg3 = DecisionTreeRegressor(max_depth=2)
tree_reg3.fit(X, r2)

# the ensemble prediction is the sum of all trees' predictions
y_pred = sum(tree.predict(X_new) for tree in (tree_reg1, tree_reg2, tree_reg3))
Gradient boosting as an additive model:

\hat{y}_i^{(0)} = 0
\hat{y}_i^{(1)} = f_1(x_i) = \hat{y}_i^{(0)} + f_1(x_i)
\hat{y}_i^{(2)} = f_1(x_i) + f_2(x_i) = \hat{y}_i^{(1)} + f_2(x_i)
\dots
\hat{y}_i^{(t)} = \sum_{k=1}^{t} f_k(x_i) = \hat{y}_i^{(t-1)} + f_t(x_i)

Each new learner f_t is obtained by fitting the base model M to the current residual:

M\left(x, \; y - \sum_{k=1}^{t} f_{k-1}(x)\right) = f_t(x)

For the squared-error loss, the residual is exactly the negative gradient of the objective with respect to the current prediction:

L(y, F(x)) = \frac{(y - F(x))^2}{2}, \qquad J = \sum L(y, F(x))
\frac{\partial J}{\partial F(x)} = \frac{\partial \sum L(y, F(x))}{\partial F(x)} = F(x) - y

so each boosting step is a step of functional gradient descent (gradient descent with functions rather than parameters):

F_k(x) = F_{k-1}(x) + \rho f_k = F_{k-1}(x) + \rho \,(y - F_{k-1}(x)) = F_{k-1}(x) - \rho \,\frac{\partial J}{\partial F_{k-1}(x)}

compared with ordinary gradient descent on parameters: \theta_i := \theta_i - \rho \,\frac{\partial J}{\partial \theta_i}
Tuning parameters
- number of trees (estimators)
- depth of each tree
- subsampling
- shrinkage parameter λ (learning rate)
- fitting toward low variance (weak, constrained base learners)
sklearn.ensemble.GradientBoostingRegressor
loss : {'ls', 'lad', 'huber', 'quantile'}, optional (default='ls')
learning_rate : float, optional (default=0.1)
n_estimators : int (default=100)
max_depth : integer, optional (default=3)
subsample : float, optional (default=1.0)

sklearn.ensemble.GradientBoostingClassifier
loss : {'deviance', 'exponential'}, optional (default='deviance')
learning_rate : float, optional (default=0.1)
n_estimators : int (default=100)
max_depth : integer, optional (default=3)
subsample : float, optional (default=1.0)
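A sketch connecting these parameters to the manual residual-fitting code earlier: GradientBoostingRegressor with max_depth=2, learning_rate=1.0 and three estimators performs the same kind of staged fitting (it additionally starts from a constant initial prediction, so the numbers will not match exactly); the toy data is an assumption:

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

# three depth-2 trees, each fit to the current residual and added with weight learning_rate;
# subsample < 1.0 would switch to stochastic gradient boosting
gbrt = GradientBoostingRegressor(n_estimators=3, max_depth=2,
                                 learning_rate=1.0, subsample=1.0)
gbrt.fit(X, y)

X_new = np.array([[0.5]])
print(gbrt.predict(X_new))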
Gradient Boosting
- A greedy algorithm
- Robust to the scale of the features
- Supports a variety of loss functions (e.g., Huber)
- Prone to overfitting
- Slow to train and computationally expensive
XGBoost & LightGBM
Gradient boosting packages
- There are packages that address GBM's heavy computation and add parallel processing
- The representative ones are XGBoost and LightGBM
- XGBoost: eXtreme Gradient Boosting, https://github.com/dmlc/xgboost
- LightGBM: Light Gradient Boosting Machine, https://github.com/microsoft/lightgbm
- Their implementations differ in places, but the packages share similar goals
XGBoost vs. LightGBM
https://github.com/dmlc/xgboost/issues/1950
https://www.analyticsvidhya.com/blog/2017/06/which-algorithm-takes-the-crown-light-gbm-vs-xgboost/
https://towardsdatascience.com/catboost-vs-light-gbm-vs-xgboost-5f93620723db
Installation
1. Install Git (https://git-scm.com/)
2. Install CMake (https://cmake.org/, Windows)
3. Visual Studio 2015 (Windows)
4. Brew (macOS)
https://xgboost.readthedocs.io/en/latest/build.html
https://lightgbm.readthedocs.io/en/latest/installation-guide.html
https://youtu.be/addme/edhdwwmpqr-_ne2eykfb91c8e4opdw
Parameters https://github.com/dmlc/xgboost/blob/master/doc/parameter.md http://lightgbm.readthedocs.io/en/latest/parameters.html http://lightgbm.readthedocs.io/en/latest/parameters-tuning.html https://arxiv.org/abs/1505.01866
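A minimal usage sketch via the scikit-learn style wrappers that both packages ship; the dataset and the parameter values here are arbitrary illustrations, not recommended settings:

from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

xgb = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
lgbm = LGBMClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)

for model in (xgb, lgbm):
    model.fit(X_train, y_train)   # both expose the sklearn-style fit/score API
    print(type(model).__name__, model.score(X_test, y_test))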
Stacking
Stacking
- Stacked Generalization, https://www.sciencedirect.com/science/article/pii/s0893608005800231
- Meta-ensemble → combines the outputs of several models to make the final prediction
- Becoming popular recently along with gains in computing power
- Keywords: stacking, kaggle, stacknet
- Because of its complexity, there is still no complete Python implementation
Stacking
- Split the dataset; create several models and train them only on subset 1
http://shop.oreilly.com/product/0636920052289.do
Meta model
- Each model's predictions become the features on which the meta model is trained
http://shop.oreilly.com/product/0636920052289.do
Stacking

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=self.test_ratio)

for estimator in self.base_estimators:
    estimator.fit(X_train, y_train)

meta_train_set = np.array([estimator.predict(X_test)
                           for estimator in self.base_estimators]).T
self.meta_estimator.fit(meta_train_set, y_test)
Stacking

def predict(self, X, y=None):
    meta_X = []
    for estimator in self.base_estimators:
        meta_X.append(estimator.predict(X))
    meta_X = np.array(meta_X).T
    return self.meta_estimator.predict(meta_X)
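The two fragments above can be collected into one small class; this is a hedged sketch (the class name and constructor are assumptions that simply wrap the slide code, not a library API):

import numpy as np
from sklearn.model_selection import train_test_split


class SimpleStacking:
    # Holdout-based stacking: base models train on one split,
    # the meta model trains on their predictions for the other split.
    def __init__(self, base_estimators, meta_estimator, test_ratio=0.3):
        self.base_estimators = base_estimators
        self.meta_estimator = meta_estimator
        self.test_ratio = test_ratio

    def fit(self, X, y):
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=self.test_ratio)
        for estimator in self.base_estimators:
            estimator.fit(X_train, y_train)
        meta_train_set = np.array([estimator.predict(X_test)
                                   for estimator in self.base_estimators]).T
        self.meta_estimator.fit(meta_train_set, y_test)
        return self

    def predict(self, X):
        meta_X = np.array([estimator.predict(X)
                           for estimator in self.base_estimators]).T
        return self.meta_estimator.predict(meta_X)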
StackNet
- A Java-based implementation supporting stacking
- Can use a variety of implementations such as XGBoost, LightGBM, and scikit-learn
- Installation is complicated, but it shows strong performance on Kaggle and elsewhere
- Possibly better performance and easier training than a neural net?
https://github.com/kaz-anova/stacknet
http://blog.kaggle.com/2017/06/15/stacking-made-easy-an-introduction-tostacknet-by-competitions-grandmaster-marios-michailidis-kazanova/