[R] SOS Boosting
    Kuhn, Max 
    Max.Kuhn at pfizer.com
       
    Tue Jul 12 18:03:31 CEST 2005
    
    
  
>Hi,
>
>I am trying to implement the AdaBoost.M1 algorithm as described in
>"The Elements of Statistical Learning", p. 301.
>I don't use Dettling's library "boost" because:
>  - I don't understand the difference between LogitBoost and L2Boost
>  - I'd like to use larger trees than stumps.
>
It also doesn't have a predict function, which is why I don't use it much.
>By setting the weights option to (1/n, 1/n, ..., 1/n) in the rpart or
>tree function, the tree obtained is trivial (just the root, no splits),
>whereas without weights, or with each weight > 1, the trees are fine.
>
>So here is my question: how are weights taken into account in the search
>for the optimal tree?
>Has anyone implemented a boosting algorithm?
Check out the gbm package. It is fairly close to MART in the reference 
you mentioned. To get AdaBoost with stumps, look at the arguments 
  distribution = "adaboost"
  interaction.depth =  1
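For example, a minimal sketch of that setup (the simulated data and
variable names here are made up for illustration; see ?gbm for the real
argument details):

```r
## Hypothetical example: AdaBoost-style boosting with stumps via gbm.
library(gbm)

set.seed(1)
n  <- 200
x1 <- rnorm(n)
x2 <- rnorm(n)
y  <- as.numeric(x1 + x2 + rnorm(n) > 0)   # adaboost wants y in {0, 1}
dat <- data.frame(y = y, x1 = x1, x2 = x2)

fit <- gbm(y ~ x1 + x2,
           data              = dat,
           distribution      = "adaboost",
           n.trees           = 100,
           interaction.depth = 1,      # depth-1 trees, i.e. stumps
           shrinkage         = 0.1)

## Unlike the boost package, gbm does have a predict method:
pred <- predict(fit, newdata = dat, n.trees = 100)
```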
For more information, see ?gbm (if you have it installed) or the 
file gbm.pdf in the doc directory of the package.
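
As for the weights question: my recollection (worth checking against 
?rpart.control) is that rpart compares *weighted* node sizes against 
minsplit and minbucket, so weights of 1/n make every node look smaller 
than the default minsplit = 20 and you get a root-only tree. Since only 
the relative weights matter to the fit, rescaling them to sum to n 
should restore the splits. A sketch, using the kyphosis data shipped 
with rpart:

```r
## Sketch of the weighted-counts issue described above (my reading of
## rpart's behavior, not documented gospel).
library(rpart)

data(kyphosis)
n <- nrow(kyphosis)
w <- rep(1/n, n)          # weights like these give a root-only tree

fit <- rpart(Kyphosis ~ Age + Number + Start,
             data    = kyphosis,
             weights = w * n)   # rescaled: sums to n, same relative weights

print(fit)                # should now be a non-trivial tree
```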
Max
>
>Regards,
>
>Olivier Celhay   -  Student  -  Paris, France
    
    