
February 2014

Random Forest Almighty

Random Forests are awesome. They are remarkably resistant to overfitting, they are easy to tune, they tell you which variables are important, they can be used for both classification and regression, they are implemented in many programming languages, and they are often faster to get working than their competitors (neural nets, boosting, support vector machines, ...).
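To make the two core ingredients concrete — bootstrap resampling and random feature selection, aggregated by majority vote — here is a toy from-scratch sketch in Python. It is illustrative only: decision stumps stand in for full trees, and the data and names are invented for the example (a real implementation such as scikit-learn's `RandomForestClassifier` grows deep trees and does much more).

```python
# A minimal sketch of the Random Forest idea: bagging + random features.
# Decision stumps (depth-1 trees) stand in for full decision trees.
import random
from collections import Counter

def fit_stump(X, y, feature):
    """Find the single threshold on one feature that best splits the labels."""
    best = None
    for t in sorted(set(row[feature] for row in X)):
        left = [label for row, label in zip(X, y) if row[feature] <= t]
        right = [label for row, label in zip(X, y) if row[feature] > t]
        if not right:                       # largest value: no split possible
            continue
        l_lab = Counter(left).most_common(1)[0][0]
        r_lab = Counter(right).most_common(1)[0][0]
        errs = sum(v != l_lab for v in left) + sum(v != r_lab for v in right)
        if best is None or errs < best[0]:
            best = (errs, t, l_lab, r_lab)
    if best is None:                        # degenerate sample: constant stump
        lab = Counter(y).most_common(1)[0][0]
        return feature, X[0][feature], lab, lab
    _, t, l_lab, r_lab = best
    return feature, t, l_lab, r_lab

def fit_forest(X, y, n_trees=25, seed=0):
    """Train stumps on bootstrap samples, each using one random feature."""
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]  # bootstrap sample
        feature = rng.randrange(d)                  # random feature for this tree
        forest.append(fit_stump([X[i] for i in idx], [y[i] for i in idx], feature))
    return forest

def predict(forest, row):
    votes = [l if row[f] <= t else r for f, t, l, r in forest]
    return Counter(votes).most_common(1)[0][0]      # majority vote

# Toy data: both features grow with i, the label flips once.
X = [[i, 2 * i] for i in range(10)]
y = [int(i > 5) for i in range(10)]
forest = fit_forest(X, y)
correct = sum(predict(forest, row) == label for row, label in zip(X, y))
print(correct, "of", len(y), "correct")
```

Each individual stump is a weak, noisy learner; the ensemble's majority vote is what makes the predictions stable — which is exactly why the bootstrap and the randomness deserve the praise below.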

Let us take a moment to appreciate them:




The Random Forest™ is my shepherd; I shall not want.
He makes me watch the mean squared error decrease rapidly.
He leads me beside classification problems.
He restores my soul.
He leads me in paths of the power of ensembles
for his name's sake.

Even though I walk through the valley of the curse of dimensionality,
I will fear no overfitting,
for you are with me;
your bootstrap and your randomness,
they comfort me.

You prepare a prediction before me
in the presence of complex interactions;
you anoint me data scientist;
my wallet overflows.

Surely goodness of fit and money shall follow me
all the days of my life, an…