
Decision Trees and Random Forests

Practical examples of Decision Trees and Random Forests.

Decision Trees are a type of model used for both Classification and Regression.

Trees answer sequential questions that send us down a certain route of the tree given the answer. The model behaves like a series of "if this, then that" conditions, ultimately yielding a specific result.
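For illustration, here is a minimal scikit-learn sketch (not taken from the repo's notebooks, and using the built-in iris dataset rather than the data analyzed here) of training a single decision tree:

```python
# Minimal sketch: fitting a decision tree classifier on scikit-learn's
# built-in iris dataset (illustrative only, not the repo's data).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import classification_report

# Load a small toy dataset and split it into train and test sets
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# Each internal node of the fitted tree is one "if this, then that" question
tree = DecisionTreeClassifier(random_state=42)
tree.fit(X_train, y_train)

# Evaluate on held-out data
print(classification_report(y_test, tree.predict(X_test)))
```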

A Random Forest is simply a collection of decision trees whose results are aggregated into one final result.

Their key strength is the ability to limit overfitting without substantially increasing error due to bias. By aggregating many decision trees, a Random Forest is a strong modeling technique that is much more robust than a single decision tree and therefore yields more reliable results.
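A minimal sketch of that comparison (again on the built-in iris dataset, not the repo's data) might look like this:

```python
# Minimal sketch: comparing a single decision tree to a random forest
# on the same train/test split (illustrative only).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# A single tree can overfit the training data
tree = DecisionTreeClassifier(random_state=42).fit(X_train, y_train)

# The forest aggregates the votes of 100 trees, each trained on a
# bootstrap sample with a random subset of features at each split
forest = RandomForestClassifier(n_estimators=100, random_state=42).fit(X_train, y_train)

print("Single tree accuracy:", accuracy_score(y_test, tree.predict(X_test)))
print("Random forest accuracy:", accuracy_score(y_test, forest.predict(X_test)))
```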

  • Kyphosis Data Analysis: analyzing data on the spinal condition kyphosis with scikit-learn
  • Credit Data Analysis: uses LendingClub.com lending data from 2007-2010 to classify and predict whether or not a borrower paid back their loan in full (a rough sketch of this kind of workflow follows the list)
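
The sketch below outlines the general shape of such a credit-classification workflow. The file name ("loan_data.csv") and column names ("purpose", "not.fully.paid") are assumptions for illustration only; the actual notebook may differ.

```python
# Hedged sketch of a LendingClub-style classification workflow.
# File and column names below are hypothetical placeholders.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report

loans = pd.read_csv("loan_data.csv")  # hypothetical path to the lending data

# One-hot encode the categorical "purpose" column (assumed name)
loans = pd.get_dummies(loans, columns=["purpose"], drop_first=True)

# "not.fully.paid" (assumed name) is the target: did the borrower repay in full?
X = loans.drop("not.fully.paid", axis=1)
y = loans["not.fully.paid"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=101)

# Fit a random forest and report precision/recall on the held-out set
forest = RandomForestClassifier(n_estimators=300, random_state=101)
forest.fit(X_train, y_train)
print(classification_report(y_test, forest.predict(X_test)))
```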

The Jupyter notebooks can be opened in GitHub by clicking on the notebook file. They might need a refresh to load, depending on your browser.