City College, Fall 2018

Intro to Data Science

Week 9: Trees and the Variance-Bias Tradeoff

November 5, 2018

Today's Agenda
  1. Midterm Review
  2. Classification Review
  3. Linear vs. Nonlinear Classification
  4. Decision Trees
Semester Recap
  1. Loading and Transforming Data
  2. Exploratory Data Analysis
  3. Linear Models for Regression and Classification
Midterm Results
                 Multiple Choice   Short Answer   Exam Total
Points Possible        40               36            76
Mean                   30               31            61
Median                 30               34            63
Std Dev               5.4              5.3          10.0

Answer key available on the course page.
Which of these is a random sample?
Which of these has heteroskedastic errors?
Data Science Models

This week's Economist has a really cool example of a classification model.

Linear vs Nonlinear Classification Models

Linear models are most effective when the outcome can be modeled as a linear combination of the explanatory variables, each weighted by a coefficient, and the classes are linearly separable.

Nonlinear models are better suited for outcomes which rely on interactions between different explanatory variables.
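A small illustration of this point (not from the slides): the XOR pattern, the classic example of an interaction between two features, is not linearly separable, so a linear classifier cannot fit it, while a decision tree can.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

# The XOR pattern: the label depends on the interaction of the two
# features, not on either one alone.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])

linear_acc = LogisticRegression().fit(X, y).score(X, y)
tree_acc = DecisionTreeClassifier(random_state=0).fit(X, y).score(X, y)

print(linear_acc)  # below 1.0: no single line separates XOR
print(tree_acc)    # 1.0: two splits carve out the four corners
```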

Linearly Separable Data
Linearly Inseparable Data
Titanic Disaster

2224 passengers, 710 survivors

Titanic Prediction Data Set
  1. Training set: 891 observations, 38 percent survival rate
  2. Features include: age, sex, socio-economic class, embarkation point, fare paid
  3. Rich potential for additional feature engineering
  4. Prediction goal: who survives?
Survival Rates Among Subgroups
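A sketch of how subgroup survival rates can be computed with pandas. The six-row DataFrame here is made up for illustration (the real training set has 891 rows); the column names follow the familiar Kaggle Titanic layout.

```python
import pandas as pd

# Toy stand-in for the Titanic training data.
df = pd.DataFrame({
    "Sex":      ["female", "female", "male", "male", "male", "female"],
    "Pclass":   [1, 3, 3, 2, 1, 2],
    "Survived": [1, 0, 0, 0, 1, 1],
})

# The mean of a 0/1 column within each group is that group's survival rate.
rates = df.groupby("Sex")["Survived"].mean()
print(rates)
```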

Branching a tree relies on a greedy heuristic: at each node, choose the split that most reduces an impurity measure, typically entropy (information gain) or the Gini index.
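A minimal sketch of the entropy criterion: a candidate split is scored by its information gain, the parent node's entropy minus the weighted entropy of the child nodes.

```python
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    probs = [labels.count(c) / n for c in set(labels)]
    return -sum(p * log2(p) for p in probs)

# A 50/50 node is maximally impure (1 bit); a pure node has zero entropy.
print(entropy([0, 0, 1, 1]))   # 1.0

# Information gain of a candidate split of a 6-sample node.
parent = [0, 0, 1, 1, 1, 1]
left, right = [0, 0, 1], [1, 1, 1]
gain = entropy(parent) \
    - (len(left) / len(parent)) * entropy(left) \
    - (len(right) / len(parent)) * entropy(right)
print(gain)  # about 0.459: the split removes half the impurity
```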

Computers Make Branching Easy
Bias-Variance Tradeoff

An ideal model would both accurately capture the regularities in its training data and generalize well to unseen data. Unfortunately, it is typically impossible to do both perfectly at once: reducing bias (underfitting) tends to increase variance (overfitting), and vice versa.
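The tradeoff can be seen on a noisy toy problem (an assumed setup, not the course's code): an unconstrained tree memorizes its training data, while a depth-limited tree gives up some training accuracy in exchange for better behavior on unseen data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)  # true nonlinear rule
flip = rng.random(400) < 0.15            # 15% label noise
y[flip] = 1 - y[flip]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

deep = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
shallow = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

print(deep.score(X_tr, y_tr))     # 1.0: the deep tree memorizes the noise
print(deep.score(X_te, y_te))     # noticeably lower on unseen data
print(shallow.score(X_te, y_te))  # limiting depth trades variance for bias
```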

How do we prevent unnecessary splits?
Hyperparameters: values set before the learning process begins; for trees, they are the main tool for limiting overfitting.
Common decision tree hyperparameters:
  1. max_depth
  2. min_samples_split
  3. min_samples_leaf
  4. max_features
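The four hyperparameters above are scikit-learn's argument names for `DecisionTreeClassifier`. A quick sketch of setting them (the values here are illustrative, not recommendations):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

tree = DecisionTreeClassifier(
    max_depth=3,           # cap the number of split levels
    min_samples_split=10,  # a node needs >= 10 samples to be split
    min_samples_leaf=5,    # every leaf must keep >= 5 samples
    max_features=2,        # consider only 2 features per split
    random_state=0,
).fit(X, y)

print(tree.get_depth())  # <= 3 by construction
```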
How do we choose hyperparameters?
Cross Validation

See here for a slightly deeper discussion.
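One common recipe, sketched here with scikit-learn's `GridSearchCV` (5-fold cross-validation on a built-in dataset; the grid values are arbitrary): try each hyperparameter combination, score it on held-out folds, and keep the best.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Every combination in the grid is scored by 5-fold cross-validation.
grid = {"max_depth": [2, 3, 4, 5], "min_samples_leaf": [1, 5, 10]}
search = GridSearchCV(DecisionTreeClassifier(random_state=0), grid, cv=5)
search.fit(X, y)

print(search.best_params_)            # the winning combination
print(round(search.best_score_, 3))   # its mean held-out accuracy
```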
Let's make our own trees.

Assignment 6: Due Monday, November 12 by 6:30pm

DataCamp's Machine Learning with Tree-Based Models in Python

  • The course should appear as an assignment within your existing DataCamp account.
  • Each section will appear separately and will be worth one point toward the total grade for the homework.
  • The course claims to take 5 hours, though I found it shorter than some of the past courses. Nonetheless, use your time wisely.