New Cohort

Tomorrow begins a new cohort in the part-time data science class I teach at General Assembly. This bootcamp thing is fantastic. Note that these views are mine, not necessarily those of General Assembly.

Because of a bootcamp, I have a career, a side-hustle in teaching, and a group of top-notch friends. But it has also gotten me thinking about pedagogy.

There are drawbacks to the system, though. One of them is that, given the short amount of time we have together, we go wide rather than deep.

| Pros | Cons |
| --- | --- |
| Students learn concepts in a short span that would have taken years to uncover on their own | The onus is then on students to go deep, and many won't |
| By learning more difficult concepts, students pick up the less mentally challenging ones by osmosis | Presenting material across several learning styles is non-trivial |

I’m going to work through a series of posts describing a number of machine learning models at a deeper level.

I’ll start with Linear Regression, then move on to Decision Trees, KNN, Logistic Regression, Random Forests, and finally Multinomial Naive Bayes.

This post will be edited over time so that it serves as a sort of table of contents for the series.

Check back soon!

  • Linear Regression
  • Decision Trees
  • KNN
  • Logistic Regression
  • Random Forests
  • Multinomial Naive Bayes
Written on October 27, 2019