Blog de Kev C


Published on 10/27/2019 14:25 by Kevin Coyle

One of my favorite things about teaching is that I get to work with a group of people who are all at different points in their learning journey. Because the field is still so young, most of the courses I teach are at a fairly basic level from either a software engineering or a math/data science perspective.

Because of a bootcamp, I have a career, a side-hustle in teaching, and a group of top-notch friends. But it has also left me thinking about pedagogy.

There are drawbacks to the system, though, and one of them is that given the short time we have together, we go wide rather than deep.

| Pros | Cons |
| --- | --- |
| Students learn concepts that would have taken years to uncover, in a short span | The onus is then on students to go deep, and many won't |
| By learning more difficult concepts, students pick up the less mentally challenging ones by osmosis | Presenting material across several learning styles is non-trivial |

I’m going to work through a series of posts describing a number of machine learning models at a deeper level.

I’ll start with Linear Regression, then move on to Decision Trees, KNN, Logistic Regression, Random Forests, and finally Multinomial Naive Bayes.
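To give a taste of the kind of depth the series is aiming for, here's a minimal sketch of the first topic, Linear Regression, fit with ordinary least squares via NumPy. The toy data and the variable names here are my own illustration, not from any course material.

```python
import numpy as np

# Toy data: y = 2x + 1 plus a little Gaussian noise
rng = np.random.default_rng(42)
X = np.linspace(0, 10, 50)
y = 2 * X + 1 + rng.normal(0, 0.1, size=X.shape)

# Stack a column of ones so the intercept is learned alongside the slope,
# then solve the least-squares problem directly
A = np.column_stack([X, np.ones_like(X)])
slope, intercept = np.linalg.lstsq(A, y, rcond=None)[0]

print(round(slope, 1), round(intercept, 1))  # recovers roughly 2.0 and 1.0
```

The point the series will expand on: this one-liner hides a lot, from the normal equations to the assumptions (linearity, independent errors) that make least squares a sensible choice in the first place.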

This post will be edited as the series grows, so that it serves as a sort of table of contents.

Check back soon!

Written by Kevin Coyle
