Machine Learning: From Regression to Neural Networks

Learn regression analysis, supervised learning, KNN, random forest classifiers, and neural networks across five lessons.

Master essential machine learning techniques, from basic regression analysis to neural networks, as you develop practical skills in modeling and prediction. Gain hands-on experience through real-world examples such as predicting Titanic survivors using random forest classifiers.

Key Insights

  • Conduct regression analysis by preparing data and applying statistical methods to solve prediction problems, building a foundation for supervised learning.
  • Apply the k-nearest neighbors (KNN) algorithm through majority voting, visualization, classification training, and real-world data scenarios, measuring performance with appropriate metrics.
  • Utilize neural networks for image identification by understanding the underlying data, training and compiling models, preventing overfitting, and effectively evaluating results.

Note: These materials offer prospective students a preview of how our classes are structured. Students enrolled in this course will receive access to the full set of materials, including video lectures, project-based assignments, and instructor feedback.

Hello, my name is Colin. I'm an instructor at Noble Desktop. Let's talk about what we're going to cover in this course.

In lesson one, we'll perform a basic regression analysis. We'll learn how to get all our files set up, cover the statistics you'll need to get started, and solve some regression prediction problems.
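
To give a flavor of what a regression prediction problem looks like in Python, here is a minimal sketch using scikit-learn's LinearRegression on made-up numbers. The data and variable names are purely illustrative; they are not the course's actual files.

```python
# Minimal linear regression sketch (illustrative data, not the course files).
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic data: years of experience (feature) vs. salary in thousands (target).
X = np.array([[1], [2], [3], [4], [5], [6]])
y = np.array([40, 48, 55, 63, 71, 80])

model = LinearRegression()
model.fit(X, y)

print("slope:", model.coef_[0])        # learned coefficient
print("intercept:", model.intercept_)  # learned intercept
print("prediction for 7 years:", model.predict([[7]])[0])
```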

Then, in lesson two, we'll cover supervised learning essentials: data cleaning and feature selection, how to use machine learning to model systems, and how to evaluate those models. In lesson three, we'll focus on the k-nearest neighbors (KNN) algorithm: how majority voting works, how to visualize our classifications, how to train and predict with KNN, how to apply the algorithm to some real-world data, and how to evaluate classification systems using metrics.
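
As a concrete preview of that KNN workflow, here is a hedged sketch using scikit-learn's KNeighborsClassifier on the built-in iris dataset. The dataset, the choice of k, and the metrics shown are stand-ins for illustration, not necessarily what the lesson uses.

```python
# KNN sketch: train, predict by majority vote of the k nearest neighbors,
# and evaluate with standard metrics (illustrative, using the iris dataset).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score, classification_report

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Each test point gets the majority class among its 5 nearest training points.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
y_pred = knn.predict(X_test)

print("accuracy:", accuracy_score(y_test, y_pred))
print(classification_report(y_test, y_pred))
```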

In lesson four, we'll use random forest classifiers to predict the Titanic. Spoiler alert: it sinks. We'll be visualizing the data for further analysis and training a random forest classifier.

And we'll be using that random forest classifier to get some predictions: not whether the Titanic sinks, but who survives. Then we'll be submitting those results online.
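
For a sense of how that can look in code, here is a rough sketch of training a random forest on the standard Kaggle Titanic files and writing a submission CSV. The file names, feature list, and hyperparameters below are assumptions for illustration, not the lesson's exact code.

```python
# Random forest sketch for Titanic survival prediction (assumes the standard
# Kaggle train.csv / test.csv files; features and settings are illustrative).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

train = pd.read_csv("train.csv")
test = pd.read_csv("test.csv")

features = ["Pclass", "Sex", "SibSp", "Parch"]

# One-hot encode categorical columns so the model gets numeric input.
X_train = pd.get_dummies(train[features])
X_test = pd.get_dummies(test[features])
y_train = train["Survived"]

model = RandomForestClassifier(n_estimators=100, max_depth=5, random_state=1)
model.fit(X_train, y_train)
predictions = model.predict(X_test)

# Write a submission file in the format the competition site expects.
submission = pd.DataFrame({"PassengerId": test["PassengerId"], "Survived": predictions})
submission.to_csv("submission.csv", index=False)
```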

We'll close out with lesson five, where we'll identify images using neural networks. We'll introduce neural networks, discuss the data underlying the images we'll be looking at, and learn how to build and train a neural network, how to avoid overfitting, and how to evaluate the finished model. I hope you enjoy taking this course as much as I always enjoy teaching it, and I'll see you folks in the next video.
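
For readers who want a concrete preview of lesson five, here is a hedged sketch of a small image classifier. The course doesn't specify the exact stack here, so this example assumes Keras with the MNIST digits as a stand-in dataset and uses dropout as one common way to limit overfitting.

```python
# Small image-classification network in Keras (MNIST as a stand-in dataset;
# dropout is one common guard against overfitting).
from tensorflow import keras
from tensorflow.keras import layers

# Load and scale the 28x28 grayscale digit images to the 0-1 range.
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = keras.Sequential([
    layers.Flatten(input_shape=(28, 28)),    # unroll each image into a 784-value vector
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.2),                     # randomly drop units during training
    layers.Dense(10, activation="softmax"),  # one probability per digit class
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Hold out part of the training data to watch for overfitting during training.
model.fit(x_train, y_train, epochs=5, validation_split=0.1)

test_loss, test_acc = model.evaluate(x_test, y_test)
print("test accuracy:", test_acc)
```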

Colin Jaffe

Colin Jaffe is a programmer, writer, and teacher with a passion for creative code, customizable computing environments, and simple puns. He loves teaching code, from the fundamentals of algorithmic thinking to the business logic and user flow of application building—he particularly enjoys teaching JavaScript, Python, API design, and front-end frameworks.

Colin has taught code to a diverse group of students since learning to code himself, including young men of color at All-Star Code, elementary school kids at The Coding Space, and marginalized groups at Pursuit. He also works as an instructor for Noble Desktop, where he teaches classes in the Full-Stack Web Development Certificate and the Data Science & AI Certificate.

Colin lives in Brooklyn with his wife, two kids, and many intricate board games.
