Articles on Binary Classification
Welcome to the third article in our Machine Learning with Ruby series!
In our previous article Machine Learning: An Introduction to CART Decision Trees in Ruby, we covered CART decision trees and built a simple tree of our own.
We then looked into our first ensemble technique, Random Forests, in Machine Learning: An Introduction to Random Forests. It's a good idea to review that article before diving into this one.
Random Forests perform well in a wide variety of cases, but there are also situations where they fall short. In this article we'll take a look at another popular tree-based ensemble model: Gradient Boosting.
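Before following the link, here is a rough numeric sketch of the boosting idea, for illustration only (it is not the article's implementation, and the data and names are made up): each round fits a weak learner to the current residuals, and the predictions accumulate round by round.

```ruby
# Toy data: target is roughly y = 2x + 1.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]

learning_rate = 0.5
# Our "weak learner" is just the mean of the residuals -- the simplest
# possible fit, standing in for the shallow regression trees a real
# gradient boosting model would use.
predictions = Array.new(xs.size, 0.0)

10.times do
  # Fit the weak learner to what the current ensemble still gets wrong.
  residuals = ys.each_with_index.map { |y, i| y - predictions[i] }
  weak_fit = residuals.sum / residuals.size
  # Add a damped copy of the new learner's output to the ensemble.
  predictions = predictions.map { |p| p + learning_rate * weak_fit }
end

# With a constant-only learner, predictions converge toward the mean
# of ys (6.0); real trees would also capture the trend in x.
```

The damping factor (`learning_rate`) is what makes boosting cautious: each learner corrects only part of the remaining error, which is a key theme of the full article.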
Read more
In our previous article Machine Learning: An Introduction to CART Decision Trees in Ruby, we covered CART decision trees and built a simple tree of our own.
Decision trees are very flexible and a good tool for simple classification, but they are often not enough for real-world scenarios.
When dealing with large, complex, or noisy data, we need something more powerful.
That's where ensemble models come into play. Ensemble models combine a number of weak learners into a single strong model, with increased accuracy and robustness.
Ensembles also help reduce overfitting and keep bias in check.
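To make the "weak learners voting together" idea concrete, here is a minimal sketch (an illustration only, with hypothetical threshold rules standing in for the real decision trees the article builds):

```ruby
# Three crude, hypothetical classifiers -- each one alone is a weak learner.
weak_learners = [
  ->(x) { x[:age] > 30 ? 1 : 0 },
  ->(x) { x[:income] > 50_000 ? 1 : 0 },
  ->(x) { x[:owns_home] ? 1 : 0 }
]

def ensemble_predict(learners, sample)
  votes = learners.map { |learner| learner.call(sample) }
  # Majority vote: the most common class among the learners wins.
  votes.group_by(&:itself).max_by { |_, group| group.size }.first
end

sample = { age: 45, income: 40_000, owns_home: true }
ensemble_predict(weak_learners, sample) # => 1 (two of three learners vote 1)
```

Each rule on its own is easily fooled, but the majority vote smooths out their individual mistakes, which is the core intuition behind Random Forests.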
In this article, we'll cover a very popular tree-based ensemble model: Random Forests.
Read more