
Mar 28, 2019 22:46:41

FT13 Boosting

by @hiro | 272 words | 23🔥 | 256💌


Boosting is a machine learning meta-algorithm that builds a strong prediction from the collective power of many weak predictors. Training proceeds in rounds: we fit a weak predictor on the training data and look at where it makes mistakes. Based on those mistakes, typically by giving the misclassified examples more weight, we train the next predictor so that it becomes a bit stronger and smarter. Repeating this exercise, we also learn how reliable each weak predictor is in general, and when we make the final decision, we use those weights to balance the results from the weak learners.
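As a rough sketch of that loop (nothing from this post in particular, just an AdaBoost-style illustration in Python; the function names and the number of rounds are made up for the example):

```python
# Minimal AdaBoost-style sketch: decision stumps as the weak learners,
# example weights updated each round so the next stump focuses on the
# previous mistakes, and a weighted vote at the end.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=20):
    """y must be labeled -1/+1. Returns a list of (alpha, stump) pairs."""
    n = len(y)
    w = np.full(n, 1.0 / n)                      # start with equal weight on every example
    ensemble = []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)    # better stumps get a bigger say
        w *= np.exp(-alpha * y * pred)           # up-weight the examples this stump got wrong
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def adaboost_predict(ensemble, X):
    """Weighted vote of all the weak learners."""
    score = sum(alpha * stump.predict(X) for alpha, stump in ensemble)
    return np.sign(score)
```

The two lines in the middle carry the whole idea: a more accurate stump gets a larger alpha (a bigger voice in the final vote), and the examples it misclassified get their weights bumped up so the next stump pays attention to exactly those mistakes.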

In a way, a group of people such as an organization or a company is a collection of neural networks, i.e., human brains. We make a lot of decisions by reflecting, directly or indirectly, on members' opinions or preferences. In theory, even if no single member is that smart, with enough people and enough repetitions we have a higher chance of making a better-than-average decision.

Many powerful algorithms have grown out of this ensemble idea, such as Gradient Boosting, along with the closely related Random Forest (which relies on bagging rather than boosting).
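For reference, both are available off the shelf in scikit-learn; the parameters here are just illustrative defaults:

```python
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier

gbm = GradientBoostingClassifier(n_estimators=100)  # boosting: trees built sequentially, each correcting the last
rf = RandomForestClassifier(n_estimators=100)       # bagging: trees built independently on random subsets
# gbm.fit(X_train, y_train); rf.fit(X_train, y_train)
```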

Personally, I like this idea of combining collective opinions to get a better result. It sounds like the power of democracy. One might think society should be led only by the smartest people, but that works only under certain conditions; such a group might be overfitted to the current or recent situation. To avoid overfitting, we still need diversity, including "weak" members, where weakness is measured along one dimension and we cannot really know whether they are weak at all.


***

Grammarly: 4

Word of the day: comminute
