Posts

Random Forest in Bytes

  Photo by Jens Lelie on Unsplash

This is the second post in the In Bytes series. Refer to this link to read the first post.

Random Forest: A random forest is an ensemble built by combining a large number of decision trees.

Ensemble: An ensemble is a group of things viewed as a whole rather than individually. In an ensemble, a collection of models is used to make predictions, rather than a single model. In principle, ensembles can be made by combining models of any type: an ensemble can have a logistic regression, a neural network, and a few decision trees working in unison. When choosing the models, we need to check for two things: diversity and acceptability. Diversity ensures that the models serve complementary purposes, meaning the individual models make predictions independently of each other. The advantages of this differ depending on the type of ensemble. Diversity ensures that even if some trees overfit, the other trees in the ensemble...
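The bagging idea behind a random forest — train each tree on a bootstrap sample, then combine predictions by majority vote — can be sketched in a few lines of pure Python. This is only an illustrative sketch: trivial one-feature threshold stumps stand in for full decision trees, and all names and data here are made up for the example.

```python
import random

def make_stump(data):
    # Train one "tree" on a bootstrap sample (drawn with replacement).
    # The stump predicts class 1 when a randomly chosen feature
    # exceeds that feature's mean over the bootstrap sample.
    sample = [random.choice(data) for _ in data]
    feature = random.randrange(len(sample[0][0]))
    threshold = sum(x[feature] for x, _ in sample) / len(sample)
    return lambda x: 1 if x[feature] > threshold else 0

def forest_predict(stumps, x):
    # Majority vote across the ensemble, rather than trusting any one model.
    votes = sum(stump(x) for stump in stumps)
    return 1 if votes > len(stumps) / 2 else 0

random.seed(0)
# Toy data: points with larger coordinates belong to class 1.
data = [((i, i + 1), 0) for i in range(5)] + [((i, i + 1), 1) for i in range(5, 10)]
stumps = [make_stump(data) for _ in range(25)]
print(forest_predict(stumps, (9, 10)))   # large point: the vote goes to class 1
print(forest_predict(stumps, (0, 1)))    # small point: the vote goes to class 0
```

Because each stump sees a different bootstrap sample and a different random feature, the stumps are diverse: some individually misclassify points, but the majority vote smooths out their errors — the same reason a forest of overfit trees still generalizes.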

Principal Component Analysis

A Comparative Technical Analysis of Ensemble Learning: Bootstrap Aggregating vs. Gradient Boosting

Random Forest: Theory, Construction, and Optimization

Random Forest: In Slides
