Machine Learning

Decision Trees Are Usually Better Than Logistic Regression
25 October 2018 | by Tim Bock

Logistic regression is a standard approach to building a predictive model. However, decision trees are an alternative that is often clearer and frequently superior.

Continue reading

Feature Engineering for Categorical Variables
24 October 2018 | by Tim Bock

Predictors in predictive models come in two types: numeric and categorical. There are several methods of transforming categorical variables, each with different trade-offs.

Continue reading

Feature Engineering for Numeric Variables
24 October 2018 | by Tim Bock

When building a predictive model, it is often possible to improve predictive performance by modifying the numeric variables. This is known as transformation.

Continue reading
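A typical example of such a transformation is taking the log of a right-skewed variable so its relationship with the outcome is closer to linear. A minimal sketch (the income figures are hypothetical):

```python
import math

# Log transformation of a skewed numeric predictor.
def log_transform(x):
    """Apply log(1 + x), which handles zeros safely for non-negative data."""
    return [math.log1p(v) for v in x]

incomes = [0, 10_000, 100_000, 1_000_000]
print(log_transform(incomes))
```

Other common choices include square roots, ranks, and binning; which works best depends on the variable and the model.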

How Random Forests Fit to Data
06 August 2018 | by Jake Hoare

A random forest is a collection of decision trees used together to learn patterns in data and make predictions based on those patterns.

Continue reading

How is Splitting Decided for Decision Trees?
02 August 2018 | by Jake Hoare

Decision trees work by repeatedly splitting the data, at each step choosing the split that yields the greatest improvement. We explain how these splits are chosen.

Continue reading
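One common way a tree scores a candidate binary split is Gini impurity (entropy is another); the tree keeps the split with the lowest weighted impurity. A minimal sketch under that assumption:

```python
# Scoring a candidate binary split with Gini impurity.
def gini(labels):
    """Gini impurity of a set of class labels (0.0 means pure)."""
    n = len(labels)
    if n == 0:
        return 0.0
    props = [labels.count(c) / n for c in set(labels)]
    return 1.0 - sum(p * p for p in props)

def split_score(left, right):
    """Size-weighted impurity of the two child nodes; the tree
    picks the candidate split that minimizes this score."""
    n = len(left) + len(right)
    return (len(left) / n) * gini(left) + (len(right) / n) * gini(right)

# A split that perfectly separates the classes scores 0.0;
# one that leaves both sides mixed scores higher.
print(split_score(["a", "a"], ["b", "b"]))  # 0.0
print(split_score(["a", "b"], ["a", "b"]))  # 0.5
```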

How is Variable Importance Calculated for a Random Forest?
30 July 2018 | by Jake Hoare

A random forest is an ensemble of decision trees. Like other machine-learning techniques, random forests use training data to learn to make predictions.

Continue reading
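One way variable importance can be computed for a random forest is permutation importance: shuffle one feature's values and measure how much accuracy drops. A minimal sketch, where `predict`, `X`, and `y` are hypothetical stand-ins for a fitted model and its data:

```python
import random

# Permutation importance: accuracy drop when one feature is shuffled.
def permutation_importance(predict, X, y, col, seed=0):
    """Return baseline accuracy minus accuracy after randomly
    permuting column `col` of X."""
    def accuracy(rows):
        return sum(predict(r) == t for r, t in zip(rows, y)) / len(y)
    base = accuracy(X)
    rng = random.Random(seed)
    shuffled = [row[col] for row in X]
    rng.shuffle(shuffled)
    X_perm = [row[:col] + [v] + row[col + 1:] for row, v in zip(X, shuffled)]
    return base - accuracy(X_perm)

# Toy example: this "model" uses only feature 0, so permuting the
# constant feature 1 cannot change any prediction.
X = [[0, 5], [1, 5], [0, 5], [1, 5]]
y = [0, 1, 0, 1]
predict = lambda row: row[0]
print(permutation_importance(predict, X, y, col=1))  # 0.0
```

Random forests also offer impurity-based importance; permutation importance is shown here because it is model-agnostic and simple to sketch.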

Machine Learning: Pruning Decision Trees
04 July 2017 | by Jake Hoare

Machine learning involves trade-offs. Here we look at pruning and other ways of managing these trade-offs in the context of decision trees.

Continue reading
