Tian Hao
71 Following
2 Followers

Highlighted by Tian Hao


From Introduction to gradient boosting on decision trees with Catboost by Daniel Chepenko

…which features to choose and what conditions to use for splitting, along with knowing when to stop. Decision trees tend to become very complex and overfit, which means the error on the training set will be low but high on the validation set. A smaller tree with fewer splits might lead to lower variance and better interpretation at the cost…
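As a rough illustration of the trade-off described in this highlight, the sketch below (my own example, not code from the article) compares a fully grown scikit-learn decision tree with a shallower one; the synthetic dataset and the depth values are assumptions.

```python
# Minimal sketch: a deep tree overfits (high train, lower validation accuracy),
# while a smaller tree with fewer splits generalizes better.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

for depth in (None, 3):  # None = grow until leaves are pure; 3 = smaller tree
    tree = DecisionTreeClassifier(max_depth=depth, random_state=0).fit(X_train, y_train)
    print(f"max_depth={depth}: train={tree.score(X_train, y_train):.2f}, "
          f"val={tree.score(X_val, y_val):.2f}")
```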

From Introduction to gradient boosting on decision trees with Catboost by Daniel Chepenko

…ke a binary split we use different metrics; the most popular ones are the Gini index and cross-entropy. The Gini index is a measure of total variance across the K classes. In regression problems we use the variance or the mean deviation from the median
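The highlight names the Gini index and cross-entropy as split metrics; the snippet below is a minimal, hand-rolled illustration of how they can be computed for the class labels at a single node. The label array is hypothetical and the code is not taken from the quoted article.

```python
# Minimal sketch of the two impurity metrics mentioned above.
import numpy as np

def gini(labels):
    # Gini index: sum over classes of p_k * (1 - p_k)
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return np.sum(p * (1 - p))

def cross_entropy(labels):
    # Entropy impurity: -sum over classes of p_k * log(p_k)
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

node = np.array([0, 0, 0, 1, 1, 2])  # hypothetical class labels at one node
print(gini(node), cross_entropy(node))
```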

From Bayesian Neural Network Series Post 1: Need for Bayesian Networks by Kumar Shridhar

…ective, it is unjustifiable to use single point-estimates as weights to base any classification on.
Bayesian neural networks, on the other hand, are more robust to over-fitting and can easily learn from small datasets. The Bayesian approach further offers uncertainty estimates via its parameters in the form of probabilit…
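To illustrate the uncertainty estimates mentioned in this highlight, here is a small toy sketch (my own assumption-laden example, not the article's method): each weight of a one-layer model is a Gaussian distribution rather than a point estimate, and Monte Carlo sampling of the weights yields both a predictive mean and an uncertainty.

```python
# Minimal sketch: sampling weights from their (assumed Gaussian) posterior
# gives a distribution over predictions instead of a single point estimate.
import numpy as np

rng = np.random.default_rng(0)
w_mean, w_std = np.array([0.8, -0.3]), np.array([0.1, 0.2])  # posterior over weights

x = np.array([1.0, 2.0])               # a single input
samples = []
for _ in range(1000):                   # Monte Carlo over weight samples
    w = rng.normal(w_mean, w_std)       # draw one set of weights
    samples.append(1 / (1 + np.exp(-w @ x)))  # sigmoid output of the toy model

print("prediction:", np.mean(samples), "uncertainty:", np.std(samples))
```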

Claps from Tian Hao


Explained: A Style-Based Generator Architecture for GANs - Generating and Tuning Realistic…

Rani Horev

GAN — CycleGAN (Playing magic with pictures)

Jonathan Hui

Illustrated: Efficient Neural Architecture Search

Raimi Karim