Ma Da
12 Following
1 Followers

Highlighted by Ma Da


From New State of the Art AI Optimizer: Rectified Adam (RAdam), by Less Wright

Hence, warmup (an initial period of training with a much lower learning rate) is a requirement for adaptive optimizers to offset excessive variance when the optimizer has only worked with limited training data.
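The warmup idea in this quote can be sketched as a simple schedule that linearly ramps the learning rate up from near zero before handing control to the optimizer's normal rate. This is a minimal illustration, not RAdam's actual rectification term; the function name, base rate, and step count are all illustrative assumptions:

```python
def warmup_lr(step, base_lr=1e-3, warmup_steps=1000):
    """Linearly ramp the learning rate from ~0 to base_lr over warmup_steps.

    Illustrative only: real schedules (e.g. in fastai or PyTorch) are
    configured via the library's scheduler APIs, not a bare function.
    """
    if step < warmup_steps:
        return base_lr * (step + 1) / warmup_steps
    return base_lr
```

Early steps thus see a tiny learning rate, limiting the damage from the adaptive optimizer's high-variance estimates on little data.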

From Machine Learning 1: Lesson 2 by Hiromi Suenaga

How do we determine if a split is better? Take the weighted average of the scores of the two new nodes.
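The weighted average mentioned here can be sketched as follows: each child node's score (for regression, typically its MSE) is weighted by the number of samples that fall into it. The function name and arguments are illustrative assumptions, not code from the lesson:

```python
def weighted_split_score(n_left, score_left, n_right, score_right):
    """Score a candidate split by weighting each child's score
    (e.g. its MSE) by the number of samples in that child."""
    n = n_left + n_right
    return (n_left * score_left + n_right * score_right) / n
```

A split is better than another when this weighted score is lower.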

From Machine Learning 1: Lesson 2 by Hiromi Suenaga

R² compares how good your model is (its RMSE) vs. how good the naïve mean model is (its RMSE).
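Spelled out, the comparison is R² = 1 − SS_res/SS_tot, i.e. one minus the ratio of the model's squared error to the squared error of always predicting the mean. A minimal sketch (function name is an assumption; libraries such as scikit-learn provide this as `r2_score`):

```python
def r_squared(y_true, y_pred):
    """R^2 = 1 - SS_res / SS_tot: model error relative to the
    error of the naive model that always predicts the mean."""
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot
```

Perfect predictions give R² = 1; predicting the mean everywhere gives R² = 0; a model worse than the mean goes negative.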

Claps from Ma Da


What You Need to Know About Netflix’s ‘Jupyter Killer’: Polynote 📖

Michael Li

Implementing callbacks in fast.ai

Edward Easling

New State of the Art AI Optimizer: Rectified Adam (RAdam).

Less Wright