Launching a Deep learning for NLP study group

A remote study group following Stanford’s CS224d “Deep learning for NLP” class

Julien Chaumond
HuggingFace
2 min read · Oct 4, 2016


The application of deep learning to natural language processing, in particular to text understanding, is the next frontier of AI: deep learning is now proven and mature in other modalities such as computer vision and speech recognition, but for text, it’s just getting started.

My friend Thomas Wolf and I are starting a study group to collectively follow Richard Socher’s CS224d class at Stanford (“Deep learning for NLP”), which was last offered in the spring quarter this year.

The goal is to assemble a group of 6 to 10 people who are passionate enough about the subject to commit 6–8 hours per week over 4 months to studying the class material and programming assignments (see syllabus below). The idea is that you will study more efficiently as part of a group than you would by yourself.

A sample of the cutting-edge papers that we will read in the course

Prerequisites: an introductory machine learning course (Coursera or equivalent), fluency in Python, and 6–8 hours of available time per week.

We will work remotely, via a Facebook group; however, we might also hold a few in-person meetups in Paris and/or NYC.

We might also have a few guest lectures from recognized outside speakers (teaser… 🤓).

Syllabus

We aim to stick rather closely to the CS224d syllabus, meaning that we will start with some theory (word vectors, neural nets, RNNs, LSTMs, CNNs) but also dive pretty deep into implementation. One of the goals is to build and ship public prototypes in this field (it’s a great area for shipping PoCs and experiments), so bonus points if you have an entrepreneurial spirit.

Tentative, high-level schedule:

  • October 24: Study group starts, logistics.
  • November: Word vectors and single layer neural networks (roughly Lectures 1–3 of CS224d)
  • December: Advanced neural nets, TensorFlow. (roughly Lectures 4–5)
  • January: RNNs, CNNs, LSTMs, GRUs (roughly Lectures 5–8)
  • February: seq2seq, dynamic memory networks, generative models, and final projects.

The deadline for applying is October 15.

If you’re interested or want to know more, send me an email — we’re going to have a great time!

[UPDATE] We are already over capacity — thank you, everyone! I should have replied to everyone by now; please let me know if that’s not the case. Thank you!
