The International Conference on Learning Representations is already here, and it’s packed with content: 860 papers, 8 workshops, and 8 invited talks. Choosing where to pay attention is hard, so here are some ideas on what’s worth looking at!

Image by author.

A year ago, the ICLR 2020 conference was the first to go fully online, and it set a surprisingly high standard for fully virtual conferences. This year, the conference is again an online-only event, and it’s looking very promising: Transformers appear less often in titles… because they’re already everywhere! Computer Vision, Natural Language Processing, Information Retrieval, ML theory, Reinforcement Learning… you name it! The variety of content in this year’s edition is jaw-dropping.

When it comes to invited talks, the lineup is also exciting: Timnit Gebru will be opening the conference, talking about how we can…


A monthly selection of ML papers

Image by author.

Staying on top of your reading list is hard, and finding which papers should be on that list can be even harder. At Zeta Alpha we’re always keeping a close eye on the latest ML research, so we’re sharing a monthly selection of recent papers to surface what we believe will be impactful publications, based mostly on each work’s contributions and the authors’ influence. Don’t take this list as comprehensive: we have our biases like everyone else, but hey, there’s only so much you can choose out of 4000+ papers. Enjoy!

1. All NLP Tasks Are Generation Tasks: A General Pretraining Framework | 👾 Code


A monthly selection of ML papers.


1. TransGAN: Two Transformers Can Make One Strong GAN | 👾 Code

🎖Why…


A monthly selection of ML papers.

Staying on top of your reading list is hard, and finding which papers should be on that reading list can be even harder. At Zeta Alpha we’re always keeping a close eye on the latest ML research, so we thought it would be useful to share a monthly selection of recent papers to surface what we believe will be impactful publications, based mostly on each work’s contributions and the authors’ reputation. Don’t take this list as comprehensive: we have our biases like everyone else, but hey, there’s only so much you can choose out of 2000+ papers. Enjoy!

1. Learning Transferable Visual Models from Natural Language Supervision (OpenAI CLIP) | ✍️ Blog Post | 👾 Code


When we look back at 2020, what are the key developments in the field of AI and ML? Which trends will carry forward into next year? Distilling the work of an entire field into a blog post is straight-up impossible, but we think some pieces will stick with us.

Finally, this crazy year is about to end. Despite all the pandemic setbacks, AI is one of the lucky fields where most work can be done from anywhere with a computer and an internet connection. Research output this year has remained strong, although growth since 2018 has been linear rather than the exponential curve we saw before.


1899 papers, 20k+ attendees, 62 workshops, 7 invited talks. Choosing what to pay attention to is key in such a dense landscape, so here are some ideas on where you should be looking.

Vancouver, Canada. Photo by Mike Benna on Unsplash

The Conference on Neural Information Processing Systems is always exciting because it serves as a collection of the best the field has offered in the preceding year. Despite going fully virtual for the first time, this year is no different; I mean, just look at the 25 most-cited papers already published 👇

Making sense of this impressive lineup is no easy feat, but with some help from the AI Research Navigator at Zeta Alpha, we went through the most relevant NeurIPS papers (by citations, spotlight presentations, and some recommendations from the platform) and identified some really cool works we’d…


This year’s conference from the Association for Computational Linguistics comes packed with more than 700 publications. To make things easier for you to navigate, here’s a selection of papers with refreshing new datasets and benchmarks for language tasks.

Photo of Seattle by Zhifei Zhou on Unsplash

Datasets and benchmarks are at the core of progress in Natural Language Understanding (NLU): in leaderboard-driven research, progress is upper-bounded by the quality of our evaluations. While datasets for Machine Learning used to last (e.g. models didn’t reach human performance on MNIST until more than a decade after it was introduced), the latest benchmarks for Natural Language Understanding are becoming obsolete faster than we expected, highlighting the importance of finding better ones.

The sheer number of papers on the topic is quite astounding, so at Zeta Alpha we have curated this selection of the most interesting works presented at ACL 2020…


Transformers are attention-based neural architectures that propelled the field of NLP to new highs after their introduction. The International Conference on Learning Representations has a healthy dose of them, so here’s a curated collection of related publications that will help you navigate them.

The International Conference on Learning Representations (ICLR) is one of the most beloved stages for the Machine Learning community. Nowadays, conferences in the field often serve as a quality trademark and a spotlight for publications that already exist on pre-print servers. Still, the volume of work presented is increasingly overwhelming, which makes it hard to keep up.

At Zeta Alpha, we keep a close eye on the forefront of Natural Language Processing (NLP) and Information Retrieval (IR) research. In this spirit, and with the help of our semantic search engine, we’ve curated a selection of 9 papers, out of…


Transformers in 2020.

Photo by Arseny Togulev on Unsplash

2019 was the year of BERT, and much has been written about it. Truth be told, it’s hard to overestimate the impact Transformers have had on the NLP community: LSTMs now sound old-fashioned (or do they?), state-of-the-art papers kept coming steadily throughout 2019 and, at Google, BERT made it into production in record-breaking time. All of the above while enabling Transfer Learning, which is now the coolest kid in NLP-town.

The development around these models has been remarkable so far, but could Transformers have peaked already? What areas of research should we be looking at most closely? What’s still…


“Welcome to the desert of the multiple configuration files” — Morpheus on a bad day

The CodeceptJS & Puppeteer combo is a simple yet powerful tool for automating web testing. Actions and configurations are easy to program, yet the possibilities that arise are surprisingly endless.

As you might know, it’s quite straightforward to set several variables of the helper inside the codecept.json file, such as userAgent, windowSize or the base url, among others. However, what happens if we want to change certain fields? Of course, we can change them in codecept.json, but if someone less familiar…
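For context, here’s a minimal sketch of what such a configuration might look like, written as codecept.conf.js (the JavaScript equivalent of codecept.json, which allows comments). The URL, user agent and project name below are placeholders, not taken from any real setup:

    // codecept.conf.js: JavaScript equivalent of codecept.json
    exports.config = {
      helpers: {
        Puppeteer: {
          url: 'https://example.com',    // placeholder base url used for navigation
          windowSize: '1200x800',        // browser window size as 'widthxheight'
          userAgent: 'MyTestAgent/1.0'   // placeholder user agent override
        }
      },
      tests: './*_test.js',
      name: 'my-project'                 // placeholder project name
    };

Changing any of these values for a different environment means editing this file (or maintaining several copies of it), which is exactly where the “multiple configuration files” pain from the quote above begins.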

Sergi Castella i Sapé

Working on search technology at Zeta Alpha. Interested in NLP, Transformers, DL & Data Science | LinkedIn: linkedin.com/in/sergicastella/
