NYU Center for Data Science: "Maximum Manifold Capacity Representations: A Step Forward in Self-Supervised Learning" (Sep 13). CDS researchers advance Maximum Manifold Capacity Representations (MMCR), a new self-supervised learning method.
The One Alternative View in ILLUMINATION: "Experts Think Cities Don't Behave Like Organisms, But I Am of a Different Opinion" (Jun 5). Cities can be considered organisms.
Yu-Cheng Tsai in Sage Ai: "Demystify Transformers: A Guide to Scaling Laws" (Apr 30). Unpacking Transformer Technologies and Scaling Strategies.
Mohammad Reza Esmaeiliyan: "Model o1 and Introducing a New Paradigm for Inference Scaling" (Sep 13). Revolutionizing LLM Efficiency; Paradigm Shifts in Model Inference and Scalability.
LM Po: "The Evolution of Scaling Laws for LLMs" (Aug 5). This article reviews the evolution of neural scaling laws for large language models (LLMs), from OpenAI's foundational work (2020) to…
Ruslan Karmannyy: "Eric Schmidt's Controversial Interview: How GenAI is Shaping Our Future and How to Adapt" (Sep 3). You might be sceptical about the topic, but if you're following the rapidly evolving landscape of Generative AI (GenAI), you'll want to…
pamperherself: "Finally Understanding Gradient Descent and Scaling Laws: OpenAI Validated It 4 Years Ago" (Aug 15). Recently, I came across a 2020 paper by OpenAI titled Scaling Laws for Neural Language Models. After reading it, I finally understood the…