Pinned · Published in Data Science Collective
Model Context Protocol (MCP): The Force Awakens
Discover how Model Context Protocol (MCP) enables AI models to seamlessly integrate with applications, tools, and data sources!
Mar 28
Pinned · Published in GoPenAI
Explaining ‘Transformers’ and ‘Attention Is All You Need’ in an Interview
Learn how to explain Transformers in AI interviews, covering the key concepts of ‘Attention Is All You Need’ clearly and concisely.
Mar 13
Pinned · Published in 𝐀𝐈 𝐦𝐨𝐧𝐤𝐬.𝐢𝐨
Roadmap to Learn Natural Language Processing in 2023
Learning Natural Language Processing (NLP) can be a rewarding journey, but it can also be complex due to its multidisciplinary nature…
Sep 2, 2023
The Dark Side of AI: A Reality Many Don’t Acknowledge
The tech world has always been presented as a land of opportunity, especially for data scientists, machine learning engineers, and AI…
11h ago
Published in Data Science Collective
15 Common Mistakes in Writing High-Performance Python Code (And How to Fix Them)
In software development, even small improvements in code can lead to significant gains in performance, readability, and maintainability —…
Mar 26
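One illustrative example of the kind of fix the article refers to (whether it is among the article's 15 mistakes is an assumption on my part): building a string with `+=` inside a loop versus collecting pieces and calling `str.join` once.

```python
def concat_loop(n):
    """Build a string with += in a loop; repeated += may copy the string each time."""
    s = ""
    for i in range(n):
        s += str(i)
    return s

def concat_join(n):
    """Build the same string with a single join, avoiding repeated copies."""
    return "".join(str(i) for i in range(n))

# Both produce identical output; join is the idiomatic, reliably fast form.
assert concat_loop(1000) == concat_join(1000)
```

CPython sometimes optimizes in-place `str` concatenation, but `join` is the portable idiom that stays linear regardless of interpreter.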
Published in Data Science Collective
Revisiting BERT: Google’s NLP Supermodel
BERT: the NLP superhero! Learn how it reads like a human and powers Google Search, chatbots, and more. Your ultimate BERT refresher!
Mar 22
Published in Data Science Collective
Understanding ANNOY: “The Approximate Nearest Neighbors Oh Yeah!” by Spotify
Speed up similarity search with ANNOY! Discover how this tree-based algorithm finds nearest neighbors in milliseconds.
Mar 18
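The tree-based idea the teaser mentions can be sketched in miniature: ANNOY recursively splits the point set with random hyperplanes, so a query only compares against the small leaf it lands in. This is a toy single-tree sketch of that idea, not Spotify's C++ implementation; the `leaf_size` and depth cap are illustrative choices (the real library builds a forest of such trees and searches several leaves).

```python
import random

def build_tree(points, leaf_size=2, depth=0):
    """Recursively split points by a random hyperplane (the core ANNOY idea)."""
    if len(points) <= leaf_size or depth > 20:
        return points  # leaf: a small bucket of candidate neighbors
    a, b = random.sample(points, 2)  # hyperplane equidistant from two random points
    mid = [(x + y) / 2 for x, y in zip(a, b)]
    normal = [x - y for x, y in zip(a, b)]
    left, right = [], []
    for p in points:
        side = sum(n * (c - m) for n, c, m in zip(normal, p, mid))
        (left if side <= 0 else right).append(p)
    if not left or not right:  # degenerate split: keep as a leaf
        return points
    return (mid, normal,
            build_tree(left, leaf_size, depth + 1),
            build_tree(right, leaf_size, depth + 1))

def query(tree, q):
    """Descend to the leaf on q's side of each hyperplane; return its candidates."""
    while isinstance(tree, tuple):
        mid, normal, left, right = tree
        side = sum(n * (c - m) for n, c, m in zip(normal, q, mid))
        tree = left if side <= 0 else right
    return tree

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(100)]
tree = build_tree(pts)
candidates = query(tree, (0.5, 0.5))  # a handful of nearby points, not all 100
```

The speed comes from examining only `candidates` instead of every point; ANNOY trades a small chance of missing the true nearest neighbor for that pruning.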
Published in 𝐀𝐈 𝐦𝐨𝐧𝐤𝐬.𝐢𝐨
LLM: Understanding GPT-2 — Generative Pre-Trained Transformer
A deep dive into GPT-2 — how it works, how it improved over GPT-1, and how it can be fine-tuned for various real-world applications.
Mar 16
Published in Data Science Collective
LLM: Understanding GPT-1 — Generative Pre-Trained Transformer
Explore the architecture of the first GPT model, how it was built, and key elements that shaped the foundation of modern LLMs.
Mar 15
Published in 𝐀𝐈 𝐦𝐨𝐧𝐤𝐬.𝐢𝐨
Byte Pair Encoding (BPE): The Secret Behind Subword Tokenization
Byte Pair Encoding (BPE) is a simple yet powerful data compression and tokenization technique widely used in natural language processing…
Mar 15
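The merge loop at the heart of BPE is compact enough to sketch: repeatedly count adjacent symbol pairs across the corpus and fuse the most frequent pair into a new symbol. This is a minimal toy version on a hand-made corpus, not the article's code; the `</w>` end-of-word marker and the word frequencies are illustrative assumptions.

```python
from collections import Counter

def get_pair_counts(corpus):
    """Count adjacent symbol pairs across all words, weighted by word frequency."""
    counts = Counter()
    for word, freq in corpus.items():
        symbols = word.split()
        for pair in zip(symbols, symbols[1:]):
            counts[pair] += freq
    return counts

def merge_pair(pair, corpus):
    """Fuse every occurrence of `pair` into a single new symbol."""
    out = {}
    for word, freq in corpus.items():
        symbols, merged, i = word.split(), [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                merged.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                merged.append(symbols[i])
                i += 1
        out[" ".join(merged)] = freq
    return out

# Toy corpus: words pre-split into characters, with an end-of-word marker.
corpus = {"l o w </w>": 5, "l o w e r </w>": 2,
          "n e w e s t </w>": 6, "w i d e s t </w>": 3}

merges = []
for _ in range(3):  # learn 3 merge rules
    counts = get_pair_counts(corpus)
    best = max(counts, key=counts.get)
    corpus = merge_pair(best, corpus)
    merges.append(best)
# merges == [('e', 's'), ('es', 't'), ('est', '</w>')]
```

After a few thousand such merges on real text, frequent words become single tokens while rare words decompose into reusable subwords, which is why BPE-style tokenizers handle unseen words gracefully.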