Pinned · Datadrifters in Generative AI · Jan 30
100% Open-Source Llama Coding Assistant: Bye, bye GPT-4!
All right, I’ve got something really exciting to share with you today!
Pinned · Datadrifters · Nov 8, 2023
OpenAI Assistants API: Walk-through and Coding a Research Assistant
I want to show you something exciting today: building an OpenAI Assistant for academic research!
Pinned · Datadrifters in GoPenAI · Nov 13, 2023
OpenAI Assistants API A to Z: Practitioner’s Guide to Code Interpreter, Knowledge Retrieval and…
There are lots of intricacies when it comes to working with the Assistants API and OpenAI-hosted tools such as…
Datadrifters · May 17
Llama 3 Powered Voice Assistant: Integrating Local RAG with Qdrant, Whisper, and LangChain
Voice-enabled AI applications will forever change how we interact with technology.
Datadrifters · Mar 5
Mistral 7b Outperforms GPT-4 in Specialized Tasks
Predibase recently released 25 fine-tuned LLMs that outperform GPT-4 and can be served on a single GPU.
Datadrifters · Mar 4
Figma: Incredible Ride to $10 Billion Moat
When Dylan Field and Evan Wallace met in the quiet corridors of Brown University, they knew that they were at the crossroads.
Datadrifters · Mar 21
StarCoder 2: Can Top Open Source LLM Beat GitHub Copilot?
With over 1.3 million paid subscribers and deployment across more than 50,000 organizations, GitHub Copilot is the world’s most widely…
Datadrifters · Mar 11
Mistral Large rivals GPT-4 and Claude 2: Setup, Testing, and Function Calling
According to the official announcement, Mistral Large is the world’s second-ranked model generally available through an API; have a look at the…
Datadrifters in Generative AI · Feb 22
Did Google Train Gemma on the Test Set to Outperform Mistral? Overview and Performance Comparison
Google released the first set of open-source LLMs called Gemma, which are built from the same research and technology used to create the…
Datadrifters in GoPenAI · Feb 21
I tried 100s of MLOps tools, this is the absolute best for production-ready AI deployments!
Over the years, I trained and fine-tuned a lot of models, which were all shining bright in my Jupyter notebooks. Every time, I was itching…