Pinned. Rawan Alkurd, "Mastering DSPy: Elevate Your LLM Model Performance with Modular Optimization" (Jul 8): Imagine a world where interacting with large language models (LLMs) is as intuitive and efficient as programming with high-level…
Rawan Alkurd, "Running LLMs Locally: A Guide to Setting Up Ollama with Docker" (Jul 1): In this blog, we will delve into setting up and running a language model using Ollama locally with Docker. Ollama provides a robust…