Pinned · Jared Zoneraich · May 28
Scalable Prompt Management and Collaboration
Prompts are the magic that makes your LLM system work. They are your secret sauce 🥫 Make sure they are organized & versioned in a CMS.
Dan · Aug 18
A How-To Guide On Fine-Tuning
Fine-tuning is an extremely powerful prompt engineering technique. This how-to guide will show you exactly how to do it effectively.
Dan · Aug 16
Prompt Templates with Jinja2
Jinja2 is a powerful templating engine that can take your prompts to the next level. See how it’s more powerful than plain f-strings.
Jared Zoneraich · Jul 31
Prompt Engineering with Anthropic Claude
Tips on how to prompt Claude more effectively. Takeaways from a talk by Anthropic’s “Prompt Doctor” (Zack Witten).
Jared Zoneraich · Jul 26
You should be A/B testing your prompts.
Ground truth is subjective, and the only reliable way to evaluate prompts is with real user metrics. A/B testing helps you safely iterate.
Jared Zoneraich · Jul 22
Tool Calling with LLMs: How and when to use it?
LLM tool calling as an AI idiom, its benefits over JSON mode, and examples of how to use function calling in your real projects.
Pranav Kanchi · Jun 3
Why Fine-Tuning is (Probably) Not for You
For some reason, it feels like every startup now has its own custom-trained model. This is probably not a good idea.
Jared Zoneraich · May 29
Speeding up iteration with PromptLayer’s CMS (tips for prompt management)
This post was cross-posted with permission from Greg Baugues. You can find the original at https://www.haihai.ai/friction/
Jared Zoneraich · May 23
Gorgias Uses PromptLayer to Automate Customer Support at Scale
Gorgias uses PromptLayer every day to store and version control prompts, run evals on regression and backtest datasets, and review logs.