Your first A.I. API endpoint with 🦜LangServe

Rick Garcia
6 min read · Feb 6, 2024

Deploying your first AI REST API endpoint with LangServe is EASY! We’ll walk through everything needed to get your first LangServe project online.

LangChain has been at the forefront of providing developers with tools that enhance productivity and open up new avenues for AI integration.

LangServe API Endpoint Deployment

The latest addition to their ecosystem, LangServe, is no exception. Built to simplify the deployment of LangChain runnables and chains as REST APIs, LangServe is aimed at developers leveraging AI in their applications. This guide walks you through setting up your first LangServe project, keeping the process as simple as possible.

Introducing LangServe

Understanding LangServe

Before we dive into the setup process, let’s take a moment to understand what LangServe brings to the table:

  • Integration with FastAPI: At its core, LangServe leverages FastAPI to offer a robust and speedy API development experience. FastAPI’s intuitive design pairs perfectly with LangServe’s goals.
  • Data Validation with Pydantic: Data integrity is paramount, and LangServe uses Pydantic for data validation, ensuring that your inputs and outputs are exactly as expected.
  • LangChainJS: For those who prefer working with JavaScript, LangServe hasn’t left you out. The LangChainJS client allows seamless interaction with LangServe endpoints.

Key Features

  • Automatic schema inference, robust API documentation, and efficient handling of concurrent requests are just the start.
  • The /stream_log/ endpoint and astream_events feature cater to those who need real-time data streaming and logging.
  • Integration with LangSmith for tracing, alongside the support of powerful Python libraries, makes LangServe not just powerful but also incredibly versatile.
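For every chain you mount, LangServe generates a standard family of endpoints automatically. The sketch below lists the ones documented for a single route, using the /pirate-speak path from this guide's example (the exact set can vary by LangServe version):

```python
# Hedged sketch: the per-route endpoints LangServe's add_routes() is
# documented to register. "/pirate-speak" is this guide's example path.
base = "/pirate-speak"
suffixes = [
    "/invoke",         # single synchronous call
    "/batch",          # multiple inputs in one request
    "/stream",         # server-sent-events streaming
    "/stream_log",     # streaming with intermediate-step logs
    "/input_schema",   # Pydantic-derived JSON schema for inputs
    "/output_schema",  # Pydantic-derived JSON schema for outputs
    "/playground/",    # interactive browser UI
]
for url in (base + s for s in suffixes):
    print(url)
```

This is why schema inference and streaming come "for free": each endpoint is derived from the runnable you pass to add_routes.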

Preparing for Your First LangServe Project

Getting started with LangServe is straightforward. Here’s what you need:

  • Installation Requirements: Ensure you have Python (I used 3.11) and pip installed. Mac users will also need Homebrew (brew) to install gh.

Setting Up the Environment

1. Install the langchain-cli tool with pip:

pip install -U langchain-cli

2. Install gh using brew for easy GitHub integration:

brew install gh

3. Create a new LangServe project using the example pirate-speak template:

langchain app new langserve-demo --package pirate-speak

Configuring Your LangServe Project

Once your project is created, you’ll need to make some tweaks to get it running:

  1. Edit Your Project: Open app/server.py in VSCode. Replace the placeholder add_routes(app, NotImplemented) with the output code provided during the template installation:
from pirate_speak.chain import chain as pirate_speak_chain

add_routes(app, pirate_speak_chain, path="/pirate-speak")

2. Test Locally: Run langchain serve from the project root directory (langserve-demo) and visit http://127.0.0.1:8000 to see your project live.
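With the server running, you can also exercise the chain's /invoke endpoint from Python. A minimal sketch using only the standard library, assuming the default port 8000 and the pirate-speak template's input shape (a "text" field):

```python
import json
import urllib.request

# Assumptions: `langchain serve` on its default port 8000, and the
# pirate-speak template's input schema, which takes a "text" field.
url = "http://127.0.0.1:8000/pirate-speak/invoke"
payload = {"input": {"text": "Hello there, matey!"}}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment once the server is up; the chain's reply arrives under "output":
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["output"])
```

The same request shape works against the hosted deployment later; only the base URL changes.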

Publishing Your Project to Hosted LangServe

Ready to share your project with the world? Here’s how:

  1. Initial Setup on GitHub: Initialize your git repository, commit your changes:
git init
git add .
git commit -m "Initial Commit"

2. Push the changes to GitHub using gh.

gh repo create

#- Push an existing local repository to GitHub
#- Path to local repo: .
#- Name the repo: langserve-demo
#- Set repo owner
#- Description
#- Public/Private
#- Add a remote? (Y)
#- What should the new remote be called? origin
#- Would you like to push the commits from the current branch to origin? (y)

3. View Your Repository: Confirm everything’s in place by viewing your repository online with gh.

gh repo view --web 

Final Step: Deploying your LangServe Project

  1. LangServe Access: Head over to your LangSmith Beta Account and find the LangServe Alpha section for “Deployments” in the left-hand navigation pane. If you do not have access to LangSmith or LangServe and want to get started now, DM me.
Create new LangServe deployment interface

2. Deploy Your Project: Connect your GitHub account

Get started by clicking the “New Deployment” button in the upper right corner of your browser.

First, you’ll need to link your GitHub account. Follow the onscreen instructions to link your existing GitHub account with LangSmith to allow access to all repositories for easier deployment.

Once GitHub is linked, search for the repo name we created earlier, “langserve-demo” and select the repo.

Next, name your LangServe deployment, using only lowercase letters and hyphens (-).

In the next field, you’ll select your subdirectory. For this example deployment, leave this as the default (.).

For the next field, “Git reference”, leave this as the default which is populated by your repo.

Lastly, you’ll need to create an environment variable that contains your OpenAI API key.

3. In the “name” field, enter exactly: OPENAI_API_KEY

4. In the “value” field, enter your OpenAI API Key from your OpenAI Developer account control panel.
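The deployed service reads this key from its environment at runtime; when testing locally with langchain serve, the same variable must be set in your shell first. A small sketch (the key value is a placeholder, not a real key):

```python
import os

# Placeholder value -- substitute your real key, or better, export it in
# your shell before starting the server: export OPENAI_API_KEY="sk-..."
os.environ.setdefault("OPENAI_API_KEY", "sk-placeholder")

# The pirate-speak template's OpenAI chat model reads the key from the
# environment, so no key ever needs to appear in your committed code.
configured = "OPENAI_API_KEY" in os.environ
print("OPENAI_API_KEY configured:", configured)
```

Keeping the key out of the repository and in an environment variable is exactly why the deployment form asks for it separately.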

The final step is to click “Submit” in the upper right-hand corner to deploy your new LangServe FastAPI API Endpoint.

Deployment can take a little bit of time. Expect 4–7 minutes for deployment to complete at the time of this writing.

Exploring the LangServe API

Once your LangServe project is up and running, two powerful tools at your disposal will significantly enhance your development experience: the LangServe Playground and FastAPI Documentation. These tools are not just about testing; they’re about exploring the potential of your AI endpoint and understanding its capabilities in depth.

LangServe Playground

The LangServe Playground is a feature designed to let developers experiment with their deployed AI endpoints. It provides a user-friendly interface for sending prompts to your API and viewing the responses in real-time. This immediate feedback loop is invaluable for prompt engineering, allowing for rapid iteration and refinement of your AI models.

LangServe Playground
  • Accessing the Playground: To dive into the Playground, navigate to the LangSmith Deployments Projects listing. Find your deployed project, take the URL provided, and append your route path (here, /pirate-speak) followed by /playground. For instance:
https://langserve-demo-9722-ffoprvkqsa-uc.a.run.app/pirate-speak/playground/

This URL will take you directly to the Playground of your pirate-speak endpoint, where you can start testing your prompts and seeing how the AI responds. The Playground is an excellent tool for both beginners and experienced developers to experiment with different inputs and fine-tune their AI's behavior.

FastAPI Documentation

LangServe leverages FastAPI to offer comprehensive and interactive API documentation automatically. This documentation is easily accessible and provides a clear overview of your API’s capabilities, request formats, and response structures. It’s a valuable resource for developers looking to integrate their LangServe API endpoints into applications or for those who wish to understand the technical details of their deployed services.

LangServe FastAPI Docs
  • Accessing the API Docs: Simply visit the default URL of your LangServe API. The FastAPI-generated documentation allows you to explore available endpoints, try out requests, and view the expected responses. This interactive documentation is an excellent way for developers to familiarize themselves with the API’s functionality and ensure their applications are correctly integrated.
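FastAPI serves the interactive docs UI at /docs and the underlying machine-readable schema at /openapi.json. A small sketch of both URLs, assuming the example deployment URL shown earlier in this guide:

```python
import json
import urllib.request

# Assumption: the example deployment URL from earlier in this guide.
BASE = "https://langserve-demo-9722-ffoprvkqsa-uc.a.run.app"

docs_url = BASE + "/docs"            # interactive Swagger UI
schema_url = BASE + "/openapi.json"  # raw OpenAPI schema FastAPI generates

# Uncomment to list the registered routes once the deployment is live:
# with urllib.request.urlopen(schema_url) as resp:
#     print(sorted(json.load(resp)["paths"]))
print(docs_url)
```

The raw schema is handy for generating typed clients, while /docs is the quickest way to try requests by hand.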

Leveraging These Tools

The LangServe Playground and FastAPI documentation provide a robust set of tools for exploring, testing, and refining your AI API endpoints. They offer a hands-on approach to understanding how your AI models perform under different scenarios and how they can be integrated into broader applications. Whether you’re fine-tuning prompts in the Playground or diving into the technical details with the FastAPI docs, these resources are designed to enhance your development workflow and help you ship faster 🚀.

Next Steps and Resources

With your LangServe AI endpoint now deployed, the possibilities are endless. Explore other LangServe templates, join the LangChain community for support, and don’t hesitate to experiment with your newly acquired skills.

Resources:

  1. LangChain.com
  2. LangSmith.com
  3. LangServe
  4. LangChain Discord
  5. Be Happy and Learn - follow these X Accounts
    @GitMaxd
    @LangChainAI
    @hwchase17
    @Hacubu

If you’ve found this article useful, give me a like and a follow @ https://x.com/gitmaxd — I follow back!


Rick Garcia

Linux Dev | AI Revolution Early Adopter | Love ReactJS/Next.js | Founder 3x startups (+$15mm) | Currently building amazing things at http://Slicie.com