SpaCy + LangChain: A Powerful Duo for Your NLP Projects

FS Ndzomga · Published in MLearning.ai · 7 min read · Sep 5

Photo by NASA on Unsplash

If you’ve been passionate about NLP since before 2022, you’ve undoubtedly heard of SpaCy; you might even have built projects with this powerful library. If you joined the community after 2022 with the emergence of ChatGPT, you’ve likely heard of LangChain as well. What if I told you that you can now combine the power of SpaCy with the versatility of LangChain?

Sounds fabulous, doesn’t it?

In this article, I’ll show you how to take advantage of these two libraries through spacy-llm, a new SpaCy package that leverages the power of the latest Large Language Models (LLMs) in its pipelines.

The spacy-llm package integrates LLMs into SpaCy pipelines, offering a modular system for rapid prototyping and prompting, and converting unstructured responses into robust outputs for various NLP tasks — no training data required.

It features:

- a serializable llm component for integrating prompts into your pipeline;
- modular functions to define the task (prompting and parsing) and the model to use;
- support for hosted APIs and self-hosted open-source models;
- integration with LangChain;
- access to the OpenAI API, including GPT-4 and various GPT-3 models;
- built-in support for various open-source models hosted on Hugging Face;
- usage examples for standard NLP tasks such as Named Entity Recognition and Text Classification;
- easy implementation of your own functions via the registry for custom prompting, parsing, and model integrations.
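To make this concrete, here is a minimal sketch of a spacy-llm config for an LLM-powered NER component. The registry names (spacy.NER.v2, spacy.GPT-3-5.v1) and labels are illustrative and tied to the spacy-llm releases current at the time of writing; check the documentation for your installed version:

```ini
# config.cfg — an NER pipeline whose entities come from an LLM,
# with no training data required
[nlp]
lang = "en"
pipeline = ["llm"]

[components]

[components.llm]
factory = "llm"

[components.llm.task]
@llm_tasks = "spacy.NER.v2"
labels = ["PERSON", "ORG", "LOC"]

[components.llm.model]
@llm_models = "spacy.GPT-3-5.v1"
```

With OPENAI_API_KEY set in your environment, you can then load this with spacy_llm.util.assemble("config.cfg") and call the resulting pipeline on text like any other spaCy pipeline.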

But why should you care? Why not just use LLMs like everyone else?

Large Language Models (LLMs) provide strong natural language understanding and can handle a variety of custom NLP tasks with few or no examples. However, supervised learning models generally beat LLMs in a production environment in terms of efficiency, reliability, control, and accuracy.

The spacy-llm package allows you to combine the strengths of both LLMs and supervised models in a single pipeline. You can start with LLM-driven components for quick prototyping and then replace them with supervised learning models for production-level tasks as needed.
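The swap itself can happen at the config level: the LLM-backed component and its supervised replacement occupy the same pipeline slot, so the surrounding code doesn’t change. A hypothetical sketch (the model name en_core_web_sm stands in for whatever trained pipeline you end up using):

```ini
# Prototyping: the entity-recognition slot is filled by an
# LLM component from spacy-llm (task and model blocks as above)
[components.llm]
factory = "llm"

# Production: the same slot is later sourced from a trained,
# supervised pipeline instead
[components.ner]
source = "en_core_web_sm"
```

Because both variants expose the same Doc annotations (e.g. doc.ents for NER), downstream code that consumes the pipeline’s output is unaffected by the switch.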


FS Ndzomga is an engineer passionate about data science, startups, product management, philosophy and French literature. He built https://www.rimbaud.ai