Llama 3.1 tutorials for beginners

Code examples, LangChain integration, Ollama, and more

Mehul Gupta
Data Science in your pocket

--

Photo by Paul Lequay on Unsplash

The biggest news of the hour: Meta’s fully open-source LLM, Llama 3.1, is out, and it’s out with a bang!

We will quickly walk through the following points in this post:

Llama 3.1 features

How to use Llama 3.1?

Llama 3.1 using LangChain

Llama 3.1 using Ollama

Llama 3.1 multi-modal capabilities

Without wasting time, let’s summarize some of its key features:

  • Largest Openly Available Model: Llama 3.1 405B is the world’s largest and most capable openly available foundation model, with over 300 million total downloads across all Llama versions.
  • State-of-the-Art Capabilities: The model excels in general knowledge, steerability, math, tool use, and multilingual translation, rivaling top AI models like GPT-4 and Claude 3.5 Sonnet.
  • Upgraded Smaller Models: Includes upgraded versions of the 8B and 70B models with a significantly longer context length of 128K, improved multilingual capabilities, and stronger reasoning.
  • Extensive Evaluations: Evaluated on over 150 benchmark datasets and extensive human evaluations, showing competitive performance across a range of tasks.
  • Optimized Training: Trained on over 15 trillion tokens using over 16,000 H100 GPUs, with significant optimization of the training stack.
  • Quantization for Efficiency: Models are quantized from 16-bit to 8-bit numerics, reducing compute requirements and enabling the model to run on a single server node.
  • Instruction and Chat Fine-Tuning: Enhanced for detailed instruction-following and high levels of safety in responses. Instruct models are available for each of the three sizes.
  • Open Source and Customizable: Available for download and development on platforms like Hugging Face, with a license allowing the use of outputs to improve other models. You just need to get approval first (it’s a gated model).

My debut book, LangChain in your Pocket, is out now.

Below are some key tutorials to help you get started.

How to use Llama 3.1?

The tutorial below explains how to use Llama 3.1 in Python and build basic applications.
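As a quick starting point, here is a minimal sketch of loading Llama 3.1 8B Instruct with Hugging Face transformers. It assumes you have been granted access to the gated meta-llama repository, are logged in via `huggingface-cli login`, and have a recent transformers version installed; adjust the model id and generation settings to your setup.

```python
# Minimal sketch: running Llama 3.1 8B Instruct with Hugging Face transformers.
# Assumes access to the gated meta-llama repo has been approved and you are
# logged in to the Hugging Face Hub.
import torch
from transformers import pipeline

model_id = "meta-llama/Meta-Llama-3.1-8B-Instruct"

pipe = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,  # halves memory vs float32
    device_map="auto",           # put layers on GPU(s) if available
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain Llama 3.1 in one sentence."},
]

outputs = pipe(messages, max_new_tokens=128)
# The pipeline returns the full chat; the last message is the model's reply.
print(outputs[0]["generated_text"][-1]["content"])
```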

Llama 3.1 with LangChain

LangChain, one of the most popular frameworks for Generative AI applications, also provides an integration for this model. The code walkthrough for the Llama 3.1 integration with LangChain can be found below.
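For reference, here is a minimal sketch of wiring Llama 3.1 into a LangChain chain. It assumes the model is being served locally through Ollama (pulled with `ollama pull llama3.1`) and that the `langchain-community` and `langchain-core` packages are installed; check your LangChain version, as class locations occasionally move between releases.

```python
# Minimal sketch: calling Llama 3.1 through LangChain's Ollama chat integration.
# Assumes an Ollama server is running locally with the llama3.1 model pulled.
from langchain_community.chat_models import ChatOllama
from langchain_core.prompts import ChatPromptTemplate

llm = ChatOllama(model="llama3.1", temperature=0)

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a concise assistant."),
    ("human", "{question}"),
])

# LCEL: pipe the prompt template into the model to form a simple chain
chain = prompt | llm

response = chain.invoke({"question": "What is new in Llama 3.1?"})
print(response.content)
```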

How to chat with Llama 3.1 (chat UI)?

All three models are available to chat with for free on Hugging Face Spaces. The demo below shows how to chat with Llama 3.1 without paying a penny.

Llama 3.1 using Ollama

Ollama has been my go-to tool for offline LLM chatting. Llama 3.1 is now available on Ollama as well. Check out how it can be run offline on a local system using Ollama.
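As a quick taste, here is a minimal sketch using the official `ollama` Python client. It assumes the Ollama daemon is installed and running, the `ollama` package is installed (`pip install ollama`), and the weights have been pulled once with `ollama pull llama3.1` (Ollama's default tag for the 8B model).

```python
# Minimal sketch: chatting with Llama 3.1 fully offline via the Ollama Python client.
# Prerequisite: the Ollama daemon is running and `ollama pull llama3.1` has been run once.
import ollama

response = ollama.chat(
    model="llama3.1",  # Ollama's default tag points to the 8B model
    messages=[
        {"role": "user", "content": "Summarize Llama 3.1 in two lines."},
    ],
)
print(response["message"]["content"])
```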

Llama 3.1 multimodal capabilities

You can also try Llama 3.1 in the Meta.ai playground and test its multimodal capabilities.

I hope you try out Llama 3.1, as it looks very promising. For starters, you can go with the 8B-Instruct model, which should run on most systems. The performance of even the smallest model is pretty good.

With this, it’s a wrap!
