Build Your Own Assistant with Laravel Livewire — using your Data and Streamed Content

Stefano Novelli
6 min read · Aug 10, 2024


Have you ever dreamed of having your very own AI assistant? You know, like Jarvis from Iron Man or Alexa, but tailored specifically to your needs? Well, the good news is, with the power of Laravel Livewire, you can turn this dream into reality.

Copyright Berin Holy

You don’t need to be a coding wizard or have a PhD in computer science. All you need is a basic understanding of Laravel and a dash of enthusiasm. By the end of this tutorial, you’ll have a functional AI assistant capable of using your data, all crafted by your own hands.

Thanks to Claudio Emmolo, my co-dev at 3Labs, for contributing!

So, why Laravel Livewire? It’s perfect for building real-time, interactive applications with less boilerplate code. Livewire bridges the gap between server-side and client-side programming, making it easier to manage your application’s state and create a smooth user experience. And, of course, it lets you expose your own data to the web.

Get ready to dive into the world of AI, enhance your coding skills, and impress your friends and colleagues with your very own AI assistant. Let’s get started!

This is a recap of how everything works

Demo Repository

I’ve created a fully functional demo that you can use as a reference or even clone and experiment with!

👉 Open the repository

Requirements

  • Laravel 10.x or greater
  • OpenAI account (get your API key)

First of all, we are going to install the OpenAI Laravel Package in our Laravel project:

composer require openai-php/laravel

Then follow the installation procedure explained in the official repo:

php artisan openai:install

and configure the previously created OpenAI key in our .env:

OPENAI_API_KEY=sk-…

Also remember to install the Livewire v3 package:

composer require livewire/livewire

To make it look nice we will install TailwindCSS and use it for the frontend part. Feel free to skip this step; styling isn’t the focus of this tutorial anyway.

Step 1: Educate your Assistant

When we talk about “educating” an AI assistant, we’re referring to the process of providing it with a set of files or documents that it will use to improve its responses.

Essentially, you upload these files into the system, and the assistant “reads” and analyzes them, learning from the information they contain. This enables the assistant to answer your questions more accurately and relevantly, drawing on the knowledge present in the files you’ve provided.

It’s similar to giving a human assistant a manual or guides: the more relevant information they have, the better they can assist you.

1.1 Prepare your data

In order to demonstrate the operation of our assistant, we will prepare the following:

  • A migration for the fruits table
  • A Fruit model
  • A FruitSeeder seeder to create initial data

In addition, you can already create the OpenAI Assistant that will work with your data. Take note of your assistant_id.
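
For reference, here is a minimal sketch of what the FruitSeeder could look like. The columns (name, color, description) are purely illustrative assumptions, so adapt them to your own schema and make sure they are fillable on the Fruit model.

<?php

// database/seeders/FruitSeeder.php (sketch only)

namespace Database\Seeders;

use App\Models\Fruit;
use Illuminate\Database\Seeder;

class FruitSeeder extends Seeder
{
    public function run(): void
    {
        // Hypothetical columns: adjust to your real fruits table.
        $fruits = [
            ['name' => 'Apple', 'color' => 'red', 'description' => 'Crisp and sweet.'],
            ['name' => 'Banana', 'color' => 'yellow', 'description' => 'Rich in potassium.'],
            ['name' => 'Kiwi', 'color' => 'green', 'description' => 'Tart and packed with vitamin C.'],
        ];

        foreach ($fruits as $fruit) {
            Fruit::create($fruit);
        }
    }
}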

1.2 File Search

We will use the File Search tool provided by OpenAI. There are some things to remember:

  • Only certain file types are supported. Prefer Markdown (.md) for long texts and JSON (.json) for structured data (we will use the latter).
  • Avoid PDF: it is heavier and more expensive than other formats.
  • Don’t bypass the file format limitations with Code Interpreter; convert your data as described above.

1.3 Prepare your command

I’ll create a command called AiAssistantCommand with:

php artisan make:command AiAssistantCommand

It looks like this:

<?php

namespace App\Console\Commands;

use App\Models\Fruit;
use Illuminate\Console\Command;
use Illuminate\Support\Facades\Storage;
use OpenAI\Laravel\Facades\OpenAI;

class AiAssistantCommand extends Command
{
    /**
     * The name and signature of the console command.
     *
     * @var string
     */
    protected $signature = 'ai-assistant:educate';

    /**
     * The console command description.
     *
     * @var string
     */
    protected $description = 'Make the world a better place with fruits!';

    /**
     * Execute the console command.
     */
    public function handle()
    {
        $fileName = 'fruits.json';

        $this->info('Converting your fruits to JSON...');
        $fruits = Fruit::all()->toJson();

        $this->info('Saving fruits to fruits.json...');
        Storage::disk('local')->put($fileName, $fruits);

        $this->warn('Educating the AI assistant...');
        $assistant = OpenAI::assistants()->retrieve(config('openai_assistant.id'));

        $this->warn('Deleting existing vector stores and files...');
        if (isset($assistant->toolResources->fileSearch->vectorStoreIds)) {
            foreach ($assistant->toolResources->fileSearch->vectorStoreIds as $vectorStoreId) {
                OpenAI::vectorStores()->delete($vectorStoreId);
            }
        }

        $files = OpenAI::files()->list(['purpose' => 'assistants']);
        foreach ($files->data as $file) {
            if ($file->filename === $fileName) {
                OpenAI::files()->delete($file->id);
            }
        }

        $this->warn('Uploading fruits.json to OpenAI...');
        $file = OpenAI::files()->upload([
            'purpose' => 'assistants',
            'file' => fopen(Storage::disk('local')->path($fileName), 'rb'),
        ]);

        $this->warn('Creating a new vector store...');
        $vectorStore = OpenAI::vectorStores()->create([
            'file_ids' => [
                $file->id,
            ],
            'name' => 'fruits.vector.json',
        ]);

        $this->warn('Modifying the AI assistant...');
        $response = OpenAI::assistants()->modify(config('openai_assistant.id'), [
            'tool_resources' => [
                'file_search' => [
                    'vector_store_ids' => [
                        $vectorStore->id,
                    ],
                ],
            ],
        ]);

        $this->info('AI assistant has been educated!');

        return 0;
    }
}

Note: we are using a custom openai_assistant config file for storing the id. You can also define an OPENAI_ASSISTANT_ID environment variable with your asst_id…
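
A minimal sketch of that config file, assuming the id comes from an OPENAI_ASSISTANT_ID environment variable (the key names here are just an example):

<?php

// config/openai_assistant.php (sketch: key and env variable names are assumptions)

return [
    // The id of the Assistant created in the OpenAI dashboard (asst_...)
    'id' => env('OPENAI_ASSISTANT_ID'),
];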

1.4 Schedule it

Since you cannot educate the assistant in real time, plan ahead for when to run the sync. For example, you can use Observers to react when your data is updated, or schedule the command to run every hour/day/week.

This depends on how often the files change, how heavy they are, and so on. Here are some ideas:

  • If you don’t need real-time updates, take advantage of a Schedule (maybe at night, so you put less load on the server), as sketched below.
  • If you want something closer to real time, take advantage of an Observer (but be careful about the amount of data you are loading), or whatever fits your case.
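
For the scheduled approach, here is a minimal sketch using the Laravel 10 console Kernel (the time chosen is arbitrary):

<?php

// app/Console/Kernel.php (sketch)

namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    protected function schedule(Schedule $schedule): void
    {
        // Re-educate the assistant every night, when traffic is low.
        $schedule->command('ai-assistant:educate')->dailyAt('03:00');
    }
}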

For a manual/test sync, you can simply use:

php artisan ai-assistant:educate

Before continuing, you can try your assistant in the OpenAI Playground: after the command runs, you’ll be able to chat with it there and check that it answers using your data.

Step 2: Stream your data

Now it’s the time to replicate the chat UI in your own project.

Keep in mind that this demo lacks some features, such as message history, flushing messages, and other controls. It is only a demo to start a project from; do not use it in production as-is.

Let’s create our Livewire component:

php artisan livewire:make AiAssistant

Next, prepare the view and the class. Here are some key concepts:

public function setThread(array $parameters = [])
{
    $thread = OpenAI::threads()->create($parameters);

    return $thread->id;
}

You need to create a thread for each chat session. Save the thread id in a property so you can reuse it in the component.
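
Here is a minimal sketch of how the component state could be wired together. The property names ($threadId, $inputMessage, $messages) are assumptions, not necessarily the exact ones used in the demo:

<?php

namespace App\Livewire;

use Livewire\Component;
use OpenAI\Laravel\Facades\OpenAI;

class AiAssistant extends Component
{
    // Sketch: property names are assumptions, adapt them to your own component.
    public string $threadId = '';
    public string $inputMessage = '';
    public array $messages = [];

    public function mount(): void
    {
        // One thread per chat session.
        $this->threadId = $this->setThread();
    }

    public function setThread(array $parameters = []): string
    {
        $thread = OpenAI::threads()->create($parameters);

        return $thread->id;
    }

    public function render()
    {
        return view('livewire.ai-assistant');
    }
}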

public function sendMessage(): static
{
    OpenAI::threads()->messages()->create($this->threadId, [
        'role' => 'user',
        'content' => $this->inputMessage,
    ]);

    $this->streamAiResponse($this->inputMessage);

    return $this;
}

The sendMessage() function is triggered when the chat form is submitted. After adding the user message to the thread, it calls the streamed response.

public function streamAiResponse($message)
{
    $stream = \OpenAI\Laravel\Facades\OpenAI::threads()->runs()->createStreamed(
        threadId: $this->threadId,
        parameters: [
            'assistant_id' => config('openai_assistant.id'),
        ]);

    $streamResponse = '';

    foreach ($stream as $response) {
        if ($response->event == 'thread.message.delta') {
            $this->stream(to: 'answer', content: $response->response->delta->content[0]->text->value);
            $streamResponse .= $response->response->delta->content[0]->text->value;
        }

        if ($response->event == 'thread.run.completed') {
            $this->messages[] = [
                'role' => 'assistant',
                'content' => $streamResponse,
            ];
        }
    }
}

You need to loop over the stream and collect the delta messages as they arrive.

We are using the wire:stream directive from Laravel Livewire to stream data into our view. That’s awesome!
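
To give an idea of the view side, here is a minimal Blade sketch. The markup and Tailwind classes are assumptions; the only important details are that the form triggers sendMessage() and that the wire:stream target name matches the one passed to $this->stream():

{{-- resources/views/livewire/ai-assistant.blade.php (sketch) --}}
<div class="space-y-4">
    {{-- Previously completed messages --}}
    @foreach ($messages as $message)
        <div class="rounded p-2 {{ $message['role'] === 'assistant' ? 'bg-gray-100' : 'bg-blue-100' }}">
            {{ $message['content'] }}
        </div>
    @endforeach

    {{-- The answer currently being streamed via wire:stream --}}
    <div wire:stream="answer" class="rounded bg-gray-100 p-2"></div>

    {{-- Submitting the form triggers sendMessage() --}}
    <form wire:submit="sendMessage" class="flex gap-2">
        <input type="text" wire:model="inputMessage" class="flex-1 rounded border p-2" placeholder="Ask something about fruits..." />
        <button type="submit" class="rounded bg-blue-600 px-4 py-2 text-white">Send</button>
    </form>
</div>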

This is the final result:

Slowed down to 0.1x: this is our demo chat. Scaffolding by Porter-smith

That’s all folks!

If this article saved your life please:

  • Clap 50x
  • Follow me on Medium
  • Leave a feedback or comment

See you soon!
