Case Study: Solving Vercel’s 10-Second Limit with QStash

Kolby Sisk · Jan 23, 2024

Recently I ran into a common problem with Vercel: serverless functions have a timeout of only 10 seconds (300 seconds if you upgrade to the Pro plan). I had a process that took roughly 15 minutes, so I was obviously hitting this limit. I could have deployed a custom API on Heroku, which doesn't have these limits, but I came up with a much better solution. It required a shift in mindset: a shift to thinking serverlessly.
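For context, here is a rough sketch of how the limit can be raised per route on the Pro plan with Next.js route segment config (the route path is made up for illustration, and even the raised ceiling comes nowhere near 15 minutes):

// app/api/long-task/route.ts (hypothetical route, shown only for illustration)
export const maxDuration = 300; // seconds; the actual ceiling depends on your Vercel plan

export async function GET() {
  // ...long-running work would go here...
  return Response.json({ ok: true });
}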

TL;DR

I broke my function up into smaller serverless functions, and used a message queue (QStash) to execute the functions in parallel. Each function finishes in less than 10 seconds, and since they run in parallel, my 15-minute process now finishes in 20 seconds. 🤯

The initial architecture

To understand the solution, let’s look at the problem. The initial architecture had a single function.

async function createPostsWithSentiment(q: string) {
  const redditPosts = await reddit.search(q);
  const prompts = createPrompts(redditPosts);
  const completionPromises = prompts.map((prompt) => completion(openai, prompt));
  const completionResponses = await Promise.all(completionPromises);
  await savePostsWithSentiment(redditPosts, completionResponses);
}
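The helpers referenced here (createPrompts, completion, savePostsWithSentiment) aren't shown in the post. Here is a minimal sketch of what the first two might look like, assuming the openai Node SDK; the prompt wording and model are placeholders:

// Hypothetical sketches of the helpers referenced above, not the author's actual code
import OpenAI from 'openai';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

function createPrompts(redditPosts: { title: string; body: string }[]) {
  return redditPosts.map(
    (post) =>
      `Classify the sentiment of this Reddit post as positive, negative, or neutral:\n${post.title}\n${post.body}`
  );
}

async function completion(client: OpenAI, prompt: string) {
  const response = await client.chat.completions.create({
    model: 'gpt-3.5-turbo',
    messages: [{ role: 'user', content: prompt }],
  });
  return response.choices[0].message.content;
}

// savePostsWithSentiment would write the posts and their sentiment labels to whatever datastore the app uses.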

The function would:

  1. Search Reddit for posts
  2. Run sentiment analysis on each post with OpenAI
  3. Save the results

The initial architecture had all of this happening in a single function, and since Node is single threaded, the best we can optimize for is concurrency. Obviously, waiting for 100 OpenAI requests to finish was going to take a while on a single thread.

The new architecture

After giving it some thought, I had an aha moment: I wasn't using the power of serverless. Serverless functions are on-demand, and each one runs in its own independent execution environment. This means that instead of trying to cram all my processing into one function, I could break it down into smaller, more manageable units. Each of these units could then be executed in parallel, not only circumventing the 10-second execution limit but also enhancing overall performance and scalability. It was a game-changer in how I approached my application's architecture.

async function createPostsWithSentiment(q: string) {
  const redditPosts = await reddit.search(q);

  for (const redditPost of redditPosts) {
    // For each post, queue a request to my API route with QStash
    await fetch(`${qstashUrl}${appUrl}/api/analyze-sentiment-and-save`, {
      method: 'POST',
      body: JSON.stringify({ redditPost }),
      headers: {
        Authorization: `Bearer ${qstashToken}`,
        'Upstash-Retries': '3',
      },
    });
  }
}
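The qstashUrl, appUrl, and qstashToken values aren't defined in the snippet above. Assuming they come from environment variables, they would look roughly like this (APP_URL is a made-up variable name; the QStash v2 publish endpoint is the real URL that gets prefixed to the destination):

// Hypothetical definitions for the values used in the fetch call above
const qstashUrl = 'https://qstash.upstash.io/v2/publish/'; // QStash publish endpoint, followed by the destination URL
const appUrl = process.env.APP_URL!;                       // e.g. https://my-app.vercel.app
const qstashToken = process.env.QSTASH_TOKEN!;             // token from the Upstash console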
// /app/api/analyze-sentiment-and-save/route.ts

import { NextRequest, NextResponse } from 'next/server';
import { verifySignatureAppRouter } from '@upstash/qstash/dist/nextjs';

async function handler(req: NextRequest) {
  const { redditPost } = await req.json();
  const prompt = createPrompt(redditPost);
  const completionResponse = await completion(openai, prompt);
  await savePostWithSentiment(redditPost, completionResponse);
  return NextResponse.json({ success: true });
}

// Only accept requests that were actually signed by QStash
export const POST = verifySignatureAppRouter(handler);
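verifySignatureAppRouter checks the Upstash-Signature header on each request, so the route only processes messages that actually came from QStash. It reads its keys from environment variables that you copy out of the Upstash console, roughly like this:

# .env.local (values come from the Upstash console)
QSTASH_TOKEN=...
QSTASH_CURRENT_SIGNING_KEY=...
QSTASH_NEXT_SIGNING_KEY=...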

With this simple change, the process went from taking 15 minutes to complete down to 20 seconds, thanks to the power of serverless and parallel processing.

In my code example you'll notice that I'm using a message queue to manage calling my serverless functions. While I could have called my function directly, I decided to use a message queue to handle the minutiae that comes with executing a serverless function. I did extensive research and tried a few solutions, and the best option I found, hands down, was QStash.

QStash is designed for serverless runtimes and handles the details of gluing serverless functions together, such as automatically retrying failed deliveries (note the Upstash-Retries header above), delaying or scheduling messages, and signing requests so your route can verify they really came from the queue.
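As an aside, QStash also ships a TypeScript client, so the raw fetch call above could be written with the SDK instead. A rough sketch, where APP_URL is a hypothetical environment variable holding the deployed app's base URL:

import { Client } from '@upstash/qstash';

const qstash = new Client({ token: process.env.QSTASH_TOKEN! });

// Equivalent to the raw fetch call above: publish one message per Reddit post
async function enqueueSentimentJob(redditPost: unknown) {
  await qstash.publishJSON({
    url: `${process.env.APP_URL}/api/analyze-sentiment-and-save`,
    body: { redditPost },
    retries: 3,
  });
}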

Another benefit is that QStash is just one of the awesome products offered by Upstash, and if I ever need Redis or Kafka features it would all integrate together nicely. Not to mention it’s incredibly affordable at 500 messages a day for free, and then just $1 for 100K additional messages.

Conclusion

The journey to solve Vercel’s 10-second limit led to an innovative and efficient solution leveraging serverless architecture and QStash’s powerful message queue capabilities. This case study highlights the importance of thinking outside the box and embracing serverless technologies to overcome limitations and enhance application performance. By breaking down a lengthy process into smaller, parallel-executing functions, I not only bypassed the time constraints but also significantly reduced the total processing time from 15 minutes to an astonishing 20 seconds.
