Discovering Niche Ideas for Micro SaaS using CrewAI and Groq with Llama3

Tuan Truong
7 min read · May 17, 2024


The first thing creators often struggle with is finding the right niche idea for their services. A good place to start is Micro SaaS, a subset of SaaS that focuses on small-scale, highly specialized applications solving specific problems.

But how do you discover these niche ideas? Why not let AI do that for you? In this experiment, I have put together a practical application that digs through Reddit and search engines to find niche ideas.

In this blog post, I will share how to build a CrewAI application that generates niche ideas for Micro SaaS. We’ll delve into the application structure and the tech stack, and provide a step-by-step guide to building the application. By the end, you’ll have a comprehensive understanding of how to kickstart your own CrewAI agents.

Application Structure

To get started, let’s take a look at the application structure. Below is a diagram that outlines the key components of our application:
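In short: a Streamlit front end collects the subreddits to analyze; a CrewAI crew of three agents (a niche analyst, a competitor analyst, and a feature analyst) researches them through Reddit, Google Trends, and DuckDuckGo tools; and Groq serves the Llama3 model that powers every agent.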

Tech Stack Overview

To build this application, we’ll be using a variety of tools and technologies:

  • Groq: For serving the large language model (LLM).
  • 8B Llama3 Model: The core model used for generating ideas.
  • CrewAI: Utilized with LangChain tools for managing multiple agents.
  • Praw: For scraping data from Reddit.
  • Pytrends: For accessing Google Trends data (a quick usage sketch follows this list).
  • DuckDuckGo: For competition analysis.
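
Most of these appear in code later in the post; Pytrends is the one that stays behind the scenes, so here is a minimal sketch of what it provides (the keyword and timeframe are illustrative):

from pytrends.request import TrendReq

# Connect to Google Trends (hl: host language, tz: timezone offset in minutes)
pytrends = TrendReq(hl="en-US", tz=360)

# Interest for an illustrative keyword over the last 12 months
pytrends.build_payload(["micro saas"], timeframe="today 12-m")

# interest_over_time() returns a pandas DataFrame of weekly interest scores
print(pytrends.interest_over_time().tail())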

Step-by-Step Guide to Building the Application

1. Setting Up the Environment

First, we need to install the necessary libraries and tools. Note that the Llama3 model itself is served by Groq, so there is no model package to install locally. Here’s how you can set up your development environment:

pip install crewai crewai-tools langchain-groq praw pytrends duckduckgo-search streamlit
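
The code in the rest of this post reads its credentials from Streamlit secrets. If you are following along, create a .streamlit/secrets.toml with the keys used later (the values below are placeholders):

GROQ_API_KEY = "your-groq-api-key"
REDDIT_CLIENT_ID = "your-reddit-client-id"
REDDIT_CLIENT_SECRET = "your-reddit-client-secret"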

2. Setting Up the LLM

import streamlit as st
from langchain_groq import ChatGroq

# Set up the customization options in the sidebar
st.sidebar.title('Customization')
model = st.sidebar.selectbox(
    'Choose a model',
    ['llama3-8b-8192', 'mixtral-8x7b-32768', 'gemma-7b-it', 'llama3-70b-8192']
)

# Groq serves the selected model; the API key comes from Streamlit secrets
llm = ChatGroq(
    temperature=0,
    groq_api_key=st.secrets["GROQ_API_KEY"],
    model_name=model
)
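
Before wiring up any agents, it’s worth a quick sanity check that the Groq connection works. ChatGroq is a standard LangChain chat model, so you can invoke it directly (this assumes GROQ_API_KEY is set in your Streamlit secrets):

# Quick smoke test: one direct call to the selected model
response = llm.invoke("In one sentence, what is a Micro SaaS?")
st.write(response.content)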

3. Creating Custom Tools

Next, we’ll create custom tools needed for our application. Below is an example of a custom tool for scraping Reddit using Praw:

import json
import os
import praw
from datetime import datetime
from helper_tools import remove_emojis
from crewai_tools import BaseTool
import streamlit as st


class RedditTrends(BaseTool):
    name: str = "Reddit Trends"
    description: str = "Fetches the latest trends from our favorite subreddits."

    def _run(self, subreddits=None) -> dict:
        """
        Calls the Reddit API to scrape the top posts and their best comments from specified subreddits.

        Parameters:
            subreddits (list of str): Optional; a list of subreddit names to scrape. If not provided,
                the function defaults to scraping posts from 'Startup_Ideas', 'startups', and
                'Entrepreneur'. A maximum of three subreddits can be specified at a time.

        Returns:
            dict: A dictionary where each key is a subreddit and the value is a list of the top posts
                from that subreddit, each post accompanied by its top comments.

        Notes:
            Ensure that the subreddit names are correctly spelled and exist on Reddit. The function is
            limited to scraping no more than three subreddits at once to maintain performance and
            adhere to API usage guidelines.
        """
        return self.scrape_reddit(subreddits)

    def scrape_reddit(self, subreddits=None):
        """Does the actual scraping; see `_run` for parameter and return details."""
        print("Scraping Reddit for the latest trends...")
        # Set up credentials
        reddit = praw.Reddit(
            client_id=st.secrets["REDDIT_CLIENT_ID"],
            client_secret=st.secrets["REDDIT_CLIENT_SECRET"],
            user_agent="aiquill by /u/tuantruong84",
        )

        # Start with these subreddits by default
        if subreddits is None:
            subreddits = ["Startup_Ideas", "startups", "Entrepreneur"]

        if len(subreddits) > 3:
            raise Exception("Maximum of 3 subreddits at a time.")

        print(f"Scraping Reddit for the latest trends from {subreddits}...")
        max_amount_of_posts = 3

        scraped_reddit_data = {}
        for subreddit in subreddits:
            sub = reddit.subreddit(subreddit)

            for post in sub.hot(limit=max_amount_of_posts):
                posts = {
                    "title": remove_emojis(post.title),
                    "url": post.url,
                    "score": post.score,
                    # "description": post.selftext,
                    "comments": [],
                    "created": datetime.fromtimestamp(post.created_utc).strftime("%Y-%m-%d %H:%M:%S"),
                }

                try:
                    # Flatten the comment tree and keep the first five comments
                    post.comments.replace_more(limit=0)
                    comments = post.comments.list()[:5]

                    for comment in comments:
                        posts["comments"].append(remove_emojis(comment.body))

                    scraped_reddit_data.setdefault(sub.display_name, []).append(posts)

                except praw.exceptions.APIException as e:
                    print(f"API exception occurred: {e}")

        print("Scraping done!", scraped_reddit_data)
        return scraped_reddit_data
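
One note: remove_emojis comes from the author’s helper_tools module, which isn’t shown in the post. A minimal stand-in, assuming all it needs to do is strip emojis and other non-ASCII symbols before the text reaches the LLM, could look like this:

import re

def remove_emojis(text: str) -> str:
    # Simplified stand-in for the author's helper: drop emojis and
    # other non-ASCII symbols so the LLM receives plain text.
    return re.sub(r"[^\x00-\x7F]+", "", text)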

4. Creating Agents

Agents play a crucial role in our application. Here’s how you can create an agent using CrewAI:

from crewai import Agent, Task, Crew, Process
from langchain_community.tools import DuckDuckGoSearchRun

# Instantiate our custom Reddit tool and LangChain's DuckDuckGo search tool
reddit_trends = RedditTrends()
search_tool = DuckDuckGoSearchRun()

niche_analyst = Agent(
    role="Niche Analyst",
    goal="Find inspiring SaaS ideas from specified subreddits",
    backstory="""You are an AI tasked with continuously monitoring specified subreddits to identify trending discussions around SaaS ideas.
    Your discoveries will lay the groundwork for further market analysis and MVP feature recommendations.""",
    tools=[reddit_trends],  # Our custom RedditTrends tool from the previous step
    verbose=True,
    allow_delegation=False,
    max_iter=3,
    llm=llm,
)

# Competitor Analysis Agent for identifying similar SaaS products
competitor_analyst = Agent(
    role="Competitor Analyst",
    goal="Identify existing competitors for the trending SaaS ideas, and analyze their strengths and weaknesses",
    backstory="""You dive deep into the web to find existing SaaS solutions that match the ideas found. Your research helps in understanding the competitive landscape, highlighting the potentials.""",
    tools=[search_tool],  # DuckDuckGo search tool for web research
    llm=llm,
)

# Feature Analyst Agent for MVP feature suggestions
feature_analyst = Agent(
    role="Feature Analyst",
    goal="Suggest potential features for the MVP based on the compiled analysis",
    backstory="""With the insights provided by the Market and Competitor Analysts, you suggest a possible feature set for the MVP. Your goal is to craft a compelling value proposition for future development.""",
    llm=llm,
    verbose=True,
    allow_delegation=True,
)

5. Creating Tasks

Tasks are essential for organizing the workflow. Here’s how we define the three tasks that drive the crew:

# Task for the Niche Analyst to scrape trending SaaS ideas from specified subreddits
niche_analysis_task = Task(
    description=f"""Based on these subreddits: {subreddit}.
    Scrape the specified subreddits for trending discussions around SaaS ideas. Focus on identifying emerging trends, popular discussions, and the most engaging content related to SaaS products.
    """,
    expected_output="""
    A maximum of 10 SaaS ideas, each containing the specific idea, the problem, the solution, and a brief overview of the discussion around it.
    This list will serve as the foundation for further analysis and should be concise and easy to follow.
    """,
    agent=niche_analyst,
    async_execution=False,
)

# Task for the Competitor Analyst to conduct an in-depth analysis of existing solutions
competitor_analysis_task = Task(
    description="""
    Conduct a detailed analysis of existing competitors for the SaaS ideas.
    """,
    expected_output="""
    A concise competitor analysis for each idea, listing major competitors, their key features, and pricing. Highlight any gaps, opportunities for innovation, or problems to solve.
    """,
    agent=competitor_analyst,
    async_execution=False,
    context=[niche_analysis_task],
)

# Task for the Feature Analyst to outline potential MVP features
mvp_feature_suggestion_task = Task(
    description="""
    Based on the comprehensive analysis provided by the trend and competitor analyses, suggest potential features for each idea. Focus on unique selling points and core functionalities, written in a concise format.
    """,
    expected_output="""
    A report on the top SaaS ideas with a brief description and market potential, and suggested MVP features for each idea. The report should be in formatted markdown.
    """,
    agent=feature_analyst,
    async_execution=False,
    context=[competitor_analysis_task],
)

6. Integrating Everything into an AI Crew

Now, let’s combine our custom tools, agents, and tasks into an AI crew:

crew = Crew(
    agents=[niche_analyst, competitor_analyst, feature_analyst],
    tasks=[niche_analysis_task, competitor_analysis_task, mvp_feature_suggestion_task],
    verbose=2,
    process=Process.sequential,
    full_output=True,
)
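
Because the process is sequential, the tasks run in order and the context links we set earlier hand each task’s output to the next: niche ideas first, then competitor analysis, then MVP features. Setting full_output=True tells the crew to return the output of every task rather than only the final one.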

7. Kicking Off the AI Crew

result = ""
result_container = st.empty()
for delta in crew.kickoff():
    result += delta  # Assuming delta is a string; if not, convert it appropriately
    result_container.markdown(result)

8. Example Scenarios and Use Cases

  • Scenario 1: Discovering niche ideas for productivity tools (see the snippet after this list).
  • Scenario 2: Identifying gaps in the market for educational apps.
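
Steering the crew toward either scenario is just a matter of changing which subreddits feed the niche analysis task. A hypothetical input for Scenario 1 (the subreddit names are illustrative; any three communities will do):

# Illustrative subreddit picks for Scenario 1 (productivity tools);
# swap in education-focused communities for Scenario 2.
subreddit = ["productivity", "Notion", "ObsidianMD"]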

9. Initial Results and Observations

Upon running the application, you’ll notice a list of niche ideas generated based on the data collected. These ideas can serve as a starting point for your Micro SaaS venture.

Surprising End Results

One of the most surprising results was the identification of a niche market for mental health apps tailored for remote workers. This insight opens up new possibilities for developing specialized applications that cater to this growing demographic.

Potential and Ideas for Future Improvements

  • Enhanced Data Processing: Implementing more advanced data processing techniques to improve the quality of generated ideas.
  • User Feedback Loop: Incorporating user feedback to refine and improve the idea generation process.

Sharing the Application

Our application is open source, and you can access it on GitHub (link below). Feel free to explore, use, and contribute to the project.

Conclusion

In this blog post, we’ve explored how CrewAI and Llama3 can be used to generate niche ideas for Micro SaaS. From setting up the environment to integrating various components, we’ve covered everything you need to get started. The potential of this application is immense, and we encourage you to dive in, explore, and contribute to this exciting open-source project.

Link to the live version: https://hiddengem.streamlit.app/

Link to GitHub: https://github.com/Ai-Quill/hiddengem

If you are interested in more AI tutorials and insights, you can follow me on X at https://x.com/tuantruong.

Happy coding!


Tuan Truong

Co-founder at a health tech company, writer, dreamer, wanderer, father, finding small beauty in the everyday.