Modern Full-Stack Web3 Development (For 2024)

Glen
16 min read · Dec 21, 2023


Introduction

Full-stack development in web3 is still in its infancy compared to traditional web development. Since crypto platforms don’t use the traditional database model (users interact with a database backend via a REST API etc), web2-oriented tooling is insufficient for web3, and our ecosystem has to reinvent certain wheels to achieve a unified and interoperable testing and production environment. Until now, mature frameworks haven’t existed — we’ve relied on piecemeal and ad hoc libraries that often have no typing or different typing for crypto primitives like bytes and bigints.

Do we convert uint256s to strings, convert them to native JavaScript bigints, or wrap them as BigNumberish? Do we standardize on storing addresses in databases as their checksummed value or as lowercase? Do we spin up a local forked chain or test on Goerli - and how do we index a locally forked chain so we can write smart contracts and send transactions in a development environment? No practical consensus has been reached on these questions. So, if the frontend library and backend library we pick chose different answers and don't play well together, we eventually have to fight our codebase to get things working.
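As a concrete example of the sort of mismatch I mean (a small sketch using viem, which we'll adopt later - it standardizes on native bigints for uint256 values and on checksummed addresses):

import { getAddress, parseEther } from "viem";

// viem hands you uint256 values as native bigints...
const fee: bigint = parseEther("0.001"); // 1000000000000000n

// ...whereas an ethers.js v5 codebase wraps the same value in a BigNumber,
// and a REST backend might serialize it as a string.
// Addresses have the same problem: getAddress() returns the EIP-55
// checksummed form, while many databases store them lowercased.
const sender = getAddress("0xf39fd6e51aad88f6f4ce6ab8827279cfffb92266");
// => "0xf39Fd6e51aad88F6F4ce6aB8827279cffFb92266"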

I’ll present a somewhat-opinionated approach to modern (as of late-2023) web3 full-stack development. I’ll demonstrate how to fit together smart contract development, frontend development, and indexing in an ergonomic way. For this, we’ll be using foundry for the smart contract development framework, anvil to spin up local testnets for testing, wagmi, viem, and react/nextjs for frontend dev, ponder for indexing, and rivet for a development-oriented wallet.

The Project

We’ll create a dApp that lets users increment a counter by sending an increment(uint8) transaction. They can increment by any number from 1 through 10, view every transaction that has been sent, and see the counter's current value.

Each transaction will need to be sent with at least a 0.001 ETH fee — the user can choose to send more. The smart contract can look like this:

// SPDX-License-Identifier: MIT
pragma solidity ^0.8.22;

import "solmate/auth/Owned.sol";

contract TippableCounter is Owned {
    uint256 public counter;

    event Incremented(address indexed sender, uint256 incrementedBy, uint256 ethSent);

    constructor() Owned(msg.sender) {}

    function increment(uint8 incrementBy) external payable {
        require(msg.value >= 0.001 ether, "Fee required.");
        require(incrementBy >= 1 && incrementBy <= 10, "incrementBy not in range.");
        counter += incrementBy;

        emit Incremented(msg.sender, incrementBy, msg.value);
    }

    function withdraw() external onlyOwner {
        counter = 0;
        (bool sent,) = address(owner).call{value: address(this).balance}("");
        require(sent, "Withdraw failed.");
    }
}

You’ll notice that the Incremented event doesn't emit the counter's new value, only how much it was incremented by - we'll have to figure out a way to track and display it. We'll get to that later! Tracking by delta or by new value is a design choice that depends on the project's needs. You could also do both - up to you.

Foundry

First, if you don’t have foundry installed, then follow the instructions here: https://book.getfoundry.sh/getting-started/installation. It's basically the same as Rust's rustup flow. The Foundry book is the source of truth on foundry and all the subcommands like forge and cast.

To manage your solc version, you can use https://github.com/alloy-rs/svm-rs, which is like Node's nvm. Anything in the 0.8.x line is alright, since 0.8.0 added built-in overflow checks (no more SafeMath), but we'll be working with a recent version, 0.8.22. It's good to follow the changelogs so you know what new features and bugfixes were introduced.

svm install 0.8.22

Init a new project: forge init tippableCounter && cd tippableCounter

We’ll use the solmate helper lib, but you could also use OpenZeppelin - both are good choices. forge install transmissions11/solmate

Copy and paste the smart contract above into src/TippableCounter.sol and feel free to remove Counter.sol-associated files from src/, script/, and test/.

To build contracts, run forge build. If any build errors occur at this point, make sure that you're on the latest foundry version (run foundryup to upgrade), and svm use 0.8.22.

You’ll see something like

❯ forge build
[⠒] Compiling...
[⠘] Compiling 25 files with 0.8.22
[⠊] Solc 0.8.22 finished in 2.67s
Compiler run successful!

This is good!

Testing

With smart contracts, it is crucial to have a robust testing environment. The basic principle is unit testing — to make sure that every function successfully executes when it should, and to make sure they all fail when they should (increment by 11? It should always fail!).

There are more advanced testing features in foundry that are beneficial when ensuring more complex functionality works as it should. These are "Fuzz testing" and "Invariant testing". We'll touch on fuzz testing, but invariant testing is out of scope for this smart contract.

Let’s start with basic unit testing (test/TippableCounter.t.sol):

// SPDX-License-Identifier: UNLICENSED
pragma solidity ^0.8.13;

import "forge-std/Test.sol";
import "../src/TippableCounter.sol";

contract CounterTest is Test {
    TippableCounter tippableCounter = new TippableCounter();
    address alice = address(1);
    address bob = address(2);

    function setUp() public {
        // initialize account balances
        vm.deal(alice, 10 ether);
        vm.deal(bob, 10 ether);
    }

    function test_increment() public {
        vm.prank(alice);
        tippableCounter.increment{value: 0.01 ether}(1);
        require(tippableCounter.counter() == 1);
        require(address(tippableCounter).balance == 0.01 ether);
    }

    function test_incrementTooHigh() public {
        vm.prank(alice);
        // expectRevert must be set up before the call that should revert
        vm.expectRevert("incrementBy not in range.");
        tippableCounter.increment{value: 0.01 ether}(11);
    }

    function test_incrementNoValue() public {
        vm.prank(alice);
        vm.expectRevert("Fee required.");
        tippableCounter.increment{value: 0 ether}(1);
    }
}

Run this with forge test, and if you want to see all the function calls that are made, run forge test -vvvv. To only see stack traces for failing tests, run -vvv.

The problem with test_increment is that we're only testing one value, when of course there's a range of ten values that could be passed in. Let's write a fuzz test.

function testFuzz_increment(uint256 amount) public {
    vm.prank(alice);
    // increment() takes a uint8, so cast the fuzzed value down
    tippableCounter.increment{value: 0.01 ether}(uint8(amount));
}

If you run this test (forge test --match-test testFuzz_increment), you'll see that it fails:

forge test --match-test testFuzz_increment
...
Failing tests:
Encountered 1 failing test in test/TippableCounter.t.sol:CounterTest
[FAIL. Reason: revert: incrementBy not in range.; counterexample: calldata=0x92c77665000000000000000000000000000000000000000000000000000000000000000b args=[11]] testFuzz_increment(uint256) (runs: 3, μ: 41158, ~: 41158)

Once it tries to increment by 11, it fails. Expected behavior! But let’s tell foundry to limit the range to between 1–10.

function testFuzz_increment(uint256 amount) public {
    // bound (from forge-std) clamps the fuzzed value into 1-10;
    // vm.assume would reject nearly every random uint256 and abort the run
    amount = bound(amount, 1, 10);
    vm.prank(alice);
    tippableCounter.increment{value: 0.01 ether}(uint8(amount));
}

Now if you run it, it passes — the fuzzer is exercising values across the whole 1–10 range.

Deploying

Let’s move on to scripting and deploying our smart contract.

There are two ways to deploy a smart contract from foundry. The first way is with forge create; the second is with a script that programmatically deploys and initializes the smart contract with (optional) additional transactions. This also lets you simulate transactions before you broadcast them to make sure everything will init correctly.
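For reference, the forge create route is a one-liner (the key and RPC URL here are placeholders):

forge create src/TippableCounter.sol:TippableCounter --private-key <PRIVATE_KEY> --rpc-url <RPC_URL>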

Let’s create script/TippableCounter.s.sol:

// SPDX-License-Identifier: UNLICENSED
pragma solidity ^0.8.13;

import "forge-std/Script.sol";
import "../src/TippableCounter.sol";

contract CounterScript is Script {
    function setUp() public {}

    function run() public {
        vm.startBroadcast();
        TippableCounter tippableCounter = new TippableCounter();
        vm.stopBroadcast();
    }
}

Running forge script script/TippableCounter.s.sol -vvvvv:

❯ forge script script/TippableCounter.s.sol -vvvvv
[⠢] Compiling...
No files changed, compilation skipped
Traces:
[98] CounterScript::setUp()
└─ ← ()

[300151] CounterScript::run()
├─ [0] VM::startBroadcast()
│ └─ ← ()
├─ [243334] → new TippableCounter@0x34A1D3fff3958843C43aD80F30b94c510645C316
│ ├─ emit OwnershipTransferred(user: 0x0000000000000000000000000000000000000000, newOwner: DefaultSender: [0x1804c8AB1F12E6bbf3894d4083f33e07309d1f38])
│ └─ ← 1096 bytes of code
├─ [0] VM::stopBroadcast()
│ └─ ← ()
└─ ← ()

With this simulated run of the script, we can see that the TippableCounter contract is created, and ownership is transferred to the deployer address. Looks right!

To seed usage, let’s also send an increment transaction. Add tippableCounter.increment{value: 0.01 ether}(1); to the script, inside the broadcast block.
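The run() function should then look like this:

function run() public {
    vm.startBroadcast();
    TippableCounter tippableCounter = new TippableCounter();
    tippableCounter.increment{value: 0.01 ether}(1);
    vm.stopBroadcast();
}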

Anvil

Running the anvil command will spin up a local chain that isn't forked from any other chain - a blank slate. It'll list addresses that it has init'd with ETH balances so you can use them right away.

❯ anvil


_ _
(_) | |
__ _ _ __ __ __ _ | |
/ _` | | '_ \ \ \ / / | | | |
| (_| | | | | | \ V / | | | |
\__,_| |_| |_| \_/ |_| |_|

0.2.0 (477b345 2023-12-18T00:29:33.930941000Z)
https://github.com/foundry-rs/foundry

Available Accounts
==================

(0) "0xf39Fd6e51aad88F6F4ce6aB8827279cffFb92266" (10000.000000000000000000 ETH)
(1) "0x70997970C51812dc3A010C7d01b50e0d17dc79C8" (10000.000000000000000000 ETH)
(2) "0x3C44CdDdB6a900fa2b585dd299e03d12FA4293BC" ...

Private Keys
==================

(0) 0xac0974bec39a17e36ba4a6b4d238ff944bacb478cbed5efcae784d7bf4f2ff80
(1) 0x59c6995e998f97a5a0044966f0945389dc9e86dae88c7a8412f4603b6b78690d
(2) 0x5de4111afa1a4b94908f83103eb1f1706367c2e68ca870fc3fb9a804cdab365a
...

Let’s use the first account to deploy the contract.

forge script script/TippableCounter.s.sol --broadcast --private-key 0xac0974bec39a17e36ba4a6b4d238ff944bacb478cbed5efcae784d7bf4f2ff80 --rpc-url http://127.0.0.1:8545

...


✅ [Success]Hash: 0x7b2a2621ec15710af7b65e69be1ee81bb2b677eb5def2f287c1900aa916dd75c
Contract Address: 0x5FbDB2315678afecb367f032d93F642f64180aa3
Block: 1
Paid: 0.001266168 ETH (316542 gas * 4 gwei)


##### anvil-hardhat
✅ [Success]Hash: 0x25a494449c09bbfeb6f74a8bfa8d2c0cc5b31698e30718b0a0c5562e8a00d99b
Block: 2
Paid: 0.000176122311147 ETH (45420 gas * 3.87763785 gwei)


==========================

ONCHAIN EXECUTION COMPLETE & SUCCESSFUL.
Total Paid: 0.001442290311147 ETH (361962 gas * avg 3.938818925 gwei)

0x5FbDB2315678afecb367f032d93F642f64180aa3 is the address of our deployed TippableCounter! Exciting. The second transaction called the increment function, and we can check whether counter is 1 by running

cast call 0x5FbDB2315678afecb367f032d93F642f64180aa3 "counter()" --rpc-url http://127.0.0.1:8545

This returns 0x0000000000000000000000000000000000000000000000000000000000000001 — the counter is 1. It worked!
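You can also poke the contract straight from the CLI with cast send — for example, an optional extra increment of 2 signed with the same anvil key:

cast send 0x5FbDB2315678afecb367f032d93F642f64180aa3 "increment(uint8)" 2 --value 0.001ether --private-key 0xac0974bec39a17e36ba4a6b4d238ff944bacb478cbed5efcae784d7bf4f2ff80 --rpc-url http://127.0.0.1:8545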

Now that anvil is running in the background, it’s time for act two — frontend development and indexing.

Rivet

First, make sure that Rivet is installed: https://github.com/paradigmxyz/rivet. Once installed it should pick up your Anvil instance automatically.

Rivet UI

Rivet doesn’t assume that sent transactions and on-chain data are permanent state, which makes development much easier, because we're going to restart anvil many times during frontend development. MetaMask and other wallets get confused when you reset a chain — you have to remove and re-add the network before they catch on — which makes iteration slow and non-ergonomic. In Rivet, all you have to do is click the 'reset' button to wipe the state completely.

Now that we have Rivet set up as a browser wallet, let’s get started on the website.

wagmi, nextjs, and connectkit

For many years, ethers.js was the library of choice for web3 dApps. It was framework-agnostic, which was good for using it across many stacks, but bad because it was hard to use idiomatically within any specific framework. The standard frontend framework has always been React, but recently Next.js has taken the lead because you can easily host sites on Vercel and write backend APIs without transitioning into a different ecosystem or running your own backend API server.

wagmi resolved the problems ethers.js has: wagmi is strongly-typed and follows the same idioms as React (hooks, providers, query caching). viem was then released, which provides a closer-to-the-metal interface for RPC calls, typing, and various utilities for things like ABI parsing.

Now, to the point. wagmi is a library for interacting with smart contracts, making on-chain calls, and connecting to the user's wallet. To do that, we have to use connectors to select which wallet types we want to support. You can wire these up in wagmi directly, but there are now very nice libraries that handle the UI hassle for us. ConnectKit and RainbowKit are the nicest looking (in my opinion) and don't lock you into a particular ecosystem for more advanced features like social login.

Let’s initialize a nextjs project that uses wagmi and connectkit to spin up a basic site where you can connect your wallet. Wagmi has a helper scaffolding tool that we'll use -

pnpm create wagmi --template next-connectkit

And then start the dev server with pnpm dev.

We’ll need to add foundry as a network option:

src/wagmi.ts

import { getDefaultConfig } from "connectkit";
import { foundry } from "viem/chains";
import { configureChains, createConfig } from "wagmi";
import { publicProvider } from "wagmi/providers/public";

const walletConnectProjectId = "";

const { chains, publicClient } = configureChains([foundry], [publicProvider()]);

export const config = createConfig(
  getDefaultConfig({
    autoConnect: true,
    appName: "My wagmi + ConnectKit App",
    walletConnectProjectId,
    chains,
    publicClient,
  })
);

You should be able to connect through Rivet via the ‘Browser Wallet’ option. There are some cool cheats you can set to skip some clicking as well.

It’s a good exercise, if you’re not familiar with wagmi, to read through all the components the template comes bundled with so you can get a sense of how to use wagmi's hooks idiomatically.

Now, let’s make two components that interact with our smart contract — one to read and display the current counter value, and the other to send an increment transaction. Let's wipe most of the main page so we can start with a clean slate.

src/components/ReadCounterValue.tsx

"use client";

import { useContractRead } from "wagmi";
import { parseAbi } from "viem";

export function ReadCounterValue() {

const { data } = useContractRead({
address: "0x5FbDB2315678afecb367f032d93F642f64180aa3",
abi: parseAbi(["function counter() view returns (uint256)"]),
functionName: "counter",
watch: true
});

return (
<div>
<h2>Counter Value</h2>
<p>{data.toString()}</p>
</div>
);
}

wagmi infers the type here from the abi, so you can be certain that data will be a bigint (or undefined while the read is in flight), since the counter() function returns a uint256.

src/components/SendIncrementTransaction.tsx

"use client";

import { useState } from "react";
import { parseAbi } from "viem";
import { useContractWrite, usePrepareContractWrite } from "wagmi";

export default function SendIncrementTransaction() {
const [incrementAmt, setIncrementAmt] = useState(1);

const { config } = usePrepareContractWrite({
address: "0x5FbDB2315678afecb367f032d93F642f64180aa3",
abi: parseAbi([
"function increment(uint8 incrementBy) external payable",
]),
functionName: "increment",
args: [BigInt(incrementAmt)],
value: BigInt(10000000000000000),
});

const { write } = useContractWrite(config);

return (
<div>
<input
type="number"
min={1}
max={10}
step={1}
onChange={(e) => {
setIncrementAmt(Number(e.target.value));
}}
value={incrementAmt}
/>
<button onClick={write}>Send</button>
</div>
);
}

Clicking Send will trigger the transaction — if you inspect the calldata in Rivet you can see the function argument that was sent (3, in my case). Since we set watch: true in the useContractRead hook, the counter value will automatically update once the transaction is mined, since wagmi is watching for new blocks and refetching the value.

Cool! So now we have contract reading and writing. Best practice is to handle errors both before the transaction is sent (via the prepare hook) and after (via the useContractWrite and useWaitForTransaction hooks). useWaitForTransaction is useful because useContractWrite returns a success as soon as the transaction is submitted, which only means it was sent correctly — not that it was mined. For transactions that depend on changing on-chain state you may not be privy to (such as a DEX swap exceeding max slippage between the time the transaction was sent and when it was included in a block), we need to confirm the transaction was actually included and didn't revert.
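A minimal sketch of that pattern inside SendIncrementTransaction (extend the useContractWrite destructuring we already have, and add useWaitForTransaction to the wagmi import; hook names are wagmi v1):

const { data: writeData, write } = useContractWrite(config);

// waits for the submitted transaction to actually be mined
const { data: receipt, isLoading, isSuccess } = useWaitForTransaction({
  hash: writeData?.hash,
});
// isLoading: submitted but not yet mined; once isSuccess, receipt.status
// tells you whether the transaction succeeded or reverted

You can render a spinner off isLoading and only show a success state once the receipt comes back.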

In Rivet we can inspect the transaction, including the logs it emitted. This is what we're going to index and store in our database.

Indexing with Ponder

Indexing on-chain data has always been a pain — it has required complex and expensive infrastructure, and once stored, the data is only accessible through GraphQL. The standard for a long time has been The Graph, a decentralized service where node operators run the indexing code you publish. It has its own token, so in production you are charged a non-USD fee per so many requests. It requires total buy-in on both the tech side and the billing side, and data access is always through GraphQL. Direct database access to inspect things isn't possible unless you run your own private graph node, which is costly to maintain and doesn't scale well across multiple chains. Since we're doing local development, we'd need to run a local graph node that hooks into anvil or an archive node, which is also costly. We're already looking at $100+/mo for indexing if we go down that route.

Now there is a cottage industry of hosted graph indexing platforms, but that doesn't solve the problem completely, because The Graph-based indexing fundamentally isn't flexible enough for us. Since we want to restart the anvil chain many times and no transactions are set in stone, we would also have to reset the graph node's database and wait for it to reindex. It's finicky, CPU-intensive, and slow. From experience, you have to fight it every step of the way because you're using it in a way it's not structured for.

Enter Ponder (https://ponder.sh) — a lightweight indexing tool built for dApp development: you can plug in your own database and clear the state very quickly. It's designed with the development process in mind and, even though it's early days, it's very simple to use.

Let’s create a ponder project — pnpm create ponder.

You can run it in your nextjs project directory and it’ll make a subdirectory for you.

First, we’ll copy over our abi to abis/TippableCounter.ts:

export const TippableCounter = [
  {
    type: "function",
    name: "counter",
    inputs: [],
    outputs: [{ name: "", type: "uint256", internalType: "uint256" }],
    stateMutability: "view",
  },
  {
    type: "event",
    name: "Incremented",
    inputs: [
      { name: "sender", type: "address", indexed: true, internalType: "address" },
      { name: "incrementedBy", type: "uint256", indexed: false, internalType: "uint256" },
      { name: "ethSent", type: "uint256", indexed: false, internalType: "uint256" },
    ],
    anonymous: false,
  },
  {
    type: "event",
    name: "OwnershipTransferred",
    inputs: [
      { name: "user", type: "address", indexed: true, internalType: "address" },
      { name: "newOwner", type: "address", indexed: true, internalType: "address" },
    ],
    anonymous: false,
  },
] as const;

and create a schema for the database.

ponder.schema.ts

import { createSchema } from "@ponder/core";

export default createSchema((p) => ({
  Event: p.createTable({
    id: p.string(),
    transactionHash: p.bytes(),
    sender: p.bytes(),
    incrementedBy: p.bigint(),
    ethSent: p.bigint(),
  }),
}));

Then we’ll set up Ponder's config (ponder.config.ts):

import { createConfig } from "@ponder/core";
import { http } from "viem";

import { TippableCounter } from "./abis/TippableCounter";

export default createConfig({
  networks: {
    anvil: {
      chainId: 31337,
      transport: http("http://127.0.0.1:8545"),
    },
  },
  contracts: {
    TippableCounter: {
      network: "anvil",
      abi: TippableCounter,
      address: "0x5FbDB2315678afecb367f032d93F642f64180aa3",
      startBlock: 0,
    },
  },
});

It’s a very simple config — we import our ABI and configure the contract. Since Ponder uses eth_getLogs with filters, we pass in the address and could optionally pass in the event signature.

And finally we’ll create the actual indexing script.

src/index.ts

import { ponder } from "@/generated";

ponder.on("TippableCounter:Incremented", async ({ event, context }) => {
  const { Event } = context.db;

  await Event.create({
    id: event.log.id,
    data: {
      transactionHash: event.transaction.hash,
      sender: event.args.sender,
      incrementedBy: event.args.incrementedBy,
      ethSent: event.args.ethSent,
    },
  });
});

Now run Ponder with yarn dev and it should index the one event from the deploy script. If you send a transaction on the site it should pick that up too.

❯ yarn dev
yarn run v1.22.10
warning ../../package.json: No license field
$ ponder dev
9:20:45 PM INFO server Started listening on port 42069
9:20:46 PM INFO realtime Fetched latest block at 0 (network=anvil)
9:20:46 PM INFO historical Started sync with 100% cached (contract=TippableCounter network=anvil)
9:20:46 PM INFO historical Completed sync (network=anvil)
9:20:46 PM INFO server Started responding as healthy
9:20:46 PM INFO realtime Fetched finalized block at 0 (network=anvil)
9:20:51 PM INFO realtime Synced 1 matched log from block 1 (network=anvil)
9:20:51 PM INFO realtime Synced 1 matched log from block 2 (network=anvil)
9:20:51 PM INFO indexing Indexed 1 event up to Dec 20, 2023 (chainId=31337 block=2 logIndex=0)
Historical sync (complete)

Indexing (up to date)
████████████████████████████████████ 100% | 1/1 events (2 total)

GraphQL
Server live at http://localhost:42069

You can navigate to the GraphQL endpoint or just open up the SQLite database and see that the event has been indexed! Exciting.
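For example, running this query in the GraphQL playground at http://localhost:42069 should return the event indexed from the deploy script:

query {
  events {
    id
    sender
    incrementedBy
    ethSent
    transactionHash
  }
}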

Since Ponder is so tightly typed, it infers from the ABI which events exist for the contract and what their argument types are, and it infers the database types from the schema.

Remember when we were going to keep a running log of what the counter value is? We can also make a contract call at the time of index to get it.

Let’s add counterValue to our schema as a bigint and change our indexing function:

import { ponder } from "@/generated";

ponder.on("TippableCounter:Incremented", async ({ event, context }) => {
  const { Event } = context.db;

  const counterValue = (await context.client.readContract({
    abi: context.contracts.TippableCounter.abi,
    address: context.contracts.TippableCounter.address,
    functionName: "counter",
  })) as bigint;

  await Event.create({
    id: event.log.id,
    data: {
      transactionHash: event.transaction.hash,
      sender: event.args.sender,
      incrementedBy: event.args.incrementedBy,
      ethSent: event.args.ethSent,
      counterValue: counterValue,
    },
  });
});
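For reference, the updated ponder.schema.ts is just one new column:

import { createSchema } from "@ponder/core";

export default createSchema((p) => ({
  Event: p.createTable({
    id: p.string(),
    transactionHash: p.bytes(),
    sender: p.bytes(),
    incrementedBy: p.bigint(),
    ethSent: p.bigint(),
    counterValue: p.bigint(),
  }),
}));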

Ponder will hot reload, recreate the schema, and reindex the events with counterValue set now. Cool - now let's show this on the frontend.

We’re going to use swr, graphql-request, and graphql to query the GraphQL endpoint — add them with pnpm add swr graphql-request graphql. SWR (or React Query — your choice) can do on-demand mutation, which we're going to want when we send a transaction and need to display the newly indexed event.

src/components/IndexedEvents.tsx

"use client";

import { request } from "graphql-request";
import useSWR from "swr";
import { formatUnits } from "viem";

const fetcher = (query: string) =>
request(
"http://127.0.0.1:42069",
`query{
events {
id
transactionHash
sender
incrementedBy
ethSent
counterValue
}
}`
);

type Event = {
id: string;
counterValue: string;
ethSent: string;
incrementedBy: string;
sender: string;
transactionHash: string;
};

export function IndexedEvents() {
const { data, error }: any = useSWR("indexedEvents", fetcher);

return (
<div>
{data?.events?.map((e: Event, idx: number) => (
<div key={idx}>
<p>Sender: {e.sender}</p>
<p>Incremened By: {e.incrementedBy}</p>
<p>Counter Value: {e.counterValue}</p>
<p>Eth Sent: {formatUnits(BigInt(e.ethSent), 18)}</p>
</div>
))}
</div>
);
}

We set the useSWR fetch key to indexedEvents so we have an easy way to mutate the data later. Drop the component onto the main page and you should see the indexed events listed.

Now, when we send a transaction in SendIncrementTransaction, we want to mutate once the transaction has been indexed by Ponder. Crucially, there may be a lag between when the transaction is confirmed and when it is indexed, so we don't want to jump the gun and mutate too early. We can handle this by polling GraphQL with the transaction hash, so let's make a new hook that does exactly that.

import { request } from "graphql-request";
import { useEffect, useState } from "react";

export default function useWaitForIndex({
  hash,
  onIndexed,
}: {
  hash: `0x${string}` | undefined;
  onIndexed: Function;
}) {
  const [indexed, setIndexed] = useState(false);

  useEffect(() => {
    const id = setInterval(async () => {
      if (!hash || indexed) return;
      const res = await request(
        "http://127.0.0.1:42069",
        `query {
          events(where: { transactionHash: "${hash}" }) {
            id
          }
        }`
      );
      if ((res as any).events.length > 0) {
        setIndexed(true);
        onIndexed(res);
        clearInterval(id);
      }
    }, 1000);
    return () => {
      clearInterval(id);
    };
  }, [hash]);

  useEffect(() => {
    setIndexed(false);
  }, [hash]);
}

Now we can plug this hook into SendIncrementTransaction (writeData comes from the useContractWrite hook as shown earlier, and mutate is the global mutate exported by swr):

useWaitForIndex({
  hash: writeData?.hash,
  onIndexed() {
    mutate("indexedEvents");
  },
});

When we send another transaction, once Ponder has indexed the transaction the list of events should revalidate! You can build this hook out to include things like isLoading, isSuccess, isError, maxRetries, etc.

Deploying To Prod

Next.js projects are very easy to deploy on Vercel — just push everything to GitHub and give Vercel access. There are simple instructions at https://vercel.com/dashboard. You can also deploy to platforms like https://fly.io and https://render.com/ — it all depends on your needs.

Ponder recommends deploying on Railway and also has simple instructions for deploying to prod — https://ponder.sh/docs/guides/production. You can also host it on your own infrastructure.

Make sure you update your GraphQL endpoints, Ponder network config, and the wagmi config in the frontend, and run the foundry deploy script with a different RPC URL for whichever chain you want to deploy to. https://chainlist.org/ has RPCs for all the major chains.
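For example, a mainnet network entry in the Ponder config might read its RPC URL from an environment variable instead of hardcoding the anvil endpoint (the variable name here is just illustrative):

networks: {
  mainnet: {
    chainId: 1,
    transport: http(process.env.PONDER_RPC_URL_1),
  },
},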

For Ponder you will most likely need either a full or archive node. Chainstack is good for your own private node; Infura and Alchemy are also solid options. There are marginal differences, and you'll find what works for you.

Feel free to follow me on twitter if this was helpful — https://twitter.com/botglen. Cheers!
