Database Guides
MockHero generates data. These guides show you how to insert it into the most popular databases and ORMs. The API call is the same every time — only the insert logic changes.
The MockHero API Call (same for all guides)
Every guide below starts with the same generate request. The only thing that changes is the format parameter: use "json" for ORMs and drivers, or "sql" for raw SQL.
const response = await fetch("https://mockhero.dev/api/v1/generate", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: "Bearer mh_YOUR_API_KEY",
  },
  body: JSON.stringify({
    template: "ecommerce", // or "blog", "saas", "social"
    scale: 1,
    format: "json", // use "sql" for raw SQL guides
  }),
});

const { data } = await response.json();
// data.customers, data.products, data.orders, etc.

1. Supabase (JavaScript)
Supabase wraps PostgreSQL with a REST layer and Row Level Security (RLS). The fastest path to seeding is the service role key, which bypasses RLS entirely.
Prerequisites
npm install @supabase/supabase-js
Complete Seed Script
// scripts/seed.ts
// Run: npx tsx scripts/seed.ts
import { createClient } from "@supabase/supabase-js";

const MOCKHERO_API_KEY = process.env.MOCKHERO_API_KEY!;
const SUPABASE_URL = process.env.SUPABASE_URL!;
const SUPABASE_SERVICE_ROLE_KEY = process.env.SUPABASE_SERVICE_ROLE_KEY!;

// Use the service role key — it bypasses RLS
const supabase = createClient(SUPABASE_URL, SUPABASE_SERVICE_ROLE_KEY);

async function main() {
  console.log("Generating mock data...");
  const res = await fetch("https://mockhero.dev/api/v1/generate", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${MOCKHERO_API_KEY}`,
    },
    body: JSON.stringify({
      template: "ecommerce",
      scale: 1,
      format: "json",
    }),
  });
  if (!res.ok) throw new Error(`MockHero API error: ${res.status}`);
  const { data } = await res.json();

  // Insert in dependency order (parents before children)
  const tables = ["customers", "products", "orders", "order_items", "reviews"];
  for (const table of tables) {
    const rows = data[table];
    if (!rows || rows.length === 0) continue;
    // Supabase has a ~1000-row limit per insert — batch if needed
    const BATCH_SIZE = 500;
    for (let i = 0; i < rows.length; i += BATCH_SIZE) {
      const batch = rows.slice(i, i + BATCH_SIZE);
      const { error } = await supabase.from(table).insert(batch);
      if (error) {
        console.error(`Error inserting into ${table}:`, error.message);
        process.exit(1);
      }
    }
    console.log(`  Inserted ${rows.length} rows into ${table}`);
  }
  console.log("Done!");
}

main();

RLS Gotcha
If you use the anon key instead of the service role key, inserts will silently fail (or return an empty array) because RLS blocks all writes by default. To fix this, either use the service role key for seeding, or add explicit insert policies:
-- Allow the anon role to read (add this if your app reads with anon key)
CREATE POLICY "Allow public read" ON customers FOR SELECT USING (true);

-- For seeding specifically, prefer the service role key instead
-- of opening up insert policies to anon.
Common Gotchas
- RLS: Service role key bypasses RLS. Anon key does not. Always seed with service role.
- Batch size: Supabase limits inserts to ~1000 rows. Batch your inserts.
- FK ordering: Insert parent tables first (customers, products) before children (orders, order_items).
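The batching loop above can be factored into a small reusable helper. A minimal sketch (hypothetical utility functions, not part of `@supabase/supabase-js`):

```typescript
// Generic batching helper: split an array into chunks of at most `size` rows.
function chunk<T>(rows: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < rows.length; i += size) {
    batches.push(rows.slice(i, i + size));
  }
  return batches;
}

// Drive any insert function that accepts an array of rows, one batch at a time.
async function insertInBatches<T>(
  rows: T[],
  size: number,
  insert: (batch: T[]) => Promise<void>
): Promise<void> {
  for (const batch of chunk(rows, size)) {
    await insert(batch); // e.g. supabase.from(table).insert(batch)
  }
}
```

The same helper works unchanged for any of the drivers in this guide, since they all accept arrays of row objects.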
2. Neon (PostgreSQL via @neondatabase/serverless)
Neon is serverless PostgreSQL. You can use their driver for direct SQL inserts, or use MockHero's format: "sql" output to get raw INSERT statements you can execute directly.
Prerequisites
npm install @neondatabase/serverless
Option A: JSON Format with Parameterized Inserts
// scripts/seed.ts
// Run: npx tsx scripts/seed.ts
import { neon } from "@neondatabase/serverless";

const sql = neon(process.env.DATABASE_URL!);
const MOCKHERO_API_KEY = process.env.MOCKHERO_API_KEY!;

async function main() {
  console.log("Generating mock data...");
  const res = await fetch("https://mockhero.dev/api/v1/generate", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${MOCKHERO_API_KEY}`,
    },
    body: JSON.stringify({
      template: "ecommerce",
      scale: 1,
      format: "json",
    }),
  });
  if (!res.ok) throw new Error(`MockHero API error: ${res.status}`);
  const { data } = await res.json();

  // Insert in dependency order
  const tables = ["customers", "products", "orders", "order_items", "reviews"];
  for (const table of tables) {
    const rows = data[table];
    if (!rows || rows.length === 0) continue;
    const columns = Object.keys(rows[0]);
    const colList = columns.map((c) => `"${c}"`).join(", ");
    for (const row of rows) {
      const values = columns.map((c) => row[c]);
      const placeholders = columns.map((_, i) => `$${i + 1}`).join(", ");
      await sql(`INSERT INTO ${table} (${colList}) VALUES (${placeholders})`, values);
    }
    console.log(`  Inserted ${rows.length} rows into ${table}`);
  }
  console.log("Done!");
}

main();

Option B: SQL Format (raw INSERT statements)
Use MockHero's SQL output to get ready-to-execute INSERT statements. This is the fastest approach — no mapping needed.
// scripts/seed-sql.ts
// Run: npx tsx scripts/seed-sql.ts
import { neon } from "@neondatabase/serverless";

const sql = neon(process.env.DATABASE_URL!);
const MOCKHERO_API_KEY = process.env.MOCKHERO_API_KEY!;

async function main() {
  console.log("Generating SQL...");
  const res = await fetch("https://mockhero.dev/api/v1/generate", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${MOCKHERO_API_KEY}`,
    },
    body: JSON.stringify({
      template: "ecommerce",
      scale: 1,
      format: "sql",
      sql_dialect: "postgres",
    }),
  });
  if (!res.ok) throw new Error(`MockHero API error: ${res.status}`);
  const { data } = await res.json();

  // data.sql is a string of INSERT statements.
  // Split on semicolons and execute each statement.
  // (Naive: this would break if a string value contained a semicolon.)
  const statements = data.sql
    .split(";")
    .map((s: string) => s.trim())
    .filter(Boolean);
  for (const stmt of statements) {
    await sql(stmt);
  }
  console.log(`Executed ${statements.length} INSERT statements`);
  console.log("Done!");
}

main();

Common Gotchas
- Connection pooling: The neon() driver uses HTTP, which is stateless. For large seeds, consider using Pool from @neondatabase/serverless for WebSocket connections.
- FK ordering: Always insert parent tables before children.
- SQL format: The sql_dialect: "postgres" option ensures correct quoting and types for Neon/PostgreSQL.
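Option B splits data.sql on every semicolon, which breaks if a generated string value happens to contain one. A quote-aware splitter is safer. This is a sketch, assuming standard single-quoted SQL string literals with '' as the escape sequence:

```typescript
// Split a SQL script into statements, ignoring semicolons that appear
// inside single-quoted string literals ('' is the standard SQL escape).
function splitSqlStatements(script: string): string[] {
  const statements: string[] = [];
  let current = "";
  let inString = false;
  for (let i = 0; i < script.length; i++) {
    const ch = script[i];
    if (ch === "'") {
      // A doubled quote inside a string is an escaped quote, not a close.
      if (inString && script[i + 1] === "'") {
        current += "''";
        i++;
        continue;
      }
      inString = !inString;
      current += ch;
    } else if (ch === ";" && !inString) {
      if (current.trim()) statements.push(current.trim());
      current = "";
    } else {
      current += ch;
    }
  }
  if (current.trim()) statements.push(current.trim());
  return statements;
}
```

Drop this in place of the .split(";") chain if your generated data might contain semicolons (e.g. in review text).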
3. Prisma
Prisma's createMany maps directly to MockHero's JSON output. The field names in the response match typical Prisma schemas, so you can pass the data straight through.
Prerequisites
npm install @prisma/client
npx prisma generate
Complete Seed Script
// scripts/seed.ts
// Run: npx tsx scripts/seed.ts
import { PrismaClient } from "@prisma/client";

const prisma = new PrismaClient();
const MOCKHERO_API_KEY = process.env.MOCKHERO_API_KEY!;

async function main() {
  console.log("Generating mock data...");
  const res = await fetch("https://mockhero.dev/api/v1/generate", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${MOCKHERO_API_KEY}`,
    },
    body: JSON.stringify({
      template: "ecommerce",
      scale: 1,
      format: "json",
    }),
  });
  if (!res.ok) throw new Error(`MockHero API error: ${res.status}`);
  const { data } = await res.json();

  // Insert in dependency order to satisfy foreign keys
  console.log("Seeding customers...");
  await prisma.customer.createMany({ data: data.customers });

  console.log("Seeding products...");
  await prisma.product.createMany({ data: data.products });

  console.log("Seeding orders...");
  await prisma.order.createMany({
    // Map snake_case API fields to your Prisma model's camelCase names.
    // Destructure the snake_case keys out so they aren't left on the row —
    // Prisma rejects unknown arguments.
    data: data.orders.map(
      ({ customer_id, order_number, ordered_at, payment_method, ...rest }: Record<string, unknown>) => ({
        ...rest,
        customerId: customer_id,
        orderNumber: order_number,
        orderedAt: ordered_at,
        paymentMethod: payment_method,
      })
    ),
  });

  console.log("Seeding order items...");
  await prisma.orderItem.createMany({
    data: data.order_items.map(
      ({ order_id, product_id, unit_price, ...rest }: Record<string, unknown>) => ({
        ...rest,
        orderId: order_id,
        productId: product_id,
        unitPrice: unit_price,
      })
    ),
  });

  console.log("Seeding reviews...");
  await prisma.review.createMany({
    data: data.reviews.map(
      ({ customer_id, product_id, ...rest }: Record<string, unknown>) => ({
        ...rest,
        customerId: customer_id,
        productId: product_id,
      })
    ),
  });

  console.log("Done!");
}

main()
  .catch((e) => {
    console.error(e);
    process.exit(1);
  })
  .finally(() => prisma.$disconnect());

If Your Schema Uses snake_case
If your Prisma schema already uses snake_case column names (via @map or matching names), you can skip the field mapping and insert directly:
// If your Prisma schema uses snake_case, insert directly:
await prisma.order.createMany({ data: data.orders });

Common Gotchas
- FK ordering: Insert parent tables (customers, products) before children (orders, order_items, reviews).
- Field name mapping: MockHero outputs snake_case (e.g. customer_id). If your Prisma model uses camelCase, map the fields.
- createMany limits: Some databases (e.g. SQLite via Prisma) don't support createMany. Use individual create calls in a loop instead.
- Transactions: Wrap everything in prisma.$transaction() if you want all-or-nothing inserts.
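The per-table field mapping above can be generalized with a small key converter. A sketch (hypothetical helpers, not a Prisma API):

```typescript
// Convert one snake_case key to camelCase: "customer_id" -> "customerId".
function toCamel(key: string): string {
  return key.replace(/_([a-z0-9])/g, (_, c: string) => c.toUpperCase());
}

// Re-key every row so only camelCase field names remain. This matters for
// Prisma, which rejects unknown (leftover snake_case) arguments.
function camelizeRows(rows: Record<string, unknown>[]): Record<string, unknown>[] {
  return rows.map((row) =>
    Object.fromEntries(Object.entries(row).map(([k, v]) => [toCamel(k), v]))
  );
}
```

With this, each seeding step collapses to one line, e.g. `await prisma.order.createMany({ data: camelizeRows(data.orders) })`.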
4. Drizzle ORM
Drizzle's .insert().values() accepts arrays of objects, which maps perfectly to MockHero's JSON output.
Prerequisites
npm install drizzle-orm postgres
# or with Neon:
npm install drizzle-orm @neondatabase/serverless
Complete Seed Script
// scripts/seed.ts
// Run: npx tsx scripts/seed.ts
import { drizzle } from "drizzle-orm/postgres-js";
import postgres from "postgres";
import {
  customers,
  products,
  orders,
  orderItems,
  reviews,
} from "../src/db/schema"; // your Drizzle schema

const connection = postgres(process.env.DATABASE_URL!);
const db = drizzle(connection);
const MOCKHERO_API_KEY = process.env.MOCKHERO_API_KEY!;

async function main() {
  console.log("Generating mock data...");
  const res = await fetch("https://mockhero.dev/api/v1/generate", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${MOCKHERO_API_KEY}`,
    },
    body: JSON.stringify({
      template: "ecommerce",
      scale: 1,
      format: "json",
    }),
  });
  if (!res.ok) throw new Error(`MockHero API error: ${res.status}`);
  const { data } = await res.json();

  // Insert in dependency order
  console.log("Seeding customers...");
  await db.insert(customers).values(data.customers);
  console.log("Seeding products...");
  await db.insert(products).values(data.products);
  console.log("Seeding orders...");
  await db.insert(orders).values(data.orders);
  console.log("Seeding order items...");
  await db.insert(orderItems).values(data.order_items);
  console.log("Seeding reviews...");
  await db.insert(reviews).values(data.reviews);

  console.log("Done!");
  await connection.end();
}

main();

With Neon Serverless Driver
// Replace the connection setup with:
import { neon } from "@neondatabase/serverless";
import { drizzle } from "drizzle-orm/neon-http";

const sql = neon(process.env.DATABASE_URL!);
const db = drizzle(sql);
// The rest of the seed script is identical

Common Gotchas
- Column name mapping: If your Drizzle schema uses camelCase in TypeScript but snake_case in the database, Drizzle handles the mapping. MockHero outputs snake_case to match the database columns.
- Batch size: For large datasets, PostgreSQL has a limit on the number of parameters per query (~65535). Split large arrays into batches of 500-1000 rows.
- FK ordering: Always insert parent tables first.
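That parameter limit translates into a per-batch row count that depends on how many columns each row has. A sketch of computing a safe batch size (65535 is PostgreSQL's wire-protocol maximum for bind parameters; adjust for your database):

```typescript
// PostgreSQL's extended query protocol caps bind parameters at 65535 per
// statement. Each row consumes one parameter per column, so the largest
// safe batch is floor(limit / columnCount), capped at a conservative
// ceiling to keep individual statements manageable.
const PG_MAX_PARAMS = 65535;

function safeBatchSize(columnCount: number, ceiling = 1000): number {
  if (columnCount <= 0) throw new Error("columnCount must be positive");
  return Math.max(1, Math.min(ceiling, Math.floor(PG_MAX_PARAMS / columnCount)));
}
```

For example, a 10-column table can take the full 1000-row ceiling per batch, while a very wide table gets a smaller batch automatically.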
5. Firebase Firestore
Firestore is a NoSQL document database. There are no foreign key constraints, but MockHero's reference IDs are still useful for app-level joins. Use batch writes for performance.
Prerequisites
npm install firebase-admin
Complete Seed Script
// scripts/seed.ts
// Run: npx tsx scripts/seed.ts
import { readFileSync } from "fs";
import { initializeApp, cert } from "firebase-admin/app";
import { getFirestore } from "firebase-admin/firestore";

// Initialize with a service account key file
initializeApp({
  credential: cert(JSON.parse(readFileSync("../service-account.json", "utf8"))),
});
const db = getFirestore();
const MOCKHERO_API_KEY = process.env.MOCKHERO_API_KEY!;

async function main() {
  console.log("Generating mock data...");
  const res = await fetch("https://mockhero.dev/api/v1/generate", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${MOCKHERO_API_KEY}`,
    },
    body: JSON.stringify({
      template: "ecommerce",
      scale: 1,
      format: "json",
    }),
  });
  if (!res.ok) throw new Error(`MockHero API error: ${res.status}`);
  const { data } = await res.json();

  const collections = ["customers", "products", "orders", "order_items", "reviews"];
  for (const collection of collections) {
    const rows = data[collection];
    if (!rows || rows.length === 0) continue;
    // Firestore batch writes are limited to 500 operations
    const BATCH_SIZE = 500;
    for (let i = 0; i < rows.length; i += BATCH_SIZE) {
      const batch = db.batch();
      const chunk = rows.slice(i, i + BATCH_SIZE);
      for (const row of chunk) {
        // Use MockHero's id as the document ID
        const docRef = db.collection(collection).doc(row.id);
        batch.set(docRef, row);
      }
      await batch.commit();
    }
    console.log(`  Inserted ${rows.length} docs into ${collection}`);
  }
  console.log("Done!");
}

main();

Nesting vs. Flattening
MockHero returns flat relational data with reference IDs (e.g. customer_id). For Firestore, you can either:
- Keep it flat (recommended) — store reference IDs as strings and query with where("customer_id", "==", id)
- Nest as subcollections — store order_items under orders/{orderId}/items
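If you choose the subcollection layout, you first need the flat order_items grouped by their parent order. A sketch of that grouping step (pure data transformation; the write itself would then target each orders/{orderId}/items subcollection):

```typescript
// Group flat order_items rows by their parent order_id, so each group can
// be written as a subcollection under orders/{orderId}/items.
type ItemRow = Record<string, unknown> & { order_id: string };

function groupByOrder(items: ItemRow[]): Map<string, ItemRow[]> {
  const groups = new Map<string, ItemRow[]>();
  for (const item of items) {
    const list = groups.get(item.order_id) ?? [];
    list.push(item);
    groups.set(item.order_id, list);
  }
  return groups;
}
```

Each map entry then becomes one subcollection write loop, still subject to the 500-operation batch limit.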
Common Gotchas
- Batch limit: Firestore batch writes are limited to 500 operations. Split large datasets.
- No FK enforcement: Firestore won't reject invalid references. MockHero ensures all refs are valid, but be careful if you modify the data.
- Document size: Each Firestore document has a 1 MB limit. MockHero rows are well under this.
6. Raw SQL (Any Database)
The universal approach. Use MockHero's format: "sql" with your target dialect and pipe the output directly into your database CLI.
Supported Dialects
- sql_dialect: "postgres" — PostgreSQL, Neon, Supabase, CockroachDB
- sql_dialect: "mysql" — MySQL, MariaDB, PlanetScale
- sql_dialect: "sqlite" — SQLite, Turso, LibSQL
Generate SQL and Save to File
// scripts/generate-sql.ts
// Run: npx tsx scripts/generate-sql.ts
import { writeFileSync } from "fs";

const MOCKHERO_API_KEY = process.env.MOCKHERO_API_KEY!;

async function main() {
  const res = await fetch("https://mockhero.dev/api/v1/generate", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${MOCKHERO_API_KEY}`,
    },
    body: JSON.stringify({
      template: "ecommerce",
      scale: 1,
      format: "sql",
      sql_dialect: "postgres", // or "mysql" or "sqlite"
    }),
  });
  if (!res.ok) throw new Error(`MockHero API error: ${res.status}`);
  const { data } = await res.json();
  writeFileSync("seed.sql", data.sql);
  console.log("Wrote seed.sql");
}

main();

Pipe to Database CLI
# PostgreSQL
npx tsx scripts/generate-sql.ts && psql $DATABASE_URL < seed.sql

# MySQL
npx tsx scripts/generate-sql.ts && mysql -u root -p mydb < seed.sql

# SQLite
npx tsx scripts/generate-sql.ts && sqlite3 dev.db < seed.sql
One-Liner with curl
# Generate and seed in one command (PostgreSQL)
curl -s -X POST https://mockhero.dev/api/v1/generate \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer mh_YOUR_API_KEY" \
  -d '{"template":"ecommerce","scale":1,"format":"sql","sql_dialect":"postgres"}' \
  | jq -r '.data.sql' \
  | psql $DATABASE_URL

Common Gotchas
- Dialect mismatch: Make sure sql_dialect matches your database. PostgreSQL uses UUID types; MySQL uses VARCHAR(36).
- Tables must exist: The SQL output contains INSERT statements only, not CREATE TABLE. Create your tables first.
- FK ordering: MockHero's SQL output is already ordered correctly — parent inserts come before children.
7. MongoDB
MongoDB has no foreign key constraints, but MockHero's reference IDs are still useful for application-level joins via $lookup aggregations.
Prerequisites
npm install mongodb
Complete Seed Script
// scripts/seed.ts
// Run: npx tsx scripts/seed.ts
import { MongoClient } from "mongodb";

const MONGO_URI = process.env.MONGO_URI!; // e.g. "mongodb://localhost:27017"
const DB_NAME = process.env.DB_NAME || "mockhero_dev";
const MOCKHERO_API_KEY = process.env.MOCKHERO_API_KEY!;

async function main() {
  const client = new MongoClient(MONGO_URI);
  await client.connect();
  const db = client.db(DB_NAME);

  console.log("Generating mock data...");
  const res = await fetch("https://mockhero.dev/api/v1/generate", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${MOCKHERO_API_KEY}`,
    },
    body: JSON.stringify({
      template: "ecommerce",
      scale: 1,
      format: "json",
    }),
  });
  if (!res.ok) throw new Error(`MockHero API error: ${res.status}`);
  const { data } = await res.json();

  const collections = ["customers", "products", "orders", "order_items", "reviews"];
  for (const name of collections) {
    const rows = data[name];
    if (!rows || rows.length === 0) continue;
    // Map MockHero's "id" to MongoDB's "_id" for cleaner lookups
    const docs = rows.map((row: Record<string, unknown>) => ({
      _id: row.id,
      ...row,
    }));
    // insertMany handles batching internally
    const result = await db.collection(name).insertMany(docs);
    console.log(`  Inserted ${result.insertedCount} docs into ${name}`);
  }
  console.log("Done!");
  await client.close();
}

main();

Common Gotchas
- No FK constraints: MongoDB won't validate that customer_id in an order matches an actual customer. MockHero ensures all refs are valid at generation time.
- _id mapping: Consider mapping MockHero's id field to _id for idiomatic MongoDB usage.
- insertMany limit: MongoDB's default max message size is 16 MB. For very large datasets (scale > 50), batch your inserts.
- Indexes: Create indexes on reference fields (e.g. customer_id) for efficient $lookup queries.
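Those reference IDs enable server-side joins with $lookup. A sketch of a pipeline that attaches each order's customer document, assuming you mapped MockHero's id to _id as in the seed script above:

```typescript
// Aggregation pipeline: join orders.customer_id against customers._id and
// attach the matching customer document to each order.
const ordersWithCustomers = [
  {
    $lookup: {
      from: "customers",         // collection to join
      localField: "customer_id", // field on orders
      foreignField: "_id",       // field on customers
      as: "customer",            // output array field
    },
  },
  // $lookup always produces an array; unwind it to a single embedded doc
  { $unwind: "$customer" },
];

// Usage: db.collection("orders").aggregate(ordersWithCustomers).toArray()
```

Note that $unwind drops orders whose lookup found no match; with MockHero data every reference resolves, so nothing is lost.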
Summary
The pattern is always the same:
- Call MockHero's /api/v1/generate with your template and format.
- Insert the data using your database's native driver or ORM.
- Respect dependency order: parent tables first, then children.
- Batch large datasets to stay within driver/database limits.
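The whole pattern can be condensed into one generic helper that works with any driver via an insert callback. A sketch (all names here are hypothetical, not MockHero APIs):

```typescript
// Generic seeder: walk tables in dependency order, batching each table's
// rows and delegating the actual write to a driver-specific callback.
async function seedInOrder(
  data: Record<string, Record<string, unknown>[]>,
  tableOrder: string[],
  insert: (table: string, batch: Record<string, unknown>[]) => Promise<void>,
  batchSize = 500
): Promise<number> {
  let total = 0;
  for (const table of tableOrder) {
    const rows = data[table];
    if (!rows || rows.length === 0) continue; // skip empty tables
    for (let i = 0; i < rows.length; i += batchSize) {
      await insert(table, rows.slice(i, i + batchSize));
    }
    total += rows.length;
  }
  return total;
}
```

For Supabase the callback would be `(t, b) => supabase.from(t).insert(b)`; for MongoDB, `(t, b) => db.collection(t).insertMany(b)`.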
For the fastest approach with any SQL database, use format: "sql" and pipe the output directly into your database CLI. For ORMs and document databases, use format: "json" and insert the data programmatically.