
Storage

CraftJS uses Cloudflare R2 for S3-compatible object storage with zero egress fees.

Features

  • S3 Compatible - Use existing S3 SDKs
  • Zero Egress Fees - No charges for data transfer out
  • Global CDN - Fast access worldwide
  • Signed URLs - Secure private file access
  • Direct Uploads - Upload from client without server

Configuration

Environment Variables

R2_ACCOUNT_ID="your-account-id"
R2_ACCESS_KEY_ID="your-access-key"
R2_SECRET_ACCESS_KEY="your-secret-key"
R2_BUCKET_NAME="your-bucket-name"
R2_PUBLIC_URL="https://pub-xxx.r2.dev"
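
The examples in this guide read these variables through a typed env module at @/env. A minimal sketch of what that module might look like, assuming zod-based validation (adapt it to however your project actually exposes environment variables):

// src/env.ts — hypothetical sketch, assuming zod validation
import { z } from "zod";

const schema = z.object({
  R2_ACCOUNT_ID: z.string().min(1),
  R2_ACCESS_KEY_ID: z.string().min(1),
  R2_SECRET_ACCESS_KEY: z.string().min(1),
  R2_BUCKET_NAME: z.string().min(1),
  R2_PUBLIC_URL: z.string().url(),
});

// Fails fast at startup if any R2 variable is missing or malformed
export const env = schema.parse(process.env);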

Setup Steps

Create R2 Bucket

  1. Go to Cloudflare Dashboard 
  2. Navigate to R2 > Create bucket
  3. Name your bucket (e.g., myapp-uploads)

Create API Token

  1. Go to R2 > Manage R2 API Tokens
  2. Create new token with “Object Read & Write” permissions
  3. Copy the Access Key ID and Secret Access Key

Configure Public Access (Optional)

For public files:

  1. Go to your bucket settings
  2. Enable “Public Access”
  3. Copy the public URL

Storage Client

Configure the R2 client in src/lib/storage/r2.ts:

import { S3Client } from "@aws-sdk/client-s3";

import { env } from "@/env";

export const r2 = new S3Client({
  region: "auto",
  endpoint: `https://${env.R2_ACCOUNT_ID}.r2.cloudflarestorage.com`,
  credentials: {
    accessKeyId: env.R2_ACCESS_KEY_ID,
    secretAccessKey: env.R2_SECRET_ACCESS_KEY,
  },
});

export const bucketName = env.R2_BUCKET_NAME;
export const publicUrl = env.R2_PUBLIC_URL;

Basic Operations

Upload File

import { PutObjectCommand } from "@aws-sdk/client-s3";

import { r2, bucketName, publicUrl } from "@/lib/storage/r2";

export async function uploadFile(
  key: string,
  body: Buffer | Uint8Array | string,
  contentType: string
) {
  await r2.send(
    new PutObjectCommand({
      Bucket: bucketName,
      Key: key,
      Body: body,
      ContentType: contentType,
    })
  );

  return `${publicUrl}/${key}`;
}

Download File

import { GetObjectCommand } from "@aws-sdk/client-s3";

import { r2, bucketName } from "@/lib/storage/r2";

export async function downloadFile(key: string) {
  const response = await r2.send(
    new GetObjectCommand({
      Bucket: bucketName,
      Key: key,
    })
  );

  return response.Body?.transformToByteArray();
}

Delete File

import { DeleteObjectCommand } from "@aws-sdk/client-s3";

import { r2, bucketName } from "@/lib/storage/r2";

export async function deleteFile(key: string) {
  await r2.send(
    new DeleteObjectCommand({
      Bucket: bucketName,
      Key: key,
    })
  );
}

List Files

import { ListObjectsV2Command } from "@aws-sdk/client-s3";

import { r2, bucketName } from "@/lib/storage/r2";

export async function listFiles(prefix?: string) {
  const response = await r2.send(
    new ListObjectsV2Command({
      Bucket: bucketName,
      Prefix: prefix,
    })
  );

  return (
    response.Contents?.map((obj) => ({
      key: obj.Key,
      size: obj.Size,
      lastModified: obj.LastModified,
    })) ?? []
  );
}
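
For example, you might list everything under a single user's folder, assuming the uploads/<userId>/ key scheme used in the API routes below (a small sketch building on listFiles above):

// Sketch: list one user's uploads, assuming keys follow uploads/<userId>/...
export async function listUserFiles(userId: string) {
  return listFiles(`uploads/${userId}/`);
}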

Signed URLs

Pre-signed Upload URL

Allow clients to upload directly to R2:

import { PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

import { r2, bucketName } from "@/lib/storage/r2";

export async function getUploadUrl(key: string, contentType: string) {
  const command = new PutObjectCommand({
    Bucket: bucketName,
    Key: key,
    ContentType: contentType,
  });

  const url = await getSignedUrl(r2, command, {
    expiresIn: 3600, // 1 hour
  });

  return url;
}

Pre-signed Download URL

For private files:

import { GetObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

import { r2, bucketName } from "@/lib/storage/r2";

export async function getDownloadUrl(key: string) {
  const command = new GetObjectCommand({
    Bucket: bucketName,
    Key: key,
  });

  const url = await getSignedUrl(r2, command, {
    expiresIn: 3600, // 1 hour
  });

  return url;
}
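
A route can hand this signed URL to an authenticated user. The following is only a sketch: the route path is hypothetical, and it reuses the auth helper from the API routes below plus the uploads/<userId>/ key scheme.

// app/api/download/route.ts — hypothetical sketch
import { NextResponse } from "next/server";
import { headers } from "next/headers";
import { auth } from "@/lib/auth/server";
import { getDownloadUrl } from "@/lib/storage/r2";

export async function POST(req: Request) {
  const session = await auth.api.getSession({ headers: await headers() });

  if (!session) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  const { key } = await req.json();

  // Only allow access to the caller's own files (keys under uploads/<userId>/)
  if (typeof key !== "string" || !key.startsWith(`uploads/${session.user.id}/`)) {
    return NextResponse.json({ error: "Forbidden" }, { status: 403 });
  }

  const url = await getDownloadUrl(key);
  return NextResponse.json({ url });
}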

API Routes

Upload Endpoint

// app/api/upload/route.ts
import { NextResponse } from "next/server";
import { auth } from "@/lib/auth/server";
import { getUploadUrl, publicUrl } from "@/lib/storage/r2";
import { headers } from "next/headers";

export async function POST(req: Request) {
  const session = await auth.api.getSession({
    headers: await headers(),
  });

  if (!session) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  const { filename, contentType } = await req.json();

  // Generate unique key
  const key = `uploads/${session.user.id}/${Date.now()}-${filename}`;

  const uploadUrl = await getUploadUrl(key, contentType);

  return NextResponse.json({
    uploadUrl,
    key,
    publicUrl: `${publicUrl}/${key}`,
  });
}

Direct File Upload

// app/api/files/route.ts
import { NextResponse } from "next/server";
import { auth } from "@/lib/auth/server";
import { uploadFile } from "@/lib/storage/r2";
import { headers } from "next/headers";

export async function POST(req: Request) {
  const session = await auth.api.getSession({
    headers: await headers(),
  });

  if (!session) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  const formData = await req.formData();
  const file = formData.get("file") as File;

  if (!file) {
    return NextResponse.json({ error: "No file provided" }, { status: 400 });
  }

  const bytes = await file.arrayBuffer();
  const buffer = Buffer.from(bytes);

  const key = `uploads/${session.user.id}/${Date.now()}-${file.name}`;
  const url = await uploadFile(key, buffer, file.type);

  return NextResponse.json({ url, key });
}

Client-Side Upload

Using Pre-signed URLs

"use client"; import { useState } from "react"; export function FileUploader() { const [uploading, setUploading] = useState(false); const handleUpload = async (file: File) => { setUploading(true); try { // Get pre-signed URL const response = await fetch("/api/upload", { method: "POST", headers: { "Content-Type": "application/json" }, body: JSON.stringify({ filename: file.name, contentType: file.type, }), }); const { uploadUrl, publicUrl } = await response.json(); // Upload directly to R2 await fetch(uploadUrl, { method: "PUT", body: file, headers: { "Content-Type": file.type, }, }); console.log("File uploaded:", publicUrl); return publicUrl; } finally { setUploading(false); } }; return ( <input type="file" onChange={(e) => { const file = e.target.files?.[0]; if (file) handleUpload(file); }} disabled={uploading} /> ); }

With Progress Tracking

"use client"; import { useState } from "react"; export function FileUploaderWithProgress() { const [progress, setProgress] = useState(0); const handleUpload = async (file: File) => { const { uploadUrl } = await getUploadUrl(file); return new Promise((resolve, reject) => { const xhr = new XMLHttpRequest(); xhr.upload.addEventListener("progress", (e) => { if (e.lengthComputable) { setProgress(Math.round((e.loaded / e.total) * 100)); } }); xhr.addEventListener("load", () => { if (xhr.status === 200) { resolve(xhr.response); } else { reject(new Error("Upload failed")); } }); xhr.open("PUT", uploadUrl); xhr.setRequestHeader("Content-Type", file.type); xhr.send(file); }); }; return ( <div> <input type="file" onChange={(e) => handleUpload(e.target.files?.[0]!)} /> {progress > 0 && <progress value={progress} max="100" />} </div> ); }

Image Processing

Resize Images Before Upload

export async function resizeImage(
  file: File,
  maxWidth: number,
  maxHeight: number
): Promise<Blob> {
  return new Promise((resolve) => {
    const img = new Image();
    const canvas = document.createElement("canvas");
    const ctx = canvas.getContext("2d")!;

    img.onload = () => {
      let { width, height } = img;

      if (width > maxWidth) {
        height = (height * maxWidth) / width;
        width = maxWidth;
      }
      if (height > maxHeight) {
        width = (width * maxHeight) / height;
        height = maxHeight;
      }

      canvas.width = width;
      canvas.height = height;
      ctx.drawImage(img, 0, 0, width, height);

      canvas.toBlob((blob) => resolve(blob!), file.type, 0.9);
    };

    img.src = URL.createObjectURL(file);
  });
}
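
The resized blob can then go through the same pre-signed upload flow. A sketch that reuses the /api/upload route from above (the 1600px limits are placeholder values, not something the boilerplate prescribes):

// Sketch: resize on the client, then upload via a pre-signed URL
async function uploadResized(file: File) {
  const resized = await resizeImage(file, 1600, 1600);

  // Ask the server for a signed upload URL (same endpoint as the uploader above)
  const res = await fetch("/api/upload", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ filename: file.name, contentType: file.type }),
  });
  const { uploadUrl, publicUrl } = await res.json();

  // PUT the resized blob straight to R2
  await fetch(uploadUrl, {
    method: "PUT",
    body: resized,
    headers: { "Content-Type": file.type },
  });

  return publicUrl;
}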

User Storage Quotas

Track and enforce storage limits:

// lib/storage/quota.ts
import { db } from "@/lib/db/client";
import { files, users } from "@/lib/db/schema";
import { eq, sum } from "drizzle-orm";
import { plans } from "@/lib/payments/plans";

export async function getUserStorageUsage(userId: string) {
  const [result] = await db
    .select({ total: sum(files.size) })
    .from(files)
    .where(eq(files.userId, userId));

  return Number(result?.total ?? 0);
}

export async function canUpload(userId: string, fileSize: number) {
  const user = await db.query.users.findFirst({
    where: eq(users.id, userId),
  });

  const plan = plans[user?.plan ?? "free"];
  const usage = await getUserStorageUsage(userId);

  return usage + fileSize <= plan.limits.storage;
}
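
The quota check slots into the upload endpoint before a pre-signed URL is issued. The following is a sketch of that variant; it assumes the client also sends the file size, which the original endpoint does not require.

// app/api/upload/route.ts — sketch with quota enforcement (assumes the client sends size)
import { NextResponse } from "next/server";
import { headers } from "next/headers";
import { auth } from "@/lib/auth/server";
import { getUploadUrl } from "@/lib/storage/r2";
import { canUpload } from "@/lib/storage/quota";

export async function POST(req: Request) {
  const session = await auth.api.getSession({ headers: await headers() });

  if (!session) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  const { filename, contentType, size } = await req.json();

  // Reject the upload before signing if it would exceed the user's plan limit
  if (!(await canUpload(session.user.id, size))) {
    return NextResponse.json({ error: "Storage limit exceeded" }, { status: 403 });
  }

  const key = `uploads/${session.user.id}/${Date.now()}-${filename}`;
  const uploadUrl = await getUploadUrl(key, contentType);

  return NextResponse.json({ uploadUrl, key });
}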

Best Practices

Tips for production storage (a validation sketch covering the first two follows the list):
  1. Validate file types - Check MIME types before upload
  2. Limit file sizes - Enforce maximum file sizes
  3. Generate unique keys - Prevent overwrites
  4. Use folders - Organize files by user/type
  5. Track in database - Store file metadata
  6. Clean up - Delete files when users delete accounts
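
A rough sketch of the first two tips, to run server-side before accepting an upload or issuing a signed URL. The allowed types and size limit here are placeholders, not values the boilerplate prescribes:

// Sketch: validate MIME type and size before touching R2
const ALLOWED_TYPES = ["image/jpeg", "image/png", "image/webp", "application/pdf"];
const MAX_FILE_SIZE = 10 * 1024 * 1024; // 10 MB

export function validateUpload(contentType: string, size: number) {
  if (!ALLOWED_TYPES.includes(contentType)) {
    return { ok: false as const, error: "Unsupported file type" };
  }
  if (size > MAX_FILE_SIZE) {
    return { ok: false as const, error: "File too large" };
  }
  return { ok: true as const };
}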
