A file storage service implementing multipart upload with AWS S3, real-time processing via SQS, and PostgreSQL metadata management.
- Client requests presigned URLs for multipart upload
- File is split into 5MB chunks and uploaded directly to S3
- Each chunk completion is recorded in PostgreSQL
- After all chunks complete, multipart upload is finalized
- S3 sends an ObjectCreated event to SQS
- Background worker polls SQS and updates the file status to UPLOADED (see the client-side sketch below)
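The flow above can be sketched from the client side as follows. The request and response field names (fileId, presignedUrls, parts) are assumptions for illustration, not the documented API contract:

// Minimal client-side sketch of the upload flow above.
// Field names such as fileId and presignedUrls are illustrative assumptions.
const CHUNK_SIZE = 5 * 1024 * 1024; // 5MB, matching the backend chunk size

async function uploadFile(file: File): Promise<void> {
  const chunkCount = Math.ceil(file.size / CHUNK_SIZE);

  // 1. Ask the backend to start a multipart upload and presign one URL per chunk
  const initRes = await fetch('/api/files/get-upload-urls', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    credentials: 'include', // JWT is carried in an HTTP-only cookie
    body: JSON.stringify({ fileName: file.name, mimeType: file.type, size: file.size, chunkCount }),
  });
  const { fileId, presignedUrls } = await initRes.json();

  // 2. PUT each 5MB chunk directly to S3 and record it with the backend
  // (reading the ETag header assumes the S3 CORS config exposes it)
  const parts: { PartNumber: number; ETag: string }[] = [];
  for (let i = 0; i < chunkCount; i++) {
    const chunk = file.slice(i * CHUNK_SIZE, (i + 1) * CHUNK_SIZE);
    const s3Res = await fetch(presignedUrls[i], { method: 'PUT', body: chunk });
    const etag = s3Res.headers.get('ETag') ?? '';
    parts.push({ PartNumber: i + 1, ETag: etag });

    await fetch('/api/files/record-chunk', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      credentials: 'include',
      body: JSON.stringify({ fileId, chunkIndex: i, size: chunk.size, etag }),
    });
  }

  // 3. Finalize the multipart upload; S3 then emits ObjectCreated to SQS
  await fetch('/api/files/complete-upload', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    credentials: 'include',
    body: JSON.stringify({ fileId, parts }),
  });
}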
Backend
- Node.js with Express
- TypeScript
- Prisma ORM with PostgreSQL
- AWS SDK v3 (S3, SQS)
- Redis for upload session management
- JWT authentication with RS256
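On the backend, presigning one upload-part URL per chunk with AWS SDK v3 might look roughly like this; the function name and return shape are illustrative, not the service's actual code:

// Hypothetical sketch of presigning upload-part URLs with AWS SDK v3.
import {
  S3Client,
  CreateMultipartUploadCommand,
  UploadPartCommand,
} from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';

const s3 = new S3Client({ region: process.env.AWS_REGION });

export async function createUploadUrls(s3Key: string, chunkCount: number) {
  // Start the multipart upload; S3 returns an UploadId used for every part
  const { UploadId } = await s3.send(
    new CreateMultipartUploadCommand({ Bucket: process.env.S3_BUCKET, Key: s3Key })
  );

  // Presign one UploadPart URL per 5MB chunk
  const presignedUrls = await Promise.all(
    Array.from({ length: chunkCount }, (_, i) =>
      getSignedUrl(
        s3,
        new UploadPartCommand({
          Bucket: process.env.S3_BUCKET,
          Key: s3Key,
          UploadId,
          PartNumber: i + 1,
        }),
        { expiresIn: 3600 } // matches the 1-hour presigned URL expiry configured below
      )
    )
  );

  return { uploadId: UploadId, presignedUrls };
}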
Frontend
- TypeScript
- Vite build tool
- Native fetch API for uploads
Infrastructure
- AWS S3 for object storage
- AWS SQS for event processing
- PostgreSQL for metadata
- Redis for caching
User {
id, email, name, passwordHash, createdAt
}
FileMetadata {
fileId, fileName, mimeType, size, s3Key, status, userId, createdAt
status: UPLOADING | UPLOADED | FAILED
}
Chunk {
id, fileId, chunkIndex, size, s3Key, checksum, status, createdAt
status: PENDING | COMPLETED | FAILED
}
- Node.js >= 18
- PostgreSQL database
- Redis instance
- AWS account with S3 and SQS configured
Create backend/.env:
# AWS Configuration
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
AWS_REGION=your_region
S3_BUCKET=your_bucket_name
SQS_QUEUE_URL=your_sqs_queue_url
# Database
CLOUD_DB_URI=postgresql://user:password@host:port/database
# Redis
CLOUD_RD_URI=redis://host:port
# JWT Keys (RS256)
PRIVATE_KEY=your_private_key
PUBLIC_KEY=your_public_key
# Optional: OAuth
GOOGLE_CLIENT_ID=your_client_id
GOOGLE_CLIENT_SECRET=your_client_secret
# Backend
cd backend
npm install
npx prisma generate
npx prisma migrate deploy
npm run dev
# Frontend
cd frontend
npm install
npm run dev
POST /api/auth/signup - User registration
POST /api/auth/signin - User login
POST /api/files/get-upload-urls - Initialize multipart upload
POST /api/files/record-chunk - Record chunk metadata
POST /api/files/complete-upload - Finalize multipart upload
POST /api/files/get-download-url - Generate download URL
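As a rough illustration of how record-chunk could persist chunk metadata with Express and Prisma (route wiring, middleware, and exact request fields are assumptions):

// Illustrative sketch of the record-chunk handler; assumes express.json() is applied.
import { Router } from 'express';
import { PrismaClient } from '@prisma/client';

const prisma = new PrismaClient();
const router = Router();

router.post('/api/files/record-chunk', async (req, res) => {
  const { fileId, chunkIndex, size, s3Key, checksum } = req.body;

  // Persist the chunk row so upload progress survives across requests
  const chunk = await prisma.chunk.create({
    data: { fileId, chunkIndex, size, s3Key, checksum, status: 'COMPLETED' },
  });

  res.status(201).json({ chunkId: chunk.id });
});

export default router;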
- Chunked file upload (5MB chunks, max 1GB files)
- Direct-to-S3 uploads using presigned URLs
- Automatic retry logic with exponential backoff
- Upload cancellation support
- Redis-based upload session management (24hr TTL)
- Background SQS polling for S3 events
- JWT-based authentication with HTTP-only cookies
- Comprehensive error handling and logging
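The retry behavior listed above can be sketched as a small helper around fetch; the attempt count and delays are illustrative rather than the configured values:

// Sketch of retry with exponential backoff for a single chunk upload.
async function putChunkWithRetry(url: string, chunk: Blob, maxAttempts = 4): Promise<Response> {
  let delayMs = 1000;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      const res = await fetch(url, { method: 'PUT', body: chunk });
      if (res.ok) return res;
    } catch {
      // Network error: fall through to the backoff below
    }
    if (attempt === maxAttempts) break;
    // Wait, then double the delay before the next attempt
    await new Promise((resolve) => setTimeout(resolve, delayMs));
    delayMs *= 2;
  }
  throw new Error(`Chunk upload failed after ${maxAttempts} attempts`);
}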
- Chunk size: 5MB
- Max file size: 1GB
- Presigned URL expiry: 1 hour
- JWT token expiry: 15 minutes
- Redis TTL: 24 hours
- SQS polling interval: 20 seconds
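Putting the SQS settings together, a minimal polling worker might look like this, assuming the S3 event body carries the object key and files are matched by s3Key:

// Sketch of the background worker; event parsing and lookup strategy are assumptions.
import { SQSClient, ReceiveMessageCommand, DeleteMessageCommand } from '@aws-sdk/client-sqs';
import { PrismaClient } from '@prisma/client';

const sqs = new SQSClient({ region: process.env.AWS_REGION });
const prisma = new PrismaClient();

async function pollOnce(): Promise<void> {
  const { Messages } = await sqs.send(
    new ReceiveMessageCommand({
      QueueUrl: process.env.SQS_QUEUE_URL,
      MaxNumberOfMessages: 10,
      WaitTimeSeconds: 20, // long polling, matching the 20-second interval above
    })
  );

  for (const message of Messages ?? []) {
    const event = JSON.parse(message.Body ?? '{}');
    const s3Key = event?.Records?.[0]?.s3?.object?.key;

    if (s3Key) {
      // Mark the file as uploaded once S3 confirms the object exists
      await prisma.fileMetadata.updateMany({
        where: { s3Key },
        data: { status: 'UPLOADED' },
      });
    }

    await sqs.send(
      new DeleteMessageCommand({
        QueueUrl: process.env.SQS_QUEUE_URL,
        ReceiptHandle: message.ReceiptHandle,
      })
    );
  }
}

// Long polling already waits up to 20 seconds per receive call
(async function runWorker() {
  for (;;) {
    try {
      await pollOnce();
    } catch (err) {
      console.error('SQS poll failed', err);
    }
  }
})();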
# Backend
npm run dev # Start dev server with hot reload
npm run build # Compile TypeScript
npm start # Run production build
# Frontend
npm run dev # Start dev server
npm run build # Production build
npm run preview  # Preview production build
- All passwords are hashed using bcrypt (12 rounds)
- JWT tokens are signed with RS256 (asymmetric RSA key pair)
- Cookies are HTTP-only with secure flag in production
- S3 uploads use time-limited presigned URLs
- Database queries use Prisma's parameterized statements
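A minimal sketch of the token issuance and password hashing described above, assuming jsonwebtoken for RS256 signing (bcrypt is the documented hashing library):

// Illustrative only; cookie name and payload shape are assumptions.
import jwt from 'jsonwebtoken';
import bcrypt from 'bcrypt';
import type { Response } from 'express';

export async function hashPassword(plain: string): Promise<string> {
  return bcrypt.hash(plain, 12); // 12 salt rounds, as noted above
}

export function issueToken(userId: string, res: Response): void {
  // Sign with the RSA private key; the public key verifies the token on each request
  const token = jwt.sign({ sub: userId }, process.env.PRIVATE_KEY as string, {
    algorithm: 'RS256',
    expiresIn: '15m', // matches the 15-minute token expiry
  });

  res.cookie('token', token, {
    httpOnly: true,
    secure: process.env.NODE_ENV === 'production',
    sameSite: 'strict',
  });
}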
- Single file upload only (no concurrent uploads)
- Maximum file size: 1GB
- Chunk size fixed at 5MB
- No resumable uploads on session loss
- Upload metadata cleanup requires manual intervention on failure
