# Epic 1: Foundation & Core Infrastructure
**Epic Goal:** Set up the complete development environment and project infrastructure including Git repository, frontend/backend scaffolding, database schema, Docker configuration, and a health check endpoint to validate the entire stack works end-to-end before building features.
## Story 1.1: Project Initialization & Repository Setup
As a **developer**,
I want **a properly structured monorepo with frontend and backend directories**,
so that **I have a clean foundation for building the application with all necessary tooling configured**.
**Acceptance Criteria:**
1. Git repository initialized with appropriate `.gitignore` for Node.js, React, and environment files
2. Directory structure matches Technical Assumptions section (frontend/, backend/, docs/)
3. README.md created with project overview, setup instructions, and tech stack documentation
4. `.env.example` file created listing all required environment variables (see the sketch after this list)
5. Root-level `package.json` created (if using monorepo workspace tools) or separate `package.json` files in `frontend/` and `backend/`
6. EditorConfig or Prettier config files added for consistent code formatting
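Criterion 4 might look like the following; a minimal sketch, assuming variable names drawn from later stories (`PORT` from Story 1.3, `DATABASE_URL` from Story 1.5, Postgres credentials from Story 1.6) with placeholder values:
```bash
# Backend (Story 1.3): port the Express server listens on
PORT=3000

# Prisma connection string (Story 1.5) -- database name is a placeholder
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/books?schema=public

# Postgres container credentials (Story 1.6) -- placeholders, change per environment
POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres
POSTGRES_DB=books
```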
## Story 1.2: Frontend Scaffolding with Vite + React
As a **developer**,
I want **a Vite + React application scaffold with Tailwind CSS configured**,
so that **I can start building UI components with the chosen tech stack**.
**Acceptance Criteria:**
1. Vite project created in `frontend/` directory using `npm create vite@latest`
2. React 18+ configured as the framework
3. Tailwind CSS installed and configured (`tailwind.config.js`, `postcss.config.js`)
4. Base styles imported in main CSS file (`@tailwind base; @tailwind components; @tailwind utilities;`)
5. Sample component renders successfully, demonstrating that Tailwind styles are applied (see the sketch after this list)
6. ESLint and Prettier configured for React/JavaScript
7. Development server runs successfully on `http://localhost:5173` (or configured port)
8. `package.json` scripts include: `dev`, `build`, `preview`, `lint`
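The sample component in criterion 5 can be tiny; a sketch, assuming `frontend/src/App.jsx` as the entry component (the file name and copy are illustrative):
```jsx
// frontend/src/App.jsx -- illustrative only. Styled entirely with Tailwind utility
// classes, so a styled card on screen proves the Tailwind pipeline works.
export default function App() {
  return (
    <main className="flex min-h-screen items-center justify-center bg-slate-100">
      <div className="rounded-lg bg-white p-8 shadow-md">
        <h1 className="text-2xl font-bold text-slate-800">Tailwind is working</h1>
        <p className="mt-2 text-slate-600">
          If this card is styled, criterion 5 passes.
        </p>
      </div>
    </main>
  );
}
```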
## Story 1.3: Backend Scaffolding with Node.js + Express
As a **developer**,
I want **a Node.js Express API scaffold with basic middleware configured**,
so that **I can start building REST API endpoints**.
**Acceptance Criteria:**
1. Express application created in `backend/src/server.js` (a sketch follows this list)
2. Required dependencies installed: `express`, `cors`, `helmet`, `dotenv`, `express-validator`
3. Middleware configured: CORS (allowing frontend origin), Helmet (security headers), JSON body parser
4. Environment configuration loaded via `dotenv` at startup
5. Server listens on port from environment variable (default: 3000)
6. Basic error handling middleware implemented (catches errors, returns JSON response)
7. Health check endpoint created: `GET /api/health` returns `{ status: "ok", timestamp: "..." }`
8. `package.json` scripts include: `dev` (nodemon for auto-reload), `start` (production)
9. ESLint configured for Node.js/Express patterns
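A sketch of `backend/src/server.js` covering criteria 1-7; the CORS origin variable (`FRONTEND_ORIGIN`) and the error response shape are assumptions, not settled decisions:
```javascript
// backend/src/server.js -- minimal sketch, not the final implementation
require('dotenv').config(); // criterion 4: load .env before anything reads process.env

const express = require('express');
const cors = require('cors');
const helmet = require('helmet');

const app = express();

// Criterion 3: security headers, CORS limited to the frontend origin, JSON bodies.
// FRONTEND_ORIGIN is a hypothetical variable name; the default is Vite's dev port.
app.use(helmet());
app.use(cors({ origin: process.env.FRONTEND_ORIGIN || 'http://localhost:5173' }));
app.use(express.json());

// Criterion 7: health check for smoke tests and container monitoring
app.get('/api/health', (req, res) => {
  res.json({ status: 'ok', timestamp: new Date().toISOString() });
});

// Criterion 6: error handler -- must be registered last and take four arguments
app.use((err, req, res, next) => {
  console.error(err);
  res.status(err.status || 500).json({ error: err.message || 'Internal Server Error' });
});

// Criterion 5: port from the environment with a 3000 fallback
const port = process.env.PORT || 3000;
app.listen(port, () => console.log(`API listening on port ${port}`));
```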
## Story 1.4: Database Schema Definition with Prisma
As a **developer**,
I want **a Prisma schema defining `Book` and `ReadingLog` models**,
so that **I have a clear data model ready for implementation**.
**Acceptance Criteria:**
1. Prisma installed in `backend/` directory (`npm install prisma @prisma/client`)
2. Prisma initialized with PostgreSQL provider (`npx prisma init`)
3. `prisma/schema.prisma` file defines the following models:
**`Book` model:**
```prisma
model Book {
  id           Int          @id @default(autoincrement())
  title        String       @db.VarChar(500)
  author       String?      @db.VarChar(500)
  totalPages   Int
  coverUrl     String?      @db.VarChar(1000)
  deadlineDate DateTime     @db.Date
  isPrimary    Boolean      @default(false)
  status       String       @default("reading") @db.VarChar(50)
  createdAt    DateTime     @default(now())
  updatedAt    DateTime     @updatedAt
  readingLogs  ReadingLog[]

  @@index([deadlineDate])
  @@index([status])
}
```
**`ReadingLog` model:**
```prisma
model ReadingLog {
  id          Int      @id @default(autoincrement())
  bookId      Int
  logDate     DateTime @db.Date
  currentPage Int
  createdAt   DateTime @default(now())
  updatedAt   DateTime @updatedAt
  book        Book     @relation(fields: [bookId], references: [id], onDelete: Cascade)

  @@unique([bookId, logDate])
  @@index([bookId])
  @@index([logDate])
}
```
4. Schema includes appropriate indexes for performance (`bookId`, `logDate`, `deadlineDate`, `status`)
5. Relationship defined: Book has many ReadingLogs (one-to-many)
6. Unique constraint on `ReadingLog` ensures one log entry per book per day (see the upsert sketch after this list)
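The compound unique constraint is what makes daily logging idempotent: writing progress twice in one day should update the same row, not create a second one. A sketch with the generated Prisma Client, assuming default client generation (Prisma derives the compound-unique input name `bookId_logDate` from `@@unique([bookId, logDate])`):
```javascript
const { PrismaClient } = require('@prisma/client');
const prisma = new PrismaClient();

// Record (or correct) a day's progress for a book. logDate should be a Date
// normalized to the day, since the column is @db.Date.
async function logProgress(bookId, logDate, currentPage) {
  return prisma.readingLog.upsert({
    where: { bookId_logDate: { bookId, logDate } }, // compound-unique lookup
    update: { currentPage },                        // same day: update in place
    create: { bookId, logDate, currentPage },       // first log of the day: insert
  });
}
```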
## Story 1.5: Database Setup & Initial Migration
As a **developer**,
I want **the database schema migrated to a PostgreSQL database**,
so that **the application can persist data**.
**Acceptance Criteria:**
1. `DATABASE_URL` environment variable configured in `.env` file
2. Prisma migration created: `npx prisma migrate dev --name init`
3. Migration successfully creates `Book` and `ReadingLog` tables in PostgreSQL
4. Prisma Client generated and can be imported in backend code
5. Database connection verified (Prisma Client can connect and run a query; see the sketch after this list)
6. Migration files committed to Git (`prisma/migrations/`)
7. Instructions in README.md for running migrations in new environments
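Criterion 5 can be verified with a throwaway script; a sketch, assuming a hypothetical `backend/scripts/check-db.js` path:
```javascript
// backend/scripts/check-db.js -- hypothetical path; run with `node scripts/check-db.js`
const { PrismaClient } = require('@prisma/client');
const prisma = new PrismaClient();

async function main() {
  // Round-trips a trivial query, proving DATABASE_URL and connectivity work
  await prisma.$queryRaw`SELECT 1`;
  // Counting a table proves the init migration actually created it
  const books = await prisma.book.count();
  console.log(`Connected. Book rows: ${books}`);
}

main()
  .catch((err) => { console.error(err); process.exit(1); })
  .finally(() => prisma.$disconnect());
```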
## Story 1.6: Docker Configuration for Local Development
As a **developer**,
I want **Docker Compose configuration for running the full stack locally**,
so that **I can develop and test the application in a containerized environment matching production**.
**Acceptance Criteria:**
1. `Dockerfile` created in `frontend/` directory:
- Multi-stage build (build stage + nginx/serve stage for production)
- Development stage uses Vite dev server
- Production stage serves static build files
2. `Dockerfile` created in `backend/` directory:
- Node.js base image (node:20-alpine or similar)
- Installs dependencies and runs Prisma generate
- Exposes API port
- Runs `npm start` command
3. `docker-compose.yml` created in project root defining services (sketched after this list):
- **postgres:** PostgreSQL 15+ with volume for data persistence
- **backend:** Builds from backend/Dockerfile, depends on postgres, exposes API port
- **frontend:** Builds from frontend/Dockerfile, exposes dev server port, depends on backend
- Environment variables configured via `.env` file
4. `docker-compose up` successfully starts all three containers
5. Frontend accessible at `http://localhost:5173` (or configured port)
6. Backend API accessible at `http://localhost:3000/api/health`
7. Database accessible on PostgreSQL port with credentials from .env
8. Hot reload works for both frontend and backend in development mode
9. README.md updated with Docker setup and usage instructions
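A hedged sketch of the development `docker-compose.yml` from criterion 3; service names, port mappings, and the volume name are assumptions to adjust:
```yaml
# docker-compose.yml -- development sketch; names and ports are assumptions
services:
  postgres:
    image: postgres:15-alpine
    environment:
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_DB: ${POSTGRES_DB}
    volumes:
      - pgdata:/var/lib/postgresql/data # criterion 3: data persists across restarts
    ports:
      - "5432:5432"

  backend:
    build: ./backend
    depends_on:
      - postgres
    environment:
      DATABASE_URL: ${DATABASE_URL}
      PORT: 3000
    ports:
      - "3000:3000"

  frontend:
    build: ./frontend
    depends_on:
      - backend
    ports:
      - "5173:5173"

volumes:
  pgdata:
```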
## Story 1.7: CI/CD Deployment Configuration for Coolify
As a **developer**,
I want **deployment configuration ready for Coolify**,
so that **I can deploy the application to production infrastructure**.
**Acceptance Criteria:**
1. Production `docker-compose.yml` or Coolify configuration file created
2. Environment variable template documented for Coolify deployment
3. Build and deployment process documented in `docs/deployment.md`:
- How to configure Coolify project
- Required environment variables
- Database connection setup
- SSL/HTTPS configuration (handled by Coolify)
- Health check endpoints for monitoring
4. Production Dockerfiles optimized (multi-stage builds, minimal image size)
5. Database migration strategy documented (how to run migrations on deployment)
6. Backup and restore procedures documented for PostgreSQL (example commands follow)
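For criterion 6, standard Postgres tooling works against the containerized database; a sketch, assuming the `postgres` service name and credentials from the compose example in Story 1.6:
```bash
# Backup: dump the database from inside the running postgres container
docker compose exec postgres sh -c 'pg_dump -U "$POSTGRES_USER" "$POSTGRES_DB"' > backup.sql

# Restore: feed the dump back through psql (assumes an empty target database)
docker compose exec -T postgres sh -c 'psql -U "$POSTGRES_USER" "$POSTGRES_DB"' < backup.sql
```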
---