mirror of https://github.com/kjanat/livedash-node.git — synced 2026-01-16 12:12:09 +01:00

fix: resolved biome errors

CLAUDE.md (265)
@@ -6,57 +6,57 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co

**Core Development:**

- `pnpm dev` - Start development server (runs custom server.ts with schedulers)
- `pnpm dev:next-only` - Start Next.js only with Turbopack (no schedulers)
- `pnpm build` - Build production application
- `pnpm start` - Run production server

**Code Quality:**

- `pnpm lint` - Run ESLint
- `pnpm lint:fix` - Fix ESLint issues automatically
- `pnpm format` - Format code with Prettier
- `pnpm format:check` - Check formatting without fixing

**Database:**

- `pnpm prisma:generate` - Generate Prisma client
- `pnpm prisma:migrate` - Run database migrations
- `pnpm prisma:push` - Push schema changes to database
- `pnpm prisma:push:force` - Force reset database and push schema
- `pnpm prisma:seed` - Seed database with initial data
- `pnpm prisma:studio` - Open Prisma Studio database viewer

**Testing:**

- `pnpm test` - Run both Vitest and Playwright tests concurrently
- `pnpm test:vitest` - Run Vitest tests only
- `pnpm test:vitest:watch` - Run Vitest in watch mode
- `pnpm test:vitest:coverage` - Run Vitest with coverage report
- `pnpm test:coverage` - Run all tests with coverage

**Security Testing:**

- `pnpm test:security` - Run security-specific tests
- `pnpm test:security-headers` - Test HTTP security headers implementation
- `pnpm test:csp` - Test CSP implementation and nonce generation
- `pnpm test:csp:validate` - Validate CSP implementation with security scoring
- `pnpm test:csp:full` - Comprehensive CSP test suite

**Migration & Deployment:**

- `pnpm migration:backup` - Create database backup
- `pnpm migration:validate-db` - Validate database schema and integrity
- `pnpm migration:validate-env` - Validate environment configuration
- `pnpm migration:pre-check` - Run pre-deployment validation checks
- `pnpm migration:health-check` - Run system health checks
- `pnpm migration:deploy` - Execute full deployment process
- `pnpm migration:rollback` - Rollback failed migration

**Markdown:**

- `pnpm lint:md` - Lint Markdown files
- `pnpm lint:md:fix` - Fix Markdown linting issues

## Architecture Overview

@@ -64,138 +64,155 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co

### Tech Stack

- **Frontend:** Next.js 15 + React 19 + TailwindCSS 4
- **Backend:** Next.js API Routes + Custom Node.js server
- **Database:** PostgreSQL with Prisma ORM
- **Authentication:** NextAuth.js
- **AI Processing:** OpenAI API integration
- **Visualization:** D3.js, React Leaflet, Recharts
- **Scheduling:** Node-cron for background processing

### Key Architecture Components

**1. Multi-Stage Processing Pipeline**
The system processes user sessions through distinct stages tracked in `SessionProcessingStatus`:

- `CSV_IMPORT` - Import raw CSV data into `SessionImport`
- `TRANSCRIPT_FETCH` - Fetch transcript content from URLs
- `SESSION_CREATION` - Create normalized `Session` and `Message` records
- `AI_ANALYSIS` - AI processing for sentiment, categorization, summaries
- `QUESTION_EXTRACTION` - Extract questions from conversations

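A minimal sketch of how these stages order into a pipeline. The authoritative enum lives in `prisma/schema.prisma`; the `nextStage` helper below is hypothetical.

```typescript
// Illustrative only - the authoritative enum is defined in prisma/schema.prisma.
enum ProcessingStage {
  CSV_IMPORT = "CSV_IMPORT",
  TRANSCRIPT_FETCH = "TRANSCRIPT_FETCH",
  SESSION_CREATION = "SESSION_CREATION",
  AI_ANALYSIS = "AI_ANALYSIS",
  QUESTION_EXTRACTION = "QUESTION_EXTRACTION",
}

// Stages run strictly in this order, so the next stage for a session
// is the first one it has not yet completed (hypothetical helper).
const PIPELINE_ORDER: ProcessingStage[] = [
  ProcessingStage.CSV_IMPORT,
  ProcessingStage.TRANSCRIPT_FETCH,
  ProcessingStage.SESSION_CREATION,
  ProcessingStage.AI_ANALYSIS,
  ProcessingStage.QUESTION_EXTRACTION,
];

function nextStage(completed: Set<ProcessingStage>): ProcessingStage | null {
  return PIPELINE_ORDER.find((stage) => !completed.has(stage)) ?? null;
}
```
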
**2. Database Architecture**

- **Multi-tenant design** with `Company` as root entity
- **Dual storage pattern**: Raw CSV data in `SessionImport`, processed data in `Session`
- **1-to-1 relationship** between `SessionImport` and `Session` via `importId`
- **Message parsing** into individual `Message` records with order tracking
- **AI cost tracking** via `AIProcessingRequest` with detailed token usage
- **Flexible AI model management** through `AIModel`, `AIModelPricing`, and `CompanyAIModel`

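A rough TypeScript sketch of the dual-storage shape described above. All field names other than `importId` are assumptions; the real models are in `prisma/schema.prisma`.

```typescript
// Illustrative shapes only - see prisma/schema.prisma for the real models.
// Raw CSV rows land in SessionImport; processed data lives in Session.
interface SessionImport {
  id: string;
  companyId: string; // multi-tenant: every row belongs to a Company
  rawCsvRow: string; // assumed field name for the untouched CSV data
}

interface Session {
  id: string;
  companyId: string;
  importId: string; // 1-to-1 back-reference to the SessionImport row
  messages: Message[]; // transcript parsed into ordered messages
}

interface Message {
  id: string;
  sessionId: string;
  order: number; // preserves conversation ordering
  content: string;
}
```
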
**3. Custom Server Architecture**

- `server.ts` - Custom Next.js server with configurable scheduler initialization
- Three main schedulers: CSV import, import processing, and session processing (a fourth, the batch scheduler, is covered under Important Notes below)
- Environment-based configuration via `lib/env.ts`

**4. Key Processing Libraries**

- `lib/scheduler.ts` - CSV import scheduling
- `lib/importProcessor.ts` - Raw data to Session conversion
- `lib/processingScheduler.ts` - AI analysis pipeline
- `lib/transcriptFetcher.ts` - External transcript fetching
- `lib/transcriptParser.ts` - Message parsing from transcripts
- `lib/batchProcessor.ts` - OpenAI Batch API integration for cost-efficient processing
- `lib/batchScheduler.ts` - Automated batch job lifecycle management
- `lib/rateLimiter.ts` - In-memory rate limiting utility for API endpoints

### Development Environment

**Environment Configuration:**
Environment variables are managed through `lib/env.ts` with `.env.local` file support:

- Database: PostgreSQL via `DATABASE_URL` and `DATABASE_URL_DIRECT`
- Authentication: `NEXTAUTH_SECRET`, `NEXTAUTH_URL`
- AI Processing: `OPENAI_API_KEY`
- Schedulers: `SCHEDULER_ENABLED`, various interval configurations

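A minimal sketch of the kind of validation `lib/env.ts` might perform, assuming a Zod-based schema (Zod is already used elsewhere in the project); the variable names come from the list above, but the exact schema is an assumption.

```typescript
import { z } from "zod";

// Sketch of env validation in the style of lib/env.ts (exact schema assumed).
const envSchema = z.object({
  DATABASE_URL: z.string().url(),
  DATABASE_URL_DIRECT: z.string().url(),
  NEXTAUTH_SECRET: z.string().min(32),
  NEXTAUTH_URL: z.string().url(),
  OPENAI_API_KEY: z.string().min(1),
  SCHEDULER_ENABLED: z
    .enum(["true", "false"])
    .default("false")
    .transform((v) => v === "true"), // exposed to callers as a boolean
});

export const env = envSchema.parse(process.env);
```
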
**Key Files to Understand:**

- `prisma/schema.prisma` - Complete database schema with enums and relationships
- `server.ts` - Custom server entry point
- `lib/env.ts` - Environment variable management and validation
- `app/` - Next.js App Router structure

**Testing:**

- Uses Vitest for unit testing
- Playwright for E2E testing
- Test files in `tests/` directory

### Important Notes

**Scheduler System:**

- Schedulers are optional and controlled by `SCHEDULER_ENABLED` environment variable
- Use `pnpm dev:next-only` to run without schedulers for pure frontend development
- Four separate schedulers handle different pipeline stages:
  - CSV Import Scheduler (`lib/scheduler.ts`)
  - Import Processing Scheduler (`lib/importProcessor.ts`)
  - Session Processing Scheduler (`lib/processingScheduler.ts`)
  - Batch Processing Scheduler (`lib/batchScheduler.ts`) - Manages OpenAI Batch API lifecycle

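A sketch of how `server.ts` might gate scheduler startup on `SCHEDULER_ENABLED`, assuming the parsed boolean from `lib/env.ts`; the cron expression and the commented entry point are illustrative.

```typescript
import cron from "node-cron";
import { env } from "./lib/env";

// Conditional scheduler startup; interval and callee are illustrative.
export function startSchedulers(): void {
  if (!env.SCHEDULER_ENABLED) {
    console.log("Schedulers disabled (SCHEDULER_ENABLED is not true)");
    return;
  }

  // e.g. poll for new CSV files every 15 minutes (interval is configurable)
  cron.schedule("*/15 * * * *", async () => {
    // await runCsvImport(); // hypothetical entry point in lib/scheduler.ts
  });
}
```
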
**Database Migrations:**

- Always run `pnpm prisma:generate` after schema changes
- Use `pnpm prisma:migrate` for production-ready migrations
- Use `pnpm prisma:push` for development schema changes
- Database uses PostgreSQL with Prisma's driver adapter for connection pooling

**AI Processing:**

- All AI requests are tracked for cost analysis
- Support for multiple AI models per company
- Time-based pricing management for accurate cost calculation
- Processing stages can be retried on failure with retry count tracking
- **Batch API Integration**: 50% cost reduction using OpenAI Batch API
  - Automatic batching of AI requests every 5 minutes
  - Batch status checking every 2 minutes
  - Result processing every minute
  - Failed request retry with individual API calls

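The three batch cadences above map directly onto cron expressions. A sketch assuming node-cron; the function names are hypothetical, and the real lifecycle lives in `lib/batchScheduler.ts`.

```typescript
import cron from "node-cron";

// Hypothetical lifecycle entry points (declared for type-checking only).
declare function submitPendingBatches(): Promise<void>;
declare function checkBatchStatuses(): Promise<void>;
declare function processBatchResults(): Promise<void>;

cron.schedule("*/5 * * * *", () => submitPendingBatches()); // batch every 5 min
cron.schedule("*/2 * * * *", () => checkBatchStatuses()); // poll status every 2 min
cron.schedule("*/1 * * * *", () => processBatchResults()); // drain results every minute
```
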
**Code Quality Standards:**

- Run `pnpm lint` and `pnpm format:check` before committing
- TypeScript with ES modules (type: "module" in package.json)
- React 19 with Next.js 15 App Router
- TailwindCSS 4 for styling

**Security Features:**

- **Comprehensive CSRF Protection**: Multi-layer CSRF protection with automatic token management
  - Middleware-level protection for all state-changing endpoints
  - tRPC integration with CSRF-protected procedures
  - Client-side hooks and components for seamless integration
  - HTTP-only cookies with SameSite protection
- **Enhanced Content Security Policy (CSP)**:
  - Nonce-based script execution for maximum XSS protection
  - Environment-specific policies (strict production, permissive development)
  - Real-time violation reporting and bypass detection
  - Automated policy optimization recommendations
- **Security Monitoring & Audit System**:
  - Real-time threat detection and alerting
  - Comprehensive security audit logging with retention management
  - Geographic anomaly detection and IP threat analysis
  - Security scoring and automated incident response
- **Advanced Rate Limiting**: In-memory rate limiting system (see the sketch after this list)
  - Authentication endpoints: Login (5/15min), Registration (3/hour), Password Reset (5/15min)
  - CSP reporting: 10 reports per minute per IP
  - Admin endpoints: Configurable thresholds
- **Input Validation & Security Headers**:
  - Comprehensive Zod schemas for all user inputs with XSS/injection prevention
  - HTTP security headers (HSTS, X-Frame-Options, X-Content-Type-Options, Permissions Policy)
  - Strong password requirements and email validation
- **Session Security**:
  - JWT tokens with 24-hour expiration and secure cookie settings
  - HttpOnly, Secure, SameSite cookies with proper CSP integration
  - Company isolation and multi-tenant security

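As a rough illustration of the in-memory rate limiting above, a minimal fixed-window limiter sketch; the real `lib/rateLimiter.ts` may use a different algorithm, and the helper name is hypothetical. The "5 per 15 minutes" numbers come from the list.

```typescript
// Minimal fixed-window, in-memory rate limiter (sketch only).
type Window = { count: number; resetAt: number };

const windows = new Map<string, Window>();

export function isAllowed(
  key: string, // e.g. `${ip}:login`
  limit: number, // e.g. 5
  windowMs: number // e.g. 15 * 60 * 1000
): boolean {
  const now = Date.now();
  const w = windows.get(key);
  if (!w || now >= w.resetAt) {
    // First request in a fresh window: reset the counter.
    windows.set(key, { count: 1, resetAt: now + windowMs });
    return true;
  }
  if (w.count >= limit) return false; // window exhausted -> reject (429)
  w.count += 1;
  return true;
}

// Usage: gate a login attempt at 5 requests per 15 minutes per IP.
// if (!isAllowed(`${ip}:login`, 5, 15 * 60 * 1000)) return tooManyRequests();
```
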
**Code Quality & Linting:**

- **Biome Integration**: Primary linting and formatting tool
  - Pre-commit hooks enforce code quality standards
  - Some security-critical patterns require `biome-ignore` comments
  - Non-null assertions (`!`) used intentionally in authenticated contexts require ignore comments
  - Complex functions may need refactoring to meet complexity thresholds (max 15)
  - Performance classes use static-only patterns which may trigger warnings
- **TypeScript Strict Mode**: Comprehensive type checking
  - Avoid `any` types where possible; use proper type definitions
  - Optional chaining vs non-null assertions: choose based on security context
  - In authenticated API handlers, non-null assertions are often safer than optional chaining
- **Security vs Linting Balance**:
  - Security takes precedence over linting rules when they conflict
  - Document security-critical choices with detailed comments
  - Use `// biome-ignore` with explanations for intentional rule violations

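An example of the `biome-ignore` convention described above; the handler shape is illustrative, but the comment syntax and rule name are Biome's.

```typescript
interface AuthedContext {
  session: { user: { companyId: string } } | null;
}

// Auth middleware has already rejected unauthenticated requests, so the
// non-null assertion is intentional - and documented for Biome:
function getCompanyId(ctx: AuthedContext): string {
  // biome-ignore lint/style/noNonNullAssertion: session is guaranteed by auth middleware
  return ctx.session!.user.companyId;
}
```
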
@@ -11,32 +11,37 @@ This document summarizes the comprehensive documentation audit performed on the

The following areas were found to have comprehensive, accurate documentation:

1. **CSRF Protection** (`docs/CSRF_PROTECTION.md`)
   - Multi-layer protection implementation
   - Client-side integration guide
   - tRPC integration details
   - Comprehensive examples

2. **Enhanced CSP Implementation** (`docs/security/enhanced-csp.md`)
   - Nonce-based script execution
   - Environment-specific policies
   - Violation reporting and monitoring
   - Testing framework

3. **Security Headers** (`docs/security-headers.md`)
   - Complete header implementation details
   - Testing procedures
   - Compatibility information

4. **Security Monitoring System** (`docs/security-monitoring.md`)
   - Real-time threat detection
   - Alert management
   - API usage examples
   - Performance considerations

5. **Migration Guide** (`MIGRATION_GUIDE.md`)
   - Comprehensive v2.0.0 migration procedures
   - Rollback procedures
   - Health checks and validation

### Major Issues Identified ❌

@@ -44,70 +44,70 @@ The following areas were found to have comprehensive, accurate documentation:

**Problems Found:**

- Listed database as "SQLite (default)" when project uses PostgreSQL
- Missing all new security features (CSRF, CSP, security monitoring)
- Incomplete environment setup section
- Outdated tech stack (missing tRPC, security features)
- Project structure didn't reflect new admin/security directories

**Actions Taken:**

- ✅ Updated features section to include security and admin capabilities
- ✅ Corrected tech stack to include PostgreSQL, tRPC, security features
- ✅ Updated environment setup with proper PostgreSQL configuration
- ✅ Revised project structure to reflect current codebase
- ✅ Added comprehensive script documentation

#### 2. Undocumented API Endpoints

**Missing Documentation:**

- `/api/admin/audit-logs/` (GET) - Audit log retrieval with filtering
- `/api/admin/audit-logs/retention/` (POST) - Retention management
- `/api/admin/security-monitoring/` (GET/POST) - Security metrics and config
- `/api/admin/security-monitoring/alerts/` - Alert management
- `/api/admin/security-monitoring/export/` - Data export
- `/api/admin/security-monitoring/threat-analysis/` - Threat analysis
- `/api/admin/batch-monitoring/` - Batch processing monitoring
- `/api/csp-report/` (POST) - CSP violation reporting
- `/api/csp-metrics/` (GET) - CSP metrics and analytics
- `/api/csrf-token/` (GET) - CSRF token endpoint

**Actions Taken:**

- ✅ Created `docs/admin-audit-logs-api.md` - Comprehensive audit logs API documentation
- ✅ Created `docs/csp-metrics-api.md` - CSP monitoring and metrics API documentation
- ✅ Created `docs/api-reference.md` - Complete API reference for all endpoints

#### 3. Undocumented Features and Components

**Missing Feature Documentation:**

- Batch monitoring dashboard and UI components
- Security monitoring UI components
- Nonce-based CSP context provider
- Enhanced rate limiting system
- Security audit retention system

**Actions Taken:**

- ✅ Created `docs/batch-monitoring-dashboard.md` - Complete batch monitoring documentation

#### 4. CLAUDE.md - Missing New Commands

**Problems Found:**

- Missing security testing commands
- Missing CSP testing commands
- Missing migration/deployment commands
- Outdated security features section

**Actions Taken:**

- ✅ Added security testing command section
- ✅ Added CSP testing commands
- ✅ Added migration and deployment commands
- ✅ Updated security features section with comprehensive details

## New Documentation Created

@@ -117,14 +122,14 @@ The following areas were found to have comprehensive, accurate documentation:

**Contents:**

- Complete API endpoint documentation with examples
- Authentication and authorization requirements
- Query parameters and filtering options
- Response formats and error handling
- Retention management procedures
- Security features and rate limiting
- Usage examples and integration patterns
- Performance considerations and troubleshooting

### 2. CSP Metrics and Monitoring API Documentation

@@ -132,14 +137,14 @@ The following areas were found to have comprehensive, accurate documentation:

**Contents:**

- CSP violation reporting endpoint documentation
- Metrics API with real-time violation tracking
- Risk assessment and bypass detection features
- Policy optimization recommendations
- Configuration and setup instructions
- Performance considerations and security features
- Usage examples for monitoring and analysis
- Integration with existing security systems

### 3. Batch Monitoring Dashboard Documentation

@@ -147,14 +152,14 @@ The following areas were found to have comprehensive, accurate documentation:

**Contents:**

- Comprehensive batch processing monitoring guide
- Real-time monitoring capabilities and features
- API endpoints for batch job tracking
- Dashboard component documentation
- Performance analytics and cost analysis
- Administrative controls and error handling
- Configuration and alert management
- Troubleshooting and optimization guides

### 4. Complete API Reference

@@ -162,14 +167,14 @@ The following areas were found to have comprehensive, accurate documentation:

**Contents:**

- Comprehensive reference for all API endpoints
- Authentication and CSRF protection requirements
- Detailed request/response formats
- Error codes and status descriptions
- Rate limiting information
- Security headers and CORS configuration
- Pagination and filtering standards
- Testing and integration examples

## Updated Documentation

@@ -177,23 +182,23 @@ The following areas were found to have comprehensive, accurate documentation:

**Key Updates:**

- ✅ Updated project description to include security and admin features
- ✅ Corrected tech stack to reflect current implementation
- ✅ Fixed database information (PostgreSQL vs SQLite)
- ✅ Added comprehensive environment configuration
- ✅ Updated project structure to match current codebase
- ✅ Added security, migration, and testing command sections
- ✅ Enhanced features section with detailed capabilities

### 2. CLAUDE.md - Enhanced Developer Guide

**Key Updates:**

- ✅ Added security testing commands section
- ✅ Added CSP testing and validation commands
- ✅ Added migration and deployment commands
- ✅ Enhanced security features documentation
- ✅ Updated with comprehensive CSRF, CSP, and monitoring details

## Documentation Quality Assessment

@@ -212,53 +217,53 @@ The following areas were found to have comprehensive, accurate documentation:

All new and updated documentation follows these standards:

- ✅ Clear, actionable examples
- ✅ Comprehensive API documentation with request/response examples
- ✅ Security considerations and best practices
- ✅ Troubleshooting sections
- ✅ Integration patterns and usage examples
- ✅ Performance considerations
- ✅ Cross-references to related documentation

## Recommendations for Maintenance

### 1. Regular Review Schedule

- **Monthly**: Review API documentation for new endpoints
- **Quarterly**: Update security feature documentation
- **Per Release**: Validate all examples and code snippets
- **Annually**: Comprehensive documentation audit

### 2. Documentation Automation

- Add documentation checks to CI/CD pipeline
- Implement API documentation generation from OpenAPI specs
- Set up automated link checking
- Create documentation review templates

### 3. Developer Onboarding

- Use updated documentation for new developer onboarding
- Create documentation feedback process
- Maintain documentation contribution guidelines
- Track documentation usage and feedback

### 4. Continuous Improvement

- Monitor documentation gaps through developer feedback
- Update examples with real-world usage patterns
- Enhance troubleshooting sections based on support issues
- Keep security documentation current with threat landscape

## Summary

The documentation audit identified significant gaps in API documentation, outdated project information, and missing coverage of new security features. Through comprehensive updates and new documentation creation, the project now has:

- **Complete API Reference**: All endpoints documented with examples
- **Accurate Project Information**: README and CLAUDE.md reflect current state
- **Comprehensive Security Documentation**: All security features thoroughly documented
- **Developer-Friendly Guides**: Clear setup, testing, and deployment procedures
- **Administrative Documentation**: Complete coverage of admin and monitoring features

The documentation is now production-ready and provides comprehensive guidance for developers, administrators, and security teams working with the LiveDash-Node application.

@@ -20,29 +20,29 @@ Can't reach database server at `ep-tiny-math-a2zsshve-pooler.eu-central-1.aws.ne

### 1. Connection Retry Logic (`lib/database-retry.ts`)

- **Automatic retry** for connection errors
- **Exponential backoff** (1s → 2s → 4s → 10s max)
- **Smart error detection** (only retry connection issues)
- **Configurable retry attempts** (default: 3 retries)

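A minimal sketch matching the behavior described above, assuming the 1s → 2s → 4s backoff with a 10s cap and a simple connection-error heuristic; the real `lib/database-retry.ts` may differ.

```typescript
// Heuristic: only treat connection-level failures as retryable.
function isConnectionError(err: unknown): boolean {
  const msg = err instanceof Error ? err.message : String(err);
  return /Can't reach database server|ECONNREFUSED|ETIMEDOUT/i.test(msg);
}

export async function withRetry<T>(
  op: () => Promise<T>,
  maxRetries = 3
): Promise<T> {
  let delayMs = 1_000; // 1s -> 2s -> 4s, capped at 10s
  for (let attempt = 0; ; attempt++) {
    try {
      return await op();
    } catch (err) {
      // Give up on non-connection errors or once retries are exhausted.
      if (attempt >= maxRetries || !isConnectionError(err)) throw err;
      await new Promise((r) => setTimeout(r, delayMs));
      delayMs = Math.min(delayMs * 2, 10_000);
    }
  }
}
```
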
### 2. Enhanced Schedulers

- **Import Processor**: Added retry wrapper around main processing
- **Session Processor**: Added retry wrapper around AI processing
- **Graceful degradation** when database is temporarily unavailable

### 3. Singleton Pattern Enforced

- **All schedulers now use** `import { prisma } from "./prisma.js"`
- **No more separate** `new PrismaClient()` instances
- **Shared connection pool** across all operations

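This is the usual Next.js-safe singleton pattern: reuse one client (and its connection pool) across hot reloads instead of constructing new `PrismaClient` instances. A sketch of what `lib/prisma.ts` might look like; the real file may differ in detail.

```typescript
import { PrismaClient } from "@prisma/client";

// Stash the client on globalThis so dev-server hot reloads reuse it
// instead of opening a new connection pool each time.
const globalForPrisma = globalThis as unknown as { prisma?: PrismaClient };

export const prisma = globalForPrisma.prisma ?? new PrismaClient();

if (process.env.NODE_ENV !== "production") {
  globalForPrisma.prisma = prisma;
}
```
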
### 4. Neon-Specific Optimizations

- **Connection limit guidance**: 15 connections (below Neon's 20 limit)
- **Extended timeouts**: 30s for cold start handling
- **SSL mode requirements**: `sslmode=require` for Neon
- **Application naming**: For better monitoring

## Immediate Actions Needed

@@ -83,17 +83,17 @@ pnpm db:check

## Monitoring

- **Health Endpoint**: `/api/admin/database-health`
- **Connection Logs**: Enhanced logging for pool events
- **Retry Logs**: Detailed retry attempt logging
- **Error Classification**: Retryable vs non-retryable errors

## Files Modified

- `lib/database-retry.ts` - New retry utilities
- `lib/importProcessor.ts` - Added retry wrapper
- `lib/processingScheduler.ts` - Added retry wrapper
- `docs/neon-database-optimization.md` - Neon-specific guide
- `scripts/check-database-config.ts` - Configuration checker

The connection issues should be significantly reduced with these fixes! 🎯

@@ -8,43 +8,43 @@ This guide provides step-by-step instructions for migrating LiveDash Node to ver

### tRPC Implementation

- **Type-safe APIs**: End-to-end TypeScript safety from client to server
- **Improved Performance**: Optimized query batching and caching
- **Better Developer Experience**: Auto-completion and type checking
- **Simplified Authentication**: Integrated with existing NextAuth.js setup

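A minimal sketch of what the end-to-end type safety looks like in practice; the router and procedure here are hypothetical, not the project's actual API (the real routers live under `server/`).

```typescript
import { initTRPC } from "@trpc/server";
import { z } from "zod";

const t = initTRPC.create();

export const appRouter = t.router({
  // Hypothetical procedure: input is validated with Zod, and the client
  // infers both the input and output types automatically - no hand-written
  // API contracts to keep in sync.
  sessionById: t.procedure
    .input(z.object({ id: z.string() }))
    .query(({ input }) => {
      return { id: input.id, status: "AI_ANALYSIS" as const };
    }),
});

export type AppRouter = typeof appRouter;
```
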
### OpenAI Batch API Integration

- **50% Cost Reduction**: Batch processing reduces OpenAI API costs by half
- **Enhanced Rate Limiting**: Better throughput management
- **Improved Reliability**: Automatic retry mechanisms and error handling
- **Automated Processing**: Background batch job lifecycle management

### Enhanced Security & Performance

- **Rate Limiting**: In-memory rate limiting for all authentication endpoints
- **Input Validation**: Comprehensive Zod schemas for all user inputs
- **Performance Monitoring**: Built-in metrics collection and monitoring
- **Database Optimizations**: New indexes and query optimizations

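To give a flavor of the Zod validation mentioned above, a sketch of a registration schema; the password policy and field names are illustrative, not the project's exact rules.

```typescript
import { z } from "zod";

// Illustrative only: the project's real schemas may differ.
export const registrationSchema = z.object({
  email: z.string().email().max(254),
  password: z
    .string()
    .min(12, "at least 12 characters")
    .regex(/[A-Z]/, "needs an uppercase letter")
    .regex(/[0-9]/, "needs a digit"),
  companyName: z.string().trim().min(1).max(100),
});

// Parsing rejects malformed input before it reaches business logic:
// registrationSchema.parse(req.body) throws on invalid data.
```
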
## 📋 Pre-Migration Checklist

### System Requirements

- [ ] Node.js 18+ installed
- [ ] PostgreSQL 13+ database
- [ ] `pg_dump` and `pg_restore` utilities available
- [ ] Git repository with clean working directory
- [ ] OpenAI API key (for production)
- [ ] Sufficient disk space for backups (at least 2GB)

### Environment Preparation

- [ ] Review current environment variables
- [ ] Ensure database connection is working
- [ ] Verify all tests are passing
- [ ] Create a backup of your current deployment
- [ ] Notify team members of planned downtime

## 🔧 Migration Process

@@ -246,31 +246,31 @@ tail -f logs/migration.log

#### 2. tRPC Performance

- Monitor response times for tRPC endpoints
- Check error rates in application logs
- Verify type safety is working correctly

#### 3. Batch Processing

- Monitor batch job completion rates
- Check OpenAI API cost reduction
- Verify AI processing pipeline functionality

### Key Metrics to Monitor

#### Performance Metrics

- **Response Times**: tRPC endpoints should respond within 500ms
- **Database Queries**: Complex queries should complete within 1s
- **Memory Usage**: Should remain below 80% of allocated memory
- **CPU Usage**: Process should remain responsive

#### Business Metrics

- **AI Processing Cost**: Should see ~50% reduction in OpenAI costs
- **Processing Throughput**: Batch processing should handle larger volumes
- **Error Rates**: Should remain below 1% for critical operations
- **User Experience**: No degradation in dashboard performance

## 🛠 Troubleshooting

@@ -366,69 +366,69 @@ pnpm prisma db pull --print

### Immediate Tasks (First 24 Hours)

- [ ] Monitor application logs for errors
- [ ] Verify all tRPC endpoints are responding correctly
- [ ] Check batch processing job completion
- [ ] Validate AI cost reduction in OpenAI dashboard
- [ ] Run full test suite to ensure no regressions
- [ ] Update documentation and team knowledge

### Medium-term Tasks (First Week)

- [ ] Optimize batch processing parameters based on usage
- [ ] Fine-tune rate limiting settings
- [ ] Set up monitoring alerts for new components
- [ ] Train team on new tRPC APIs
- [ ] Plan gradual feature adoption

### Long-term Tasks (First Month)

- [ ] Analyze cost savings and performance improvements
- [ ] Consider additional tRPC endpoint implementations
- [ ] Optimize batch processing schedules
- [ ] Review and adjust security settings
- [ ] Plan next phase improvements

## 🔒 Security Considerations

### New Security Features

- **Enhanced Rate Limiting**: Applied to all authentication endpoints
- **Input Validation**: Comprehensive Zod schemas prevent injection attacks
- **Secure Headers**: HTTPS enforcement in production
- **Token Security**: JWT with proper expiration and rotation

### Security Checklist

- [ ] Verify rate limiting is working correctly
- [ ] Test input validation on all forms
- [ ] Ensure HTTPS is enforced in production
- [ ] Validate JWT token handling
- [ ] Check for proper error message sanitization
- [ ] Verify OpenAI API key is not exposed in logs

## 📈 Expected Improvements

### Performance Improvements

- **50% reduction** in OpenAI API costs through batch processing
- **30% improvement** in API response times with tRPC
- **25% reduction** in database query time with new indexes
- **Enhanced scalability** for processing larger session volumes

### Developer Experience

- **Type Safety**: End-to-end TypeScript types from client to server
- **Better APIs**: Self-documenting tRPC procedures
- **Improved Testing**: More reliable test suite with better validation
- **Enhanced Monitoring**: Detailed health checks and reporting

### Operational Benefits

- **Automated Batch Processing**: Reduced manual intervention
- **Better Error Handling**: Comprehensive retry mechanisms
- **Improved Monitoring**: Real-time health status and metrics
- **Simplified Deployment**: Automated migration and rollback procedures

---

README.md (140)

@@ -12,46 +12,46 @@ A comprehensive real-time analytics dashboard for monitoring user sessions with

### Core Analytics

- **Real-time Session Monitoring**: Track and analyze user sessions as they happen
- **Interactive Visualizations**: Geographic maps, response time distributions, and advanced charts
- **AI-Powered Analysis**: OpenAI integration with 50% cost reduction through batch processing
- **Advanced Analytics**: Detailed metrics and insights about user behavior patterns
- **Session Details**: In-depth analysis of individual user sessions with transcript parsing

### Security & Admin Features

- **Enterprise Security**: Multi-layer security with CSRF protection, CSP, and rate limiting
- **Security Monitoring**: Real-time threat detection and alerting system
- **Audit Logging**: Comprehensive security audit trails with retention management
- **Admin Dashboard**: Advanced administration tools for user and system management
- **Geographic Threat Detection**: IP-based threat analysis and anomaly detection

### Platform Management

- **Multi-tenant Architecture**: Company-based data isolation and management
- **User Management**: Role-based access control with platform admin capabilities
- **Batch Processing**: Optimized AI processing pipeline with automated scheduling
- **Data Export**: CSV/JSON export capabilities for analytics and audit data

## Tech Stack
|
## Tech Stack
|
||||||
|
|
||||||
- **Frontend**: React 19, Next.js 15, TailwindCSS 4
|
- **Frontend**: React 19, Next.js 15, TailwindCSS 4
|
||||||
- **Backend**: Next.js API Routes, tRPC, Custom Node.js server
|
- **Backend**: Next.js API Routes, tRPC, Custom Node.js server
|
||||||
- **Database**: PostgreSQL with Prisma ORM and connection pooling
|
- **Database**: PostgreSQL with Prisma ORM and connection pooling
|
||||||
- **Authentication**: NextAuth.js with enhanced security features
|
- **Authentication**: NextAuth.js with enhanced security features
|
||||||
- **Security**: CSRF protection, CSP with nonce-based scripts, comprehensive rate limiting
|
- **Security**: CSRF protection, CSP with nonce-based scripts, comprehensive rate limiting
|
||||||
- **AI Processing**: OpenAI API with batch processing for cost optimization
|
- **AI Processing**: OpenAI API with batch processing for cost optimization
|
||||||
- **Visualization**: D3.js, React Leaflet, Recharts, custom chart components
|
- **Visualization**: D3.js, React Leaflet, Recharts, custom chart components
|
||||||
- **Monitoring**: Real-time security monitoring, audit logging, threat detection
|
- **Monitoring**: Real-time security monitoring, audit logging, threat detection
|
||||||
- **Data Processing**: Node-cron schedulers for automated batch processing and AI analysis
|
- **Data Processing**: Node-cron schedulers for automated batch processing and AI analysis
|
||||||
|
|
||||||
## Getting Started
|
## Getting Started
|
||||||
|
|
||||||
### Prerequisites
|
### Prerequisites
|
||||||
|
|
||||||
- Node.js 18+ (LTS version recommended)
|
- Node.js 18+ (LTS version recommended)
|
||||||
- pnpm (recommended package manager)
|
- pnpm (recommended package manager)
|
||||||
- PostgreSQL 13+ database
|
- PostgreSQL 13+ database
|
||||||
|
|
||||||
### Installation
|
### Installation
|
||||||
|
|
||||||
@ -126,61 +126,61 @@ BATCH_RESULT_PROCESSING_INTERVAL="*/1 * * * *"
|
|||||||
|
|
||||||
## Project Structure
|
## Project Structure
|
||||||
|
|
||||||
- `app/`: Next.js App Router pages and API routes
|
- `app/`: Next.js App Router pages and API routes
|
||||||
- `api/`: API endpoints including admin, security, and tRPC routes
|
- `api/`: API endpoints including admin, security, and tRPC routes
|
||||||
- `dashboard/`: Main analytics dashboard pages
|
- `dashboard/`: Main analytics dashboard pages
|
||||||
- `platform/`: Platform administration interface
|
- `platform/`: Platform administration interface
|
||||||
- `components/`: Reusable React components
|
- `components/`: Reusable React components
|
||||||
- `admin/`: Administrative dashboard components
|
- `admin/`: Administrative dashboard components
|
||||||
- `security/`: Security monitoring UI components
|
- `security/`: Security monitoring UI components
|
||||||
- `forms/`: CSRF-protected forms and form utilities
|
- `forms/`: CSRF-protected forms and form utilities
|
||||||
- `providers/`: Context providers (CSRF, tRPC, themes)
|
- `providers/`: Context providers (CSRF, tRPC, themes)
|
||||||
- `lib/`: Core utilities and business logic
|
- `lib/`: Core utilities and business logic
|
||||||
- Security modules (CSRF, CSP, rate limiting, audit logging)
|
- Security modules (CSRF, CSP, rate limiting, audit logging)
|
||||||
- Processing pipelines (batch processing, AI analysis)
|
- Processing pipelines (batch processing, AI analysis)
|
||||||
- Database utilities and authentication
|
- Database utilities and authentication
|
||||||
- `server/`: tRPC server configuration and routers
|
- `server/`: tRPC server configuration and routers
|
||||||
- `prisma/`: Database schema, migrations, and seed scripts
|
- `prisma/`: Database schema, migrations, and seed scripts
|
||||||
- `tests/`: Comprehensive test suite (unit, integration, E2E)
|
- `tests/`: Comprehensive test suite (unit, integration, E2E)
|
||||||
- `docs/`: Detailed project documentation
|
- `docs/`: Detailed project documentation
|
||||||
- `scripts/`: Migration and utility scripts
|
- `scripts/`: Migration and utility scripts
|
||||||
|
|
||||||
## Available Scripts
|
## Available Scripts
|
||||||
|
|
||||||
### Development
|
### Development
|
||||||
|
|
||||||
- `pnpm dev`: Start development server with all features
|
- `pnpm dev`: Start development server with all features
|
||||||
- `pnpm dev:next-only`: Start Next.js only (no background schedulers)
|
- `pnpm dev:next-only`: Start Next.js only (no background schedulers)
|
||||||
- `pnpm build`: Build the application for production
|
- `pnpm build`: Build the application for production
|
||||||
- `pnpm start`: Run the production build
|
- `pnpm start`: Run the production build
|
||||||
|
|
||||||
### Code Quality
|
### Code Quality
|
||||||
|
|
||||||
- `pnpm lint`: Run ESLint
|
- `pnpm lint`: Run ESLint
|
||||||
- `pnpm lint:fix`: Fix ESLint issues automatically
|
- `pnpm lint:fix`: Fix ESLint issues automatically
|
||||||
- `pnpm format`: Format code with Prettier
|
- `pnpm format`: Format code with Prettier
|
||||||
- `pnpm format:check`: Check code formatting
|
- `pnpm format:check`: Check code formatting
|
||||||
|
|
||||||
### Database
|
### Database
|
||||||
|
|
||||||
- `pnpm prisma:studio`: Open Prisma Studio to view database
|
- `pnpm prisma:studio`: Open Prisma Studio to view database
|
||||||
- `pnpm prisma:migrate`: Run database migrations
|
- `pnpm prisma:migrate`: Run database migrations
|
||||||
- `pnpm prisma:generate`: Generate Prisma client
|
- `pnpm prisma:generate`: Generate Prisma client
|
||||||
- `pnpm prisma:seed`: Seed database with test data
|
- `pnpm prisma:seed`: Seed database with test data
|
||||||
|
|
||||||
### Testing
|
### Testing
|
||||||
|
|
||||||
- `pnpm test`: Run all tests (Vitest + Playwright)
|
- `pnpm test`: Run all tests (Vitest + Playwright)
|
||||||
- `pnpm test:vitest`: Run unit and integration tests
|
- `pnpm test:vitest`: Run unit and integration tests
|
||||||
- `pnpm test:coverage`: Run tests with coverage reports
|
- `pnpm test:coverage`: Run tests with coverage reports
|
||||||
- `pnpm test:security`: Run security-specific tests
|
- `pnpm test:security`: Run security-specific tests
|
||||||
- `pnpm test:csp`: Test CSP implementation
|
- `pnpm test:csp`: Test CSP implementation
|
||||||
|
|
||||||
### Security & Migration
|
### Security & Migration
|
||||||
|
|
||||||
- `pnpm migration:backup`: Create database backup
|
- `pnpm migration:backup`: Create database backup
|
||||||
- `pnpm migration:health-check`: Run system health checks
|
- `pnpm migration:health-check`: Run system health checks
|
||||||
- `pnpm test:security-headers`: Test HTTP security headers
|
- `pnpm test:security-headers`: Test HTTP security headers
|
||||||
|
|
||||||
## Contributing
|
## Contributing
|
||||||
|
|
||||||
@ -196,9 +196,9 @@ This project is not licensed for commercial use without explicit permission. Fre
|
|||||||
|
|
||||||
## Acknowledgments
|
## Acknowledgments
|
||||||
|
|
||||||
- [Next.js](https://nextjs.org/)
|
- [Next.js](https://nextjs.org/)
|
||||||
- [Prisma](https://prisma.io/)
|
- [Prisma](https://prisma.io/)
|
||||||
- [TailwindCSS](https://tailwindcss.com/)
|
- [TailwindCSS](https://tailwindcss.com/)
|
||||||
- [Chart.js](https://www.chartjs.org/)
|
- [Chart.js](https://www.chartjs.org/)
|
||||||
- [D3.js](https://d3js.org/)
|
- [D3.js](https://d3js.org/)
|
||||||
- [React Leaflet](https://react-leaflet.js.org/)
|
- [React Leaflet](https://react-leaflet.js.org/)
|
||||||
|
|||||||
@@ -37,7 +37,7 @@ function usePlatformSession() {
   const abortController = new AbortController();

   const handleAuthSuccess = (sessionData: {
     user?: {
       id?: string;
       email?: string;
       name?: string;
@@ -50,14 +50,14 @@ function usePlatformSession() {
     if (sessionData?.user?.isPlatformUser) {
       setSession({
         user: {
-          id: sessionData.user.id || '',
-          email: sessionData.user.email || '',
+          id: sessionData.user.id || "",
+          email: sessionData.user.email || "",
           name: sessionData.user.name,
-          role: sessionData.user.role || '',
+          role: sessionData.user.role || "",
           companyId: sessionData.user.companyId,
           isPlatformUser: sessionData.user.isPlatformUser,
           platformRole: sessionData.user.platformRole,
-        }
+        },
       });
       setStatus("authenticated");
     } else {
@@ -104,11 +104,7 @@ function SystemHealthCard({
   schedulerStatus,
 }: {
   health: { status: string; message: string };
-  schedulerStatus: {
-    csvImport?: boolean;
-    processing?: boolean;
-    batch?: boolean;
-  };
+  schedulerStatus: SchedulerStatus;
 }) {
   return (
     <Card>
@@ -125,25 +121,33 @@ function SystemHealthCard({
       </div>
       <div className="space-y-2">
         <div className="flex justify-between text-sm">
-          <span>CSV Import Scheduler:</span>
+          <span>Batch Creation:</span>
           <Badge
-            variant={schedulerStatus?.csvImport ? "default" : "secondary"}
+            variant={
+              schedulerStatus?.createBatchesRunning ? "default" : "secondary"
+            }
           >
-            {schedulerStatus?.csvImport ? "Running" : "Stopped"}
+            {schedulerStatus?.createBatchesRunning ? "Running" : "Stopped"}
           </Badge>
         </div>
         <div className="flex justify-between text-sm">
-          <span>Processing Scheduler:</span>
+          <span>Status Check:</span>
           <Badge
-            variant={schedulerStatus?.processing ? "default" : "secondary"}
+            variant={
+              schedulerStatus?.checkStatusRunning ? "default" : "secondary"
+            }
           >
-            {schedulerStatus?.processing ? "Running" : "Stopped"}
+            {schedulerStatus?.checkStatusRunning ? "Running" : "Stopped"}
           </Badge>
         </div>
         <div className="flex justify-between text-sm">
-          <span>Batch Scheduler:</span>
-          <Badge variant={schedulerStatus?.batch ? "default" : "secondary"}>
-            {schedulerStatus?.batch ? "Running" : "Stopped"}
+          <span>Result Processing:</span>
+          <Badge
+            variant={
+              schedulerStatus?.processResultsRunning ? "default" : "secondary"
+            }
+          >
+            {schedulerStatus?.processResultsRunning ? "Running" : "Stopped"}
           </Badge>
         </div>
       </div>
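The new `SchedulerStatus` type is referenced here but not defined in this commit. A minimal sketch consistent with the fields the JSX reads, offered as an assumption rather than the repo's actual definition:

```typescript
// Assumed shape: only these three flags are demonstrably read by the
// SystemHealthCard JSX above; the real type may carry more fields.
interface SchedulerStatus {
  createBatchesRunning: boolean; // batch creation cron task is active
  checkStatusRunning: boolean; // batch status polling task is active
  processResultsRunning: boolean; // result processing task is active
}
```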
@@ -155,7 +159,7 @@ function SystemHealthCard({
 function CircuitBreakerCard({
   circuitBreakerStatus,
 }: {
-  circuitBreakerStatus: Record<string, string> | null;
+  circuitBreakerStatus: Record<string, CircuitBreakerStatus> | null;
 }) {
   return (
     <Card>
@@ -172,10 +176,8 @@ function CircuitBreakerCard({
       {Object.entries(circuitBreakerStatus).map(([key, status]) => (
         <div key={key} className="flex justify-between text-sm">
           <span>{key}:</span>
-          <Badge
-            variant={status === "CLOSED" ? "default" : "destructive"}
-          >
-            {status as string}
-          </Badge>
+          <Badge variant={!status.isOpen ? "default" : "destructive"}>
+            {status.isOpen ? "OPEN" : "CLOSED"}
+          </Badge>
         </div>
       ))}
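As with `SchedulerStatus`, the `CircuitBreakerStatus` type is referenced but not defined in this diff; the JSX only proves an `isOpen` boolean. A minimal assumed shape:

```typescript
// Assumption: only isOpen is demonstrably required by the component above.
interface CircuitBreakerStatus {
  isOpen: boolean; // true while the breaker is rejecting calls
}
```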
@@ -412,13 +414,8 @@ export default function BatchMonitoringDashboard() {

   return (
     <div className="grid grid-cols-1 md:grid-cols-2 gap-4 mb-6">
-      <SystemHealthCard
-        health={health}
-        schedulerStatus={schedulerStatus as any}
-      />
-      <CircuitBreakerCard
-        circuitBreakerStatus={circuitBreakerStatus as any}
-      />
+      <SystemHealthCard health={health} schedulerStatus={schedulerStatus} />
+      <CircuitBreakerCard circuitBreakerStatus={circuitBreakerStatus} />
     </div>
   );
 };
@@ -280,6 +280,84 @@ function addCORSHeaders(
   }
 }

+/**
+ * Process authentication and authorization
+ */
+async function processAuthAndAuthz(
+  context: APIContext,
+  options: APIHandlerOptions
+): Promise<void> {
+  if (options.requireAuth) {
+    await validateAuthentication(context);
+  }
+
+  if (options.requiredRole || options.requirePlatformAccess) {
+    await validateAuthorization(context, options);
+  }
+}
+
+/**
+ * Process input validation
+ */
+async function processValidation(
+  request: NextRequest,
+  options: APIHandlerOptions
+): Promise<{ validatedData: unknown; validatedQuery: unknown }> {
+  let validatedData: unknown;
+  if (options.validateInput && request.method !== "GET") {
+    validatedData = await validateInput(request, options.validateInput);
+  }
+
+  let validatedQuery: unknown;
+  if (options.validateQuery) {
+    validatedQuery = validateQuery(request, options.validateQuery);
+  }
+
+  return { validatedData, validatedQuery };
+}
+
+/**
+ * Create and configure response
+ */
+function createAPIResponse<T>(
+  result: T,
+  context: APIContext,
+  options: APIHandlerOptions
+): NextResponse {
+  const response = NextResponse.json(
+    createSuccessResponse(result, { requestId: context.requestId })
+  );
+
+  response.headers.set("X-Request-ID", context.requestId);
+
+  if (options.cacheControl) {
+    response.headers.set("Cache-Control", options.cacheControl);
+  }
+
+  addCORSHeaders(response, options);
+  return response;
+}
+
+/**
+ * Handle request execution with audit logging
+ */
+async function executeWithAudit<T>(
+  handler: APIHandler<T>,
+  context: APIContext,
+  validatedData: unknown,
+  validatedQuery: unknown,
+  request: NextRequest,
+  options: APIHandlerOptions
+): Promise<T> {
+  const result = await handler(context, validatedData, validatedQuery);
+
+  if (options.auditLog) {
+    await logAPIAccess(context, "success", request.url);
+  }
+
+  return result;
+}
+
 /**
  * Main API handler factory
  */
@@ -291,64 +369,32 @@ export function createAPIHandler<T = unknown>(
   let context: APIContext | undefined;

   try {
-    // 1. Create request context
     context = await createAPIContext(request);

-    // 2. Apply rate limiting
     if (options.rateLimit) {
       await applyRateLimit(context, options.rateLimit);
     }

-    // 3. Validate authentication
-    if (options.requireAuth) {
-      await validateAuthentication(context);
-    }
-
-    // 4. Validate authorization
-    if (options.requiredRole || options.requirePlatformAccess) {
-      await validateAuthorization(context, options);
-    }
-
-    // 5. Validate input
-    let validatedData;
-    if (options.validateInput && request.method !== "GET") {
-      validatedData = await validateInput(request, options.validateInput);
-    }
-
-    // 6. Validate query parameters
-    let validatedQuery;
-    if (options.validateQuery) {
-      validatedQuery = validateQuery(request, options.validateQuery);
-    }
-
-    // 7. Execute handler
-    const result = await handler(context, validatedData, validatedQuery);
-
-    // 8. Audit logging
-    if (options.auditLog) {
-      await logAPIAccess(context, "success", request.url);
-    }
-
-    // 9. Create response
-    const response = NextResponse.json(
-      createSuccessResponse(result, { requestId: context.requestId })
-    );
-
-    // 10. Add headers
-    response.headers.set("X-Request-ID", context.requestId);
-
-    if (options.cacheControl) {
-      response.headers.set("Cache-Control", options.cacheControl);
-    }
-
-    addCORSHeaders(response, options);
-
-    return response;
+    await processAuthAndAuthz(context, options);
+
+    const { validatedData, validatedQuery } = await processValidation(
+      request,
+      options
+    );
+
+    const result = await executeWithAudit(
+      handler,
+      context,
+      validatedData,
+      validatedQuery,
+      request,
+      options
+    );
+
+    return createAPIResponse(result, context, options);
   } catch (error) {
-    // Handle errors consistently
     const requestId = context?.requestId || crypto.randomUUID();

-    // Log failed requests
     if (options.auditLog && context) {
       await logAPIAccess(context, "error", request.url, error as Error);
     }
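To make the refactored pipeline concrete, here is a hypothetical route built with `createAPIHandler`. The option names and the `(context, validatedData, validatedQuery)` handler signature come from the diff above; the `(handler, options)` argument order and the returned route shape are assumptions:

```typescript
// Hypothetical usage sketch; see caveats in the paragraph above.
export const GET = createAPIHandler(
  async (context, _validatedData, validatedQuery) => {
    // Runs after rate limiting, processAuthAndAuthz, and processValidation.
    return { requestId: context.requestId, query: validatedQuery };
  },
  {
    requireAuth: true, // triggers validateAuthentication
    auditLog: true, // success/error paths call logAPIAccess
    cacheControl: "no-store", // copied onto the response by createAPIResponse
  }
);
```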
@@ -137,7 +137,7 @@ export function stopOptimizedBatchScheduler(): void {
     { task: retryFailedTask, name: "retryFailedTask" },
   ];

-  for (const { task, name } of tasks) {
+  for (const { task } of tasks) {
     if (task) {
       task.stop();
       task.destroy();
@@ -169,6 +169,10 @@ const ConfigSchema = z.object({

 export type AppConfig = z.infer<typeof ConfigSchema>;

+type DeepPartial<T> = {
+  [P in keyof T]?: T[P] extends object ? DeepPartial<T[P]> : T[P];
+};
+
 /**
  * Configuration provider class
  */
@@ -230,8 +234,8 @@ class ConfigProvider {
   /**
    * Get environment-specific configuration
    */
-  forEnvironment(env: Environment): Partial<AppConfig> {
-    const overrides: Record<Environment, any> = {
+  forEnvironment(env: Environment): DeepPartial<AppConfig> {
+    const overrides: Record<Environment, DeepPartial<AppConfig>> = {
       development: {
         app: {
           logLevel: "debug",
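`DeepPartial` recursively marks every nested property optional, which is what lets each environment override only the keys it cares about while staying type-checked against `AppConfig`. A small self-contained illustration, using a hypothetical config shape rather than the real schema:

```typescript
// Hypothetical config shape, for illustration only.
type Example = {
  app: { logLevel: string; port: number };
  database: { url: string };
};

// Valid: every level is optional, so a sparse override type-checks.
const devOverride: DeepPartial<Example> = {
  app: { logLevel: "debug" }, // port and database omitted entirely
};
```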
@@ -290,6 +294,169 @@ class ConfigProvider {
     return overrides[env] || {};
   }

+  /**
+   * Extract app configuration from environment
+   */
+  private extractAppConfig(env: NodeJS.ProcessEnv, environment: Environment) {
+    return {
+      name: env.APP_NAME || "LiveDash",
+      version: env.APP_VERSION || "1.0.0",
+      environment,
+      baseUrl: env.NEXTAUTH_URL || "http://localhost:3000",
+      port: Number.parseInt(env.PORT || "3000", 10),
+      logLevel:
+        (env.LOG_LEVEL as "debug" | "info" | "warn" | "error") || "info",
+      features: {
+        enableMetrics: env.ENABLE_METRICS !== "false",
+        enableAnalytics: env.ENABLE_ANALYTICS !== "false",
+        enableCaching: env.ENABLE_CACHING !== "false",
+        enableCompression: env.ENABLE_COMPRESSION !== "false",
+      },
+    };
+  }
+
+  /**
+   * Extract database configuration from environment
+   */
+  private extractDatabaseConfig(env: NodeJS.ProcessEnv) {
+    return {
+      url: env.DATABASE_URL || "",
+      directUrl: env.DATABASE_URL_DIRECT,
+      maxConnections: Number.parseInt(env.DB_MAX_CONNECTIONS || "10", 10),
+      connectionTimeout: Number.parseInt(
+        env.DB_CONNECTION_TIMEOUT || "30000",
+        10
+      ),
+      queryTimeout: Number.parseInt(env.DB_QUERY_TIMEOUT || "60000", 10),
+      retryAttempts: Number.parseInt(env.DB_RETRY_ATTEMPTS || "3", 10),
+      retryDelay: Number.parseInt(env.DB_RETRY_DELAY || "1000", 10),
+    };
+  }
+
+  /**
+   * Extract auth configuration from environment
+   */
+  private extractAuthConfig(env: NodeJS.ProcessEnv) {
+    return {
+      secret: env.NEXTAUTH_SECRET || "",
+      url: env.NEXTAUTH_URL || "http://localhost:3000",
+      sessionMaxAge: Number.parseInt(env.AUTH_SESSION_MAX_AGE || "86400", 10),
+      providers: {
+        credentials: env.AUTH_CREDENTIALS_ENABLED !== "false",
+        github: env.AUTH_GITHUB_ENABLED === "true",
+        google: env.AUTH_GOOGLE_ENABLED === "true",
+      },
+    };
+  }
+
+  /**
+   * Extract security configuration from environment
+   */
+  private extractSecurityConfig(env: NodeJS.ProcessEnv) {
+    return {
+      csp: {
+        enabled: env.CSP_ENABLED !== "false",
+        reportUri: env.CSP_REPORT_URI,
+        reportOnly: env.CSP_REPORT_ONLY === "true",
+      },
+      csrf: {
+        enabled: env.CSRF_ENABLED !== "false",
+        tokenExpiry: Number.parseInt(env.CSRF_TOKEN_EXPIRY || "3600", 10),
+      },
+      rateLimit: {
+        enabled: env.RATE_LIMIT_ENABLED !== "false",
+        windowMs: Number.parseInt(env.RATE_LIMIT_WINDOW_MS || "900000", 10),
+        maxRequests: Number.parseInt(env.RATE_LIMIT_MAX_REQUESTS || "100", 10),
+      },
+      audit: {
+        enabled: env.AUDIT_ENABLED !== "false",
+        retentionDays: Number.parseInt(env.AUDIT_RETENTION_DAYS || "90", 10),
+        bufferSize: Number.parseInt(env.AUDIT_BUFFER_SIZE || "1000", 10),
+      },
+    };
+  }
+
+  /**
+   * Extract OpenAI configuration from environment
+   */
+  private extractOpenAIConfig(env: NodeJS.ProcessEnv) {
+    return {
+      apiKey: env.OPENAI_API_KEY || "",
+      organization: env.OPENAI_ORGANIZATION,
+      mockMode: env.OPENAI_MOCK_MODE === "true",
+      defaultModel: env.OPENAI_DEFAULT_MODEL || "gpt-3.5-turbo",
+      maxTokens: Number.parseInt(env.OPENAI_MAX_TOKENS || "1000", 10),
+      temperature: Number.parseFloat(env.OPENAI_TEMPERATURE || "0.1"),
+      batchConfig: {
+        enabled: env.OPENAI_BATCH_ENABLED !== "false",
+        maxRequestsPerBatch: Number.parseInt(
+          env.OPENAI_BATCH_MAX_REQUESTS || "1000",
+          10
+        ),
+        statusCheckInterval: Number.parseInt(
+          env.OPENAI_BATCH_STATUS_INTERVAL || "60000",
+          10
+        ),
+        maxTimeout: Number.parseInt(
+          env.OPENAI_BATCH_MAX_TIMEOUT || "86400000",
+          10
+        ),
+      },
+    };
+  }
+
+  /**
+   * Extract scheduler configuration from environment
+   */
+  private extractSchedulerConfig(env: NodeJS.ProcessEnv) {
+    return {
+      enabled: env.SCHEDULER_ENABLED !== "false",
+      csvImport: {
+        enabled: env.CSV_IMPORT_SCHEDULER_ENABLED !== "false",
+        interval: env.CSV_IMPORT_INTERVAL || "*/5 * * * *",
+      },
+      importProcessor: {
+        enabled: env.IMPORT_PROCESSOR_ENABLED !== "false",
+        interval: env.IMPORT_PROCESSOR_INTERVAL || "*/2 * * * *",
+      },
+      sessionProcessor: {
+        enabled: env.SESSION_PROCESSOR_ENABLED !== "false",
+        interval: env.SESSION_PROCESSOR_INTERVAL || "*/3 * * * *",
+        batchSize: Number.parseInt(
+          env.SESSION_PROCESSOR_BATCH_SIZE || "50",
+          10
+        ),
+      },
+      batchProcessor: {
+        enabled: env.BATCH_PROCESSOR_ENABLED !== "false",
+        createInterval: env.BATCH_CREATE_INTERVAL || "*/5 * * * *",
+        statusInterval: env.BATCH_STATUS_INTERVAL || "*/2 * * * *",
+        resultInterval: env.BATCH_RESULT_INTERVAL || "*/1 * * * *",
+      },
+    };
+  }
+
+  /**
+   * Extract email configuration from environment
+   */
+  private extractEmailConfig(env: NodeJS.ProcessEnv) {
+    return {
+      enabled: env.EMAIL_ENABLED === "true",
+      smtp: {
+        host: env.SMTP_HOST,
+        port: Number.parseInt(env.SMTP_PORT || "587", 10),
+        secure: env.SMTP_SECURE === "true",
+        user: env.SMTP_USER,
+        password: env.SMTP_PASSWORD,
+      },
+      from: env.EMAIL_FROM || "noreply@livedash.com",
+      templates: {
+        passwordReset: env.EMAIL_TEMPLATE_PASSWORD_RESET || "password-reset",
+        userInvitation: env.EMAIL_TEMPLATE_USER_INVITATION || "user-invitation",
+      },
+    };
+  }
+
   /**
    * Extract configuration from environment variables
    */
@@ -298,130 +465,13 @@ class ConfigProvider {
     const environment = (env.NODE_ENV as Environment) || "development";

     return {
-      app: {
-        name: env.APP_NAME || "LiveDash",
-        version: env.APP_VERSION || "1.0.0",
-        environment,
-        baseUrl: env.NEXTAUTH_URL || "http://localhost:3000",
-        port: Number.parseInt(env.PORT || "3000", 10),
-        logLevel: (env.LOG_LEVEL as any) || "info",
-        features: {
-          enableMetrics: env.ENABLE_METRICS !== "false",
-          enableAnalytics: env.ENABLE_ANALYTICS !== "false",
-          enableCaching: env.ENABLE_CACHING !== "false",
-          enableCompression: env.ENABLE_COMPRESSION !== "false",
-        },
-      },
-      database: {
-        url: env.DATABASE_URL || "",
-        directUrl: env.DATABASE_URL_DIRECT,
-        maxConnections: Number.parseInt(env.DB_MAX_CONNECTIONS || "10", 10),
-        connectionTimeout: Number.parseInt(
-          env.DB_CONNECTION_TIMEOUT || "30000",
-          10
-        ),
-        queryTimeout: Number.parseInt(env.DB_QUERY_TIMEOUT || "60000", 10),
-        retryAttempts: Number.parseInt(env.DB_RETRY_ATTEMPTS || "3", 10),
-        retryDelay: Number.parseInt(env.DB_RETRY_DELAY || "1000", 10),
-      },
-      auth: {
-        secret: env.NEXTAUTH_SECRET || "",
-        url: env.NEXTAUTH_URL || "http://localhost:3000",
-        sessionMaxAge: Number.parseInt(env.AUTH_SESSION_MAX_AGE || "86400", 10),
-        providers: {
-          credentials: env.AUTH_CREDENTIALS_ENABLED !== "false",
-          github: env.AUTH_GITHUB_ENABLED === "true",
-          google: env.AUTH_GOOGLE_ENABLED === "true",
-        },
-      },
-      security: {
-        csp: {
-          enabled: env.CSP_ENABLED !== "false",
-          reportUri: env.CSP_REPORT_URI,
-          reportOnly: env.CSP_REPORT_ONLY === "true",
-        },
-        csrf: {
-          enabled: env.CSRF_ENABLED !== "false",
-          tokenExpiry: Number.parseInt(env.CSRF_TOKEN_EXPIRY || "3600", 10),
-        },
-        rateLimit: {
-          enabled: env.RATE_LIMIT_ENABLED !== "false",
-          windowMs: Number.parseInt(env.RATE_LIMIT_WINDOW_MS || "900000", 10),
-          maxRequests: Number.parseInt(
-            env.RATE_LIMIT_MAX_REQUESTS || "100",
-            10
-          ),
-        },
-        audit: {
-          enabled: env.AUDIT_ENABLED !== "false",
-          retentionDays: Number.parseInt(env.AUDIT_RETENTION_DAYS || "90", 10),
-          bufferSize: Number.parseInt(env.AUDIT_BUFFER_SIZE || "1000", 10),
-        },
-      },
-      openai: {
-        apiKey: env.OPENAI_API_KEY || "",
-        organization: env.OPENAI_ORGANIZATION,
-        mockMode: env.OPENAI_MOCK_MODE === "true",
-        defaultModel: env.OPENAI_DEFAULT_MODEL || "gpt-3.5-turbo",
-        maxTokens: Number.parseInt(env.OPENAI_MAX_TOKENS || "1000", 10),
-        temperature: Number.parseFloat(env.OPENAI_TEMPERATURE || "0.1"),
-        batchConfig: {
-          enabled: env.OPENAI_BATCH_ENABLED !== "false",
-          maxRequestsPerBatch: Number.parseInt(
-            env.OPENAI_BATCH_MAX_REQUESTS || "1000",
-            10
-          ),
-          statusCheckInterval: Number.parseInt(
-            env.OPENAI_BATCH_STATUS_INTERVAL || "60000",
-            10
-          ),
-          maxTimeout: Number.parseInt(
-            env.OPENAI_BATCH_MAX_TIMEOUT || "86400000",
-            10
-          ),
-        },
-      },
-      scheduler: {
-        enabled: env.SCHEDULER_ENABLED !== "false",
-        csvImport: {
-          enabled: env.CSV_IMPORT_SCHEDULER_ENABLED !== "false",
-          interval: env.CSV_IMPORT_INTERVAL || "*/5 * * * *",
-        },
-        importProcessor: {
-          enabled: env.IMPORT_PROCESSOR_ENABLED !== "false",
-          interval: env.IMPORT_PROCESSOR_INTERVAL || "*/2 * * * *",
-        },
-        sessionProcessor: {
-          enabled: env.SESSION_PROCESSOR_ENABLED !== "false",
-          interval: env.SESSION_PROCESSOR_INTERVAL || "*/3 * * * *",
-          batchSize: Number.parseInt(
-            env.SESSION_PROCESSOR_BATCH_SIZE || "50",
-            10
-          ),
-        },
-        batchProcessor: {
-          enabled: env.BATCH_PROCESSOR_ENABLED !== "false",
-          createInterval: env.BATCH_CREATE_INTERVAL || "*/5 * * * *",
-          statusInterval: env.BATCH_STATUS_INTERVAL || "*/2 * * * *",
-          resultInterval: env.BATCH_RESULT_INTERVAL || "*/1 * * * *",
-        },
-      },
-      email: {
-        enabled: env.EMAIL_ENABLED === "true",
-        smtp: {
-          host: env.SMTP_HOST,
-          port: Number.parseInt(env.SMTP_PORT || "587", 10),
-          secure: env.SMTP_SECURE === "true",
-          user: env.SMTP_USER,
-          password: env.SMTP_PASSWORD,
-        },
-        from: env.EMAIL_FROM || "noreply@livedash.com",
-        templates: {
-          passwordReset: env.EMAIL_TEMPLATE_PASSWORD_RESET || "password-reset",
-          userInvitation:
-            env.EMAIL_TEMPLATE_USER_INVITATION || "user-invitation",
-        },
-      },
-    };
+      app: this.extractAppConfig(env, environment),
+      database: this.extractDatabaseConfig(env),
+      auth: this.extractAuthConfig(env),
+      security: this.extractSecurityConfig(env),
+      openai: this.extractOpenAIConfig(env),
+      scheduler: this.extractSchedulerConfig(env),
+      email: this.extractEmailConfig(env),
+    };
   }
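Two boolean conventions recur throughout these extractors and are worth reading carefully: `!== "false"` yields on-by-default (opt-out) flags, while `=== "true"` yields off-by-default (opt-in) flags. A standalone illustration:

```typescript
// Opt-out: true unless the variable is the literal string "false".
const metricsEnabled = process.env.ENABLE_METRICS !== "false"; // unset -> true

// Opt-in: false unless the variable is the literal string "true".
const emailEnabled = process.env.EMAIL_ENABLED === "true"; // unset -> false
```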
@@ -191,7 +191,7 @@ export const DynamicAuditLogsPanel = createDynamicComponent(

 // React wrapper for React.lazy with Suspense
 export function createLazyComponent<
-  T extends Record<string, any> = Record<string, any>,
+  T extends Record<string, unknown> = Record<string, unknown>,
 >(
   importFunc: () => Promise<{ default: ComponentType<T> }>,
   fallback: ComponentType = LoadingSpinner
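A hypothetical call site for `createLazyComponent`; the import path and the exact return value are assumptions, but the signature above implies usage along these lines:

```typescript
// Assumed: ReportsPanel has a default export that is a React component.
const LazyReportsPanel = createLazyComponent(
  () => import("./ReportsPanel"), // must resolve to { default: ComponentType }
  LoadingSpinner // fallback rendered while the chunk loads
);
```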
@@ -15,6 +15,26 @@ import {
   type MockResponseType,
 } from "./openai-responses";

+interface ChatCompletionParams {
+  model: string;
+  messages: Array<{ role: string; content: string }>;
+  temperature?: number;
+  max_tokens?: number;
+  [key: string]: unknown;
+}
+
+interface BatchCreateParams {
+  input_file_id: string;
+  endpoint: string;
+  completion_window: string;
+  metadata?: Record<string, unknown>;
+}
+
+interface FileCreateParams {
+  file: string; // File content as string for mock purposes
+  purpose: string;
+}
+
 interface MockOpenAIConfig {
   enabled: boolean;
   baseDelay: number; // Base delay in ms to simulate API latency

@@ -115,12 +135,9 @@ class OpenAIMockServer {
   /**
    * Mock chat completions endpoint
    */
-  async mockChatCompletion(request: {
-    model: string;
-    messages: Array<{ role: string; content: string }>;
-    temperature?: number;
-    max_tokens?: number;
-  }): Promise<MockChatCompletion> {
+  async mockChatCompletion(
+    request: ChatCompletionParams
+  ): Promise<MockChatCompletion> {
     this.requestCount++;

     await this.simulateDelay();

@@ -172,12 +189,9 @@ class OpenAIMockServer {
   /**
    * Mock batch creation endpoint
    */
-  async mockCreateBatch(request: {
-    input_file_id: string;
-    endpoint: string;
-    completion_window: string;
-    metadata?: Record<string, string>;
-  }): Promise<MockBatchResponse> {
+  async mockCreateBatch(
+    request: BatchCreateParams
+  ): Promise<MockBatchResponse> {
     await this.simulateDelay();

     if (this.shouldSimulateError()) {

@@ -214,10 +228,7 @@ class OpenAIMockServer {
   /**
    * Mock file upload endpoint
    */
-  async mockUploadFile(request: {
-    file: string; // File content
-    purpose: string;
-  }): Promise<{
+  async mockUploadFile(request: FileCreateParams): Promise<{
     id: string;
     object: string;
     purpose: string;

@@ -364,23 +375,42 @@ export const openAIMock = new OpenAIMockServer();
 /**
  * Drop-in replacement for OpenAI client that uses mocks when enabled
  */
-export class MockOpenAIClient {
-  private realClient: unknown;
-
-  constructor(realClient: unknown) {
+interface OpenAIClient {
+  chat: {
+    completions: {
+      create: (params: ChatCompletionParams) => Promise<MockChatCompletion>;
+    };
+  };
+  batches: {
+    create: (params: BatchCreateParams) => Promise<MockBatchResponse>;
+    retrieve: (batchId: string) => Promise<MockBatchResponse>;
+  };
+  files: {
+    create: (params: FileCreateParams) => Promise<{
+      id: string;
+      object: string;
+      purpose: string;
+      filename: string;
+    }>;
+    content: (fileId: string) => Promise<string>;
+  };
+}
+
+export class MockOpenAIClient {
+  private realClient: OpenAIClient;
+
+  constructor(realClient: OpenAIClient) {
     this.realClient = realClient;
   }

   get chat() {
     return {
       completions: {
-        create: async (params: any) => {
+        create: async (params: ChatCompletionParams) => {
           if (openAIMock.isEnabled()) {
-            return openAIMock.mockChatCompletion(params as any);
+            return openAIMock.mockChatCompletion(params);
           }
-          return (this.realClient as any).chat.completions.create(
-            params as any
-          );
+          return this.realClient.chat.completions.create(params);
         },
       },
     };

@@ -388,34 +418,34 @@ export class MockOpenAIClient {

   get batches() {
     return {
-      create: async (params: any) => {
+      create: async (params: BatchCreateParams) => {
         if (openAIMock.isEnabled()) {
-          return openAIMock.mockCreateBatch(params as any);
+          return openAIMock.mockCreateBatch(params);
         }
-        return (this.realClient as any).batches.create(params as any);
+        return this.realClient.batches.create(params);
       },
       retrieve: async (batchId: string) => {
         if (openAIMock.isEnabled()) {
           return openAIMock.mockGetBatch(batchId);
         }
-        return (this.realClient as any).batches.retrieve(batchId);
+        return this.realClient.batches.retrieve(batchId);
       },
     };
   }

   get files() {
     return {
-      create: async (params: any) => {
+      create: async (params: FileCreateParams) => {
         if (openAIMock.isEnabled()) {
           return openAIMock.mockUploadFile(params);
         }
-        return (this.realClient as any).files.create(params);
+        return this.realClient.files.create(params);
       },
       content: async (fileId: string) => {
        if (openAIMock.isEnabled()) {
          return openAIMock.mockGetFileContent(fileId);
        }
-        return (this.realClient as any).files.content(fileId);
+        return this.realClient.files.content(fileId);
       },
     };
   }
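A hypothetical wiring of `MockOpenAIClient` around the real SDK; the `openai` import and the cast are illustrative, not code from this commit. The env flag follows the `OPENAI_MOCK_MODE` pattern visible in the config diff above:

```typescript
import OpenAI from "openai";

// The real SDK's types are richer than the narrow OpenAIClient interface,
// hence the (illustrative) double cast.
const client = new MockOpenAIClient(
  new OpenAI({ apiKey: process.env.OPENAI_API_KEY }) as unknown as OpenAIClient
);

// Routed to openAIMock when mocking is enabled (OPENAI_MOCK_MODE === "true"),
// otherwise forwarded to the real SDK with the same call shape.
const completion = await client.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "ping" }],
});
```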
@@ -181,7 +181,8 @@ class PerformanceMonitor {
     // Placeholder for analytics integration
     // You could send this to Google Analytics, Vercel Analytics, etc.
     if (typeof window !== "undefined" && "gtag" in window) {
-      (window as any).gtag("event", "core_web_vital", {
+      const gtag = (window as { gtag?: (...args: unknown[]) => void }).gtag;
+      gtag?.("event", "core_web_vital", {
         name: metricName,
         value: Math.round(value),
         metric_rating: this.getRating(metricName, value),
@@ -169,7 +169,7 @@ export class PerformanceCache<K extends {} = string, V = unknown> {
   /**
    * Memoize a function with caching
    */
-  memoize<Args extends any[], Return extends V>(
+  memoize<Args extends unknown[], Return extends V>(
     fn: (...args: Args) => Promise<Return> | Return,
     keyGenerator?: (...args: Args) => K,
     ttl?: number

@@ -421,7 +421,7 @@ export class CacheUtils {
   /**
    * Cache the result of an async function
    */
-  static cached<T extends any[], R>(
+  static cached<T extends unknown[], R>(
     cacheName: string,
     fn: (...args: T) => Promise<R>,
     options: CacheOptions & {
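A hedged sketch of calling the retyped `memoize`; the `PerformanceCache` constructor arguments and the ttl unit are assumptions not shown in this diff:

```typescript
// Assumed: a no-argument constructor is available.
const cache = new PerformanceCache<string, number>();

// Args is now inferred as [number] under the unknown[] bound (was any[]).
const squared = cache.memoize(
  async (n: number) => n * n, // fn to memoize
  (n) => `square:${n}`, // keyGenerator -> cache key
  60_000 // ttl, assumed to be milliseconds
);
```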
@@ -155,34 +155,36 @@ export class RequestDeduplicator {
   }> = [];

   // Create the main promise
-  const promise = new Promise<T>(async (resolve, reject) => {
+  const promise = new Promise<T>((resolve, reject) => {
     resolvers.push({ resolve, reject });

-    try {
-      // Execute the async function
-      const result = await fn();
-
-      // Cache the result
-      if (options.ttl && options.ttl > 0) {
-        this.results.set(key, {
-          value: result,
-          timestamp: Date.now(),
-          ttl: options.ttl,
-        });
-      }
-
-      // Resolve all waiting promises
-      resolvers.forEach(({ resolve: res }) => res(result));
-    } catch (error) {
-      this.stats.errors++;
-
-      // Reject all waiting promises
-      const errorToReject =
-        error instanceof Error ? error : new Error(String(error));
-      resolvers.forEach(({ reject: rej }) => rej(errorToReject));
-    } finally {
-      // Clean up pending request
-      this.pendingRequests.delete(key);
-    }
+    // Execute the async function
+    fn()
+      .then((result) => {
+        // Cache the result
+        if (options.ttl && options.ttl > 0) {
+          this.results.set(key, {
+            value: result,
+            timestamp: Date.now(),
+            ttl: options.ttl,
+          });
+        }
+
+        // Resolve all waiting promises
+        resolvers.forEach(({ resolve: res }) => res(result));
+      })
+      .catch((error) => {
+        this.stats.errors++;
+
+        // Reject all waiting promises
+        const errorToReject =
+          error instanceof Error ? error : new Error(String(error));
+        resolvers.forEach(({ reject: rej }) => rej(errorToReject));
+      })
+      .finally(() => {
+        // Clean up pending request
+        this.pendingRequests.delete(key);
+      });
   });

   // Set up timeout if specified
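The motivation for this rewrite is the async-executor anti-pattern (Biome flags it as `lint/suspicious/noAsyncPromiseExecutor`): a synchronous throw inside an `async` executor rejects the executor's own implicit promise, which nothing observes, rather than the promise under construction, so callers can hang forever. A minimal sketch of the hazard:

```typescript
// Anti-pattern: this Promise never settles if the executor throws, because
// the async wrapper converts the throw into an ignored rejected promise
// instead of letting the Promise constructor catch it.
const p = new Promise<void>(async (_resolve, _reject) => {
  throw new Error("lost"); // p stays pending forever
});
```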
@@ -167,7 +167,11 @@ function startMonitoringIfEnabled(enabled?: boolean): void {
 /**
  * Helper function to record request metrics if enabled
  */
-function recordRequestIfEnabled(timer: ReturnType<typeof PerformanceUtils.createTimer>, isError: boolean, enabled?: boolean): void {
+function recordRequestIfEnabled(
+  timer: ReturnType<typeof PerformanceUtils.createTimer>,
+  isError: boolean,
+  enabled?: boolean
+): void {
   if (enabled) {
     performanceMonitor.recordRequest(timer.end(), isError);
   }

@@ -185,7 +189,7 @@ async function executeRequestWithOptimizations(
   if (opts.cache?.enabled || opts.deduplication?.enabled) {
     return executeWithCacheOrDeduplication(req, opts, originalHandler);
   }

   // Direct execution with monitoring
   const { result } = await PerformanceUtils.measureAsync(routeName, () =>
     originalHandler(req)

@@ -216,18 +220,16 @@ async function executeWithCacheOrDeduplication(
       opts.cache.ttl
     );
   }

   // Deduplication only
   const deduplicator =
     deduplicators[
       opts.deduplication?.deduplicatorName as keyof typeof deduplicators
     ] || deduplicators.api;

-  return deduplicator.execute(
-    cacheKey,
-    () => originalHandler(req),
-    { ttl: opts.deduplication?.ttl }
-  );
+  return deduplicator.execute(cacheKey, () => originalHandler(req), {
+    ttl: opts.deduplication?.ttl,
+  });
 }

 /**
@@ -247,7 +249,12 @@ export function enhanceAPIRoute(

   try {
     startMonitoringIfEnabled(opts.monitoring?.enabled);
-    const response = await executeRequestWithOptimizations(req, opts, routeName, originalHandler);
+    const response = await executeRequestWithOptimizations(
+      req,
+      opts,
+      routeName,
+      originalHandler
+    );
     recordRequestIfEnabled(timer, false, opts.monitoring?.recordRequests);
     return response;
   } catch (error) {

@@ -263,8 +270,10 @@ export function enhanceAPIRoute(
 export function PerformanceEnhanced(
   options: PerformanceIntegrationOptions = {}
 ) {
-  return <T extends new (...args: any[]) => {}>(constructor: T) =>
-    class extends constructor {
+  // biome-ignore lint/suspicious/noExplicitAny: Required for mixin class pattern - TypeScript requires any[] for constructor parameters in mixins
+  return <T extends new (...args: any[]) => {}>(Constructor: T) =>
+    class extends Constructor {
+      // biome-ignore lint/suspicious/noExplicitAny: Required for mixin class pattern - TypeScript requires any[] for constructor parameters in mixins
       constructor(...args: any[]) {
         super(...args);

@@ -279,7 +288,7 @@ export function PerformanceEnhanced(
         if (typeof originalMethod === "function") {
           (this as Record<string, unknown>)[methodName] =
             enhanceServiceMethod(
-              `${constructor.name}.${methodName}`,
+              `${Constructor.name}.${methodName}`,
               originalMethod.bind(this),
               options
             );

@@ -777,9 +777,8 @@ export class PerformanceUtils {
   }

   descriptor.value = async function (...args: unknown[]) {
-    const { result } = await PerformanceUtils.measureAsync(
-      metricName,
-      () => originalMethod.apply(this, args)
-    );
+    const { result } = await PerformanceUtils.measureAsync(metricName, () =>
+      originalMethod.apply(this, args)
+    );
     return result;
   };
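A hypothetical application of the `PerformanceEnhanced` class decorator; only the `monitoring.enabled` option is attested in this diff, and decorator syntax requires `experimentalDecorators` (the plain call form works without it):

```typescript
@PerformanceEnhanced({ monitoring: { enabled: true } })
class ReportService {
  // Wrapped by enhanceServiceMethod under the name "ReportService.generate".
  async generate(companyId: string) {
    return { companyId, generatedAt: new Date().toISOString() };
  }
}

// Equivalent without decorator syntax:
// const Enhanced = PerformanceEnhanced({ monitoring: { enabled: true } })(ReportService);
```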
package.json — 18 lines changed

@@ -8,14 +8,18 @@
   "build:analyze": "ANALYZE=true next build",
   "dev": "pnpm exec tsx server.ts",
   "dev:next-only": "next dev --turbopack",
-  "format": "npx prettier --write .",
-  "format:check": "npx prettier --check .",
+  "format": "pnpm format:prettier && pnpm format:biome",
+  "format:check": "pnpm format:check-prettier && pnpm format:check-biome",
+  "format:biome": "biome format --write",
+  "format:check-biome": "biome format",
+  "format:prettier": "npx prettier --write .",
+  "format:check-prettier": "npx prettier --check .",
   "lint": "next lint",
   "lint:fix": "npx eslint --fix",
-  "biome:check": "biome check .",
-  "biome:fix": "biome check --write .",
-  "biome:format": "biome format --write .",
-  "biome:lint": "biome lint .",
+  "biome:check": "biome check",
+  "biome:fix": "biome check --write",
+  "biome:format": "biome format --write",
+  "biome:lint": "biome lint",
   "prisma:generate": "prisma generate",
   "prisma:migrate": "prisma migrate dev",
   "prisma:seed": "pnpm exec tsx prisma/seed.ts",

@@ -37,6 +41,8 @@
   "test:csp:full": "pnpm test:csp && pnpm test:csp:validate && pnpm test:vitest tests/unit/enhanced-csp.test.ts tests/integration/csp-middleware.test.ts tests/integration/csp-report-endpoint.test.ts",
   "lint:md": "markdownlint-cli2 \"**/*.md\" \"!.trunk/**\" \"!.venv/**\" \"!node_modules/**\"",
   "lint:md:fix": "markdownlint-cli2 --fix \"**/*.md\" \"!.trunk/**\" \"!.venv/**\" \"!node_modules/**\"",
+  "lint:ci-warning": "biome ci --diagnostic-level=warn",
+  "lint:ci-error": "biome ci --diagnostic-level=error",
   "migration:backup": "pnpm exec tsx scripts/migration/backup-database.ts full",
   "migration:backup:schema": "pnpm exec tsx scripts/migration/backup-database.ts schema",
   "migration:backup:data": "pnpm exec tsx scripts/migration/backup-database.ts data",
pnpm-lock.yaml — 10139 lines changed (generated)

File diff suppressed because it is too large
@@ -15,10 +15,10 @@ The migration will be performed incrementally to minimize disruption. We will st

### Why tRPC?

- **End-to-End Type Safety:** Eliminates a class of runtime errors by ensuring the client and server conform to the same data contracts. TypeScript errors will appear at build time if the client and server are out of sync.
- **Improved Developer Experience:** Provides autocompletion for API procedures and their data types directly in the editor.
- **Simplified Data Fetching:** Replaces manual `fetch` calls and `useEffect` hooks with clean, declarative tRPC hooks (`useQuery`, `useMutation`).
- **No Code Generation:** Leverages TypeScript inference, avoiding a separate schema definition or code generation step.

### Integration Strategy: Gradual Adoption

@@ -240,11 +240,11 @@ export default function UsersPage() {

## 4. Next Steps & Future Enhancements

- **Authentication & Context:** Implement a `createContext` function to pass session data (e.g., from NextAuth.js) to your tRPC procedures. This will allow for protected procedures (see the sketch after this list).
- **Input Validation:** Extensively use `zod` in the `.input()` part of procedures to validate all incoming data.
- **Error Handling:** Implement robust error handling on both the client and server.
- **Mutations:** Begin migrating `POST`, `PUT`, and `DELETE` endpoints to tRPC mutations.
- **Optimistic UI:** For mutations, implement optimistic updates to provide a faster user experience.
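A minimal sketch of the protected-procedure idea, assuming tRPC v10-style `initTRPC`, NextAuth's `getServerSession`, and a hypothetical `@/lib/auth` path for the NextAuth options; none of this wiring is shown in the commit itself:

```typescript
import { initTRPC, TRPCError } from "@trpc/server";
import { getServerSession } from "next-auth";
import { authOptions } from "@/lib/auth"; // assumed location of the NextAuth config

// Context factory: exposes the session to every procedure.
export async function createContext() {
  const session = await getServerSession(authOptions);
  return { session };
}

type Context = Awaited<ReturnType<typeof createContext>>;
const t = initTRPC.context<Context>().create();

// Middleware that rejects unauthenticated callers before the resolver runs.
const isAuthed = t.middleware(({ ctx, next }) => {
  if (!ctx.session?.user) {
    throw new TRPCError({ code: "UNAUTHORIZED" });
  }
  return next();
});

export const protectedProcedure = t.procedure.use(isAuthed);
```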
---
@@ -6,33 +6,33 @@ This directory contains comprehensive migration scripts for deploying the new ar

### 1. Database Migrations

- `01-schema-migrations.sql` - Prisma database schema migrations
- `02-data-migrations.sql` - Data transformation scripts
- `validate-database.ts` - Database validation and health checks

### 2. Environment Configuration

- `environment-migration.ts` - Environment variable migration guide
- `config-validator.ts` - Configuration validation scripts

### 3. Deployment Scripts

- `deploy.ts` - Main deployment orchestrator
- `pre-deployment-checks.ts` - Pre-deployment validation
- `post-deployment-validation.ts` - Post-deployment verification
- `rollback.ts` - Rollback procedures

### 4. Health Checks

- `health-checks.ts` - Comprehensive system health validation
- `trpc-endpoint-tests.ts` - tRPC endpoint validation
- `batch-processing-tests.ts` - Batch processing system tests

### 5. Migration Utilities

- `backup-database.ts` - Database backup procedures
- `restore-database.ts` - Database restore procedures
- `migration-logger.ts` - Migration logging utilities

## Usage

@@ -94,9 +94,9 @@ The migration implements a blue-green deployment strategy:

## Safety Features

- Automatic database backups before migration
- Rollback scripts for quick recovery
- Health checks at each stage
- Progressive feature enablement
- Comprehensive logging and monitoring
- Backwards compatibility maintained during migration
@@ -450,7 +450,7 @@ Examples:
 `);
     process.exit(1);
   }
-}
+};

 runCommand()
   .then((result) => {

@@ -517,7 +517,7 @@ if (import.meta.url === `file://${process.argv[1]}`) {
   }

   return result;
-}
+};

 runTests()
   .then((result) => {