# 🚨 Database Connection Issues - Fixes Applied
## Issues Identified

From your logs:

```
Can't reach database server at `ep-tiny-math-a2zsshve-pooler.eu-central-1.aws.neon.tech:5432`
[NODE-CRON] [WARN] missed execution! Possible blocking IO or high CPU
```
## Root Causes

1. **Multiple PrismaClient instances** across schedulers
2. **No connection retry logic** for temporary failures
3. **No connection pooling optimization** for Neon
4. **Aggressive scheduler intervals** overwhelming the database
## Fixes Applied ✅
### 1. Connection Retry Logic (`lib/database-retry.ts`)

- **Automatic retry** for connection errors
- **Exponential backoff** (1s → 2s → 4s → 10s max)
- **Smart error detection** (only retry connection issues)
- **Configurable retry attempts** (default: 3 retries)
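For orientation, here is a minimal sketch of what a retry helper along these lines could look like. It is illustrative only; the option names and error heuristics are assumptions, not the actual `lib/database-retry.ts` API.

```typescript
// Illustrative sketch only; the real lib/database-retry.ts may differ.
interface RetryOptions {
  retries?: number;     // maximum number of retry attempts (default: 3)
  baseDelayMs?: number; // first backoff step (1s)
  maxDelayMs?: number;  // backoff ceiling (10s)
}

// Heuristic: only connection-level failures are worth retrying.
function isConnectionError(error: unknown): boolean {
  const message = error instanceof Error ? error.message : String(error);
  return /Can't reach database server|Connection refused|ECONNRESET|ETIMEDOUT/i.test(message);
}

export async function withDatabaseRetry<T>(
  operation: () => Promise<T>,
  { retries = 3, baseDelayMs = 1_000, maxDelayMs = 10_000 }: RetryOptions = {}
): Promise<T> {
  let attempt = 0;
  for (;;) {
    try {
      return await operation();
    } catch (error) {
      // Re-throw immediately for non-connection errors or when retries are exhausted.
      if (!isConnectionError(error) || attempt >= retries) throw error;
      // Exponential backoff: 1s -> 2s -> 4s -> ... capped at maxDelayMs.
      const delay = Math.min(baseDelayMs * 2 ** attempt, maxDelayMs);
      console.warn(`[db-retry] attempt ${attempt + 1}/${retries} failed, retrying in ${delay}ms`);
      await new Promise((resolve) => setTimeout(resolve, delay));
      attempt++;
    }
  }
}
```

Schedulers then wrap a database-heavy operation in `withDatabaseRetry(() => ...)`, which is what the next section refers to.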
### 2. Enhanced Schedulers

- **Import Processor**: Added retry wrapper around main processing
- **Session Processor**: Added retry wrapper around AI processing
- **Graceful degradation** when the database is temporarily unavailable
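As a hedged illustration of how a scheduler tick can combine the retry wrapper with graceful degradation: the function names below (`processPendingImports`, `runImportTick`) are placeholders, not the real importProcessor code.

```typescript
import { withDatabaseRetry } from "./database-retry.js"; // sketch from the previous section

// Placeholder for the scheduler's real unit of work.
async function processPendingImports(): Promise<void> {
  // ... read queued imports, parse them, write results via the shared prisma client ...
}

// Called from the cron job. Connection failures are retried, and if the database
// is still unreachable after the retries, the run is logged and skipped instead of
// crashing the scheduler process.
export async function runImportTick(): Promise<void> {
  try {
    await withDatabaseRetry(() => processPendingImports());
  } catch (error) {
    console.error("[import-processor] skipping run, database unavailable:", error);
  }
}
```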
### 3. Singleton Pattern Enforced

- **All schedulers now use** `import { prisma } from "./prisma.js"`
- **No more separate** `new PrismaClient()` instances
- **Shared connection pool** across all operations
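The singleton follows the standard Prisma pattern for long-running Node/Next.js apps; a sketch of what `lib/prisma.ts` typically looks like (the project's actual file may add logging or datasource options):

```typescript
import { PrismaClient } from "@prisma/client";

// Reuse one client, and therefore one connection pool, across every
// scheduler and route that imports this module.
const globalForPrisma = globalThis as unknown as { prisma?: PrismaClient };

export const prisma = globalForPrisma.prisma ?? new PrismaClient();

// In development, stash the instance on globalThis so hot reloads
// don't keep opening new pools against Neon.
if (process.env.NODE_ENV !== "production") globalForPrisma.prisma = prisma;
```

With Neon's low connection ceiling, the single shared pool is what keeps several schedulers plus the web app inside the limit.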
### 4. Neon-Specific Optimizations

- **Connection limit guidance**: 15 connections (below Neon's limit of 20)
- **Extended timeouts**: 30s to handle cold starts
- **SSL mode requirement**: `sslmode=require` for Neon
- **Application naming**: for better monitoring
## Immediate Actions Needed

### 1. Update Environment Variables

```bash
# Add to .env.local
USE_ENHANCED_POOLING=true
DATABASE_CONNECTION_LIMIT=15
DATABASE_POOL_TIMEOUT=30

# Update your DATABASE_URL to include:
DATABASE_URL="postgresql://user:pass@ep-tiny-math-a2zsshve-pooler.eu-central-1.aws.neon.tech:5432/db?sslmode=require&connection_limit=15&pool_timeout=30"
```
### 2. Reduce Scheduler Frequency (Optional)

```bash
# Less aggressive intervals
CSV_IMPORT_INTERVAL="*/30 * * * *"         # Every 30 min (was 15)
IMPORT_PROCESSING_INTERVAL="*/10 * * * *"  # Every 10 min (was 5)
SESSION_PROCESSING_INTERVAL="0 */2 * * *"  # Every 2 hours (was 1)
```
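This note doesn't show how the intervals are consumed. Assuming the schedulers use `node-cron` (as the log warning suggests), the wiring would look roughly like this, with the env var name and default taken from the block above and a placeholder job function:

```typescript
import cron from "node-cron";
import { runImportTick } from "./importProcessor.js"; // placeholder name from the earlier sketch

// Fall back to the less aggressive default suggested above when the env var is unset.
const importInterval = process.env.IMPORT_PROCESSING_INTERVAL ?? "*/10 * * * *";

// node-cron validates the expression and then invokes the callback on that schedule.
if (!cron.validate(importInterval)) {
  throw new Error(`Invalid IMPORT_PROCESSING_INTERVAL: ${importInterval}`);
}
cron.schedule(importInterval, () => {
  void runImportTick();
});
```

The CSV import and session processing jobs would be registered the same way with their own env vars.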
### 3. Run Configuration Check

```bash
pnpm db:check
```
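The real checker lives in `scripts/check-database-config.ts` (see Files Modified below); as a rough sketch under stated assumptions, such a script could validate the connection string like this:

```typescript
// Rough sketch of a DATABASE_URL sanity check; not the actual scripts/check-database-config.ts.
const url = process.env.DATABASE_URL;
if (!url) {
  throw new Error("DATABASE_URL is not set");
}

const params = new URL(url).searchParams;
const warnings: string[] = [];

if (params.get("sslmode") !== "require") {
  warnings.push("sslmode=require is missing (required for Neon)");
}
if (!params.get("connection_limit")) {
  warnings.push("connection_limit is not set (guidance: 15, below Neon's limit of 20)");
}
if (!params.get("pool_timeout")) {
  warnings.push("pool_timeout is not set (guidance: 30 to cover cold starts)");
}

if (warnings.length > 0) {
  warnings.forEach((w) => console.warn(`⚠️  ${w}`));
  process.exit(1);
}
console.log("✅ DATABASE_URL looks Neon-friendly");
```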
## Expected Results

✅ **Connection Stability**: Automatic retry on temporary failures

✅ **Resource Efficiency**: Single shared connection pool

✅ **Neon Optimization**: Proper connection limits and timeouts

✅ **Monitoring**: Health check endpoint for visibility

✅ **Graceful Degradation**: Schedulers won't crash on DB issues
## Monitoring

- **Health Endpoint**: `/api/admin/database-health`
- **Connection Logs**: Enhanced logging for pool events
- **Retry Logs**: Detailed retry attempt logging
- **Error Classification**: Retryable vs non-retryable errors
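The endpoint's implementation isn't included in this note. Assuming a Next.js App Router handler at `app/api/admin/database-health/route.ts` and the `@/lib/prisma` import alias (both assumptions), a minimal version might look like:

```typescript
// Illustrative sketch of app/api/admin/database-health/route.ts.
import { NextResponse } from "next/server";
import { prisma } from "@/lib/prisma";

export async function GET() {
  const startedAt = Date.now();
  try {
    // Cheapest possible round trip to confirm the shared pool can reach Neon.
    await prisma.$queryRaw`SELECT 1`;
    return NextResponse.json({ status: "ok", latencyMs: Date.now() - startedAt });
  } catch (error) {
    // NOTE: a real admin route would also check the caller's session before responding.
    return NextResponse.json(
      { status: "unreachable", error: error instanceof Error ? error.message : "unknown" },
      { status: 503 }
    );
  }
}
```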
## Files Modified

- `lib/database-retry.ts` - New retry utilities
- `lib/importProcessor.ts` - Added retry wrapper
- `lib/processingScheduler.ts` - Added retry wrapper
- `docs/neon-database-optimization.md` - Neon-specific guide
- `scripts/check-database-config.ts` - Configuration checker

The connection issues should be significantly reduced with these fixes! 🎯