Mirror of https://github.com/kjanat/livedash-node.git (synced 2026-01-16)
LiveDash Node Migration Guide v2.0.0
Overview
This guide provides step-by-step instructions for migrating LiveDash Node to version 2.0.0, which introduces a tRPC implementation and OpenAI Batch API integration for improved performance and cost efficiency.
🚀 New Features
tRPC Implementation
- Type-safe APIs: End-to-end TypeScript safety from client to server
- Improved Performance: Optimized query batching and caching
- Better Developer Experience: Auto-completion and type checking
- Simplified Authentication: Integrated with existing NextAuth.js setup
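The end-to-end type-safety idea can be illustrated without the library itself: the client imports only the server's inferred types, so mismatches surface at compile time. A minimal dependency-free sketch (the names `serverApi` and `getSession` are illustrative, not LiveDash's actual routers):

```typescript
// Illustrative sketch of end-to-end type safety; not the real LiveDash routers.
type Session = { id: string; status: "active" | "closed" };

// Server side: the single source of truth for input and output shapes.
const serverApi = {
  getSession: (input: { id: string }): Session => ({
    id: input.id,
    status: "active",
  }),
};

// Shared contract: client code imports only this type, never the implementation.
type Api = typeof serverApi;

// Client side: inputs and outputs are checked against the contract at compile
// time, so a typo like `session.stauts` fails the build instead of at runtime.
const client: Api = serverApi; // in tRPC, this proxy is generated over HTTP
const session = client.getSession({ id: "abc" });
```

tRPC automates exactly this sharing: the router's inferred types flow to the client, and Zod schemas validate inputs at runtime as well.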
OpenAI Batch API Integration
- 50% Cost Reduction: Batch processing reduces OpenAI API costs by half
- Enhanced Rate Limiting: Better throughput management
- Improved Reliability: Automatic retry mechanisms and error handling
- Automated Processing: Background batch job lifecycle management
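The retry mechanisms mentioned above typically follow an exponential-backoff pattern; a minimal sketch of that idea (the `withRetry` helper and its parameters are illustrative, not the shipped implementation):

```typescript
// Illustrative retry helper with exponential backoff; not the actual
// v2.0.0 implementation.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 500
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt === maxAttempts) break;
      // Wait 500 ms, 1000 ms, 2000 ms, ... before the next attempt.
      const delay = baseDelayMs * 2 ** (attempt - 1);
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}
```

In practice such a helper would wrap transient operations like polling a batch job's status against the OpenAI API.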
Enhanced Security & Performance
- Rate Limiting: In-memory rate limiting for all authentication endpoints
- Input Validation: Comprehensive Zod schemas for all user inputs
- Performance Monitoring: Built-in metrics collection and monitoring
- Database Optimizations: New indexes and query optimizations
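In-memory rate limiting can be sketched as a sliding window keyed by client identity. The defaults below mirror the `RATE_LIMIT_WINDOW_MS` and `RATE_LIMIT_MAX_REQUESTS` variables introduced later in this guide, but the code is illustrative rather than the production implementation:

```typescript
// Illustrative sliding-window rate limiter; not the production code.
const requestLog = new Map<string, number[]>();

function isAllowed(
  key: string,        // e.g. client IP or user id
  limit = 100,        // mirrors RATE_LIMIT_MAX_REQUESTS
  windowMs = 900_000  // mirrors RATE_LIMIT_WINDOW_MS (15 minutes)
): boolean {
  const now = Date.now();
  // Keep only timestamps that still fall inside the window.
  const recent = (requestLog.get(key) ?? []).filter((ts) => now - ts < windowMs);
  if (recent.length >= limit) {
    requestLog.set(key, recent);
    return false; // over the limit: reject this request
  }
  recent.push(now);
  requestLog.set(key, recent);
  return true;
}
```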
📋 Pre-Migration Checklist
System Requirements
- Node.js 18+ installed
- PostgreSQL 13+ database
- pg_dump and pg_restore utilities available
- Git repository with clean working directory
- OpenAI API key (for production)
- Sufficient disk space for backups (at least 2GB)
Environment Preparation
- Review current environment variables
- Ensure database connection is working
- Verify all tests are passing
- Create a backup of your current deployment
- Notify team members of planned downtime
🔧 Migration Process
Phase 1: Pre-Migration Setup
1.1 Install Migration Tools
# Ensure you have the latest dependencies
pnpm install
# Verify migration scripts are available
pnpm migration:validate-env --help
1.2 Run Pre-Deployment Checks
# Run comprehensive pre-deployment validation
pnpm migration:pre-check
# This will validate:
# - Environment configuration
# - Database connection and schema
# - Dependencies
# - File system permissions
# - OpenAI API access
# - tRPC infrastructure readiness
1.3 Environment Configuration
# Generate new environment variables
pnpm migration:migrate-env
# Review the generated files:
# - .env.migration.template
# - ENVIRONMENT_MIGRATION_GUIDE.md
Add these new environment variables to your .env.local:
# tRPC Configuration
TRPC_ENDPOINT_URL="http://localhost:3000/api/trpc"
TRPC_BATCH_TIMEOUT="30000"
TRPC_MAX_BATCH_SIZE="100"
# Batch Processing Configuration
BATCH_PROCESSING_ENABLED="true"
BATCH_CREATE_INTERVAL="*/5 * * * *"
BATCH_STATUS_CHECK_INTERVAL="*/2 * * * *"
BATCH_RESULT_PROCESSING_INTERVAL="*/1 * * * *"
BATCH_MAX_REQUESTS="1000"
BATCH_TIMEOUT_HOURS="24"
# Security & Performance
RATE_LIMIT_WINDOW_MS="900000"
RATE_LIMIT_MAX_REQUESTS="100"
PERFORMANCE_MONITORING_ENABLED="true"
METRICS_COLLECTION_INTERVAL="60"
# Migration Settings (temporary)
MIGRATION_MODE="production"
MIGRATION_BACKUP_ENABLED="true"
MIGRATION_ROLLBACK_ENABLED="true"
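It is worth failing fast at startup if any of these variables are missing or malformed. A minimal check sketch (the `requireIntEnv` helper is illustrative; the real app may validate with Zod schemas instead):

```typescript
// Illustrative startup validation for the new environment variables.
function requireIntEnv(name: string, fallback?: number): number {
  const raw =
    process.env[name] ?? (fallback !== undefined ? String(fallback) : undefined);
  if (raw === undefined) throw new Error(`Missing required env var: ${name}`);
  const value = Number.parseInt(raw, 10);
  if (Number.isNaN(value)) {
    throw new Error(`${name} must be an integer, got "${raw}"`);
  }
  return value;
}

// Parse a few of the batch settings, with the guide's defaults as fallbacks.
const batchConfig = {
  enabled: process.env.BATCH_PROCESSING_ENABLED === "true",
  maxRequests: requireIntEnv("BATCH_MAX_REQUESTS", 1000),
  timeoutHours: requireIntEnv("BATCH_TIMEOUT_HOURS", 24),
};
```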
Phase 2: Database Migration
2.1 Create Database Backup
# Create full database backup
pnpm migration:backup
# Verify backup was created
pnpm migration:backup list
2.2 Validate Database Schema
# Validate current database state
pnpm migration:validate-db
2.3 Apply Database Migrations
# Run Prisma migrations
pnpm prisma:migrate
# Apply additional schema changes
psql $DATABASE_URL -f scripts/migration/01-schema-migrations.sql
# Verify migration success
pnpm migration:validate-db
Phase 3: Application Deployment
3.1 Dry Run Deployment
# Test deployment process without making changes
pnpm migration:deploy:dry-run
3.2 Full Deployment
# Execute full deployment
pnpm migration:deploy
# This will:
# 1. Apply database schema changes
# 2. Deploy new application code
# 3. Restart services with minimal downtime
# 4. Enable tRPC endpoints progressively
# 5. Activate batch processing system
# 6. Run post-deployment validation
Phase 4: Post-Migration Validation
4.1 System Health Check
# Run comprehensive health checks
pnpm migration:health-check
# Generate detailed health report
pnpm migration:health-report
4.2 Feature Validation
# Test tRPC endpoints
pnpm exec tsx scripts/migration/trpc-endpoint-tests.ts
# Test batch processing system
pnpm exec tsx scripts/migration/batch-processing-tests.ts
# Run full test suite
pnpm migration:test
🔄 Rollback Procedure
If issues occur during migration, you can roll back using these steps:
Automatic Rollback
# Quick rollback (if migration failed)
pnpm migration:rollback
# Dry run rollback to see what would happen
pnpm migration:rollback:dry-run
Manual Rollback Steps
- Stop the application
- Restore database from backup
- Revert to previous code version
- Restart services
- Verify system functionality
Rollback Commands
# Create rollback snapshot (before migration)
pnpm migration:rollback:snapshot
# Restore from specific backup
pnpm migration:rollback --backup /path/to/backup.sql
# Skip database rollback (code only)
pnpm migration:rollback --no-database
📊 Monitoring and Validation
Post-Migration Monitoring
1. Application Health
# Check system health every hour for the first day
0 * * * * cd /path/to/livedash && pnpm migration:health-check
# Monitor logs for errors
tail -f logs/migration.log
2. tRPC Performance
- Monitor response times for tRPC endpoints
- Check error rates in application logs
- Verify type safety is working correctly
3. Batch Processing
- Monitor batch job completion rates
- Check OpenAI API cost reduction
- Verify AI processing pipeline functionality
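The response-time monitoring above can be approximated with a simple timing wrapper around any async call; the 500 ms default matches the target in the Key Metrics section (the `timed` helper is illustrative, not the built-in metrics collector):

```typescript
// Illustrative latency wrapper; logs a warning when a call exceeds its target.
async function timed<T>(
  label: string,
  fn: () => Promise<T>,
  warnAboveMs = 500 // mirrors the 500 ms tRPC response-time target
): Promise<T> {
  const start = Date.now();
  try {
    return await fn();
  } finally {
    const elapsed = Date.now() - start;
    if (elapsed > warnAboveMs) {
      console.warn(`[perf] ${label} took ${elapsed}ms (target < ${warnAboveMs}ms)`);
    }
  }
}
```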
Key Metrics to Monitor
Performance Metrics
- Response Times: tRPC endpoints should respond within 500ms
- Database Queries: Complex queries should complete within 1s
- Memory Usage: Should remain below 80% of allocated memory
- CPU Usage: Process should remain responsive
Business Metrics
- AI Processing Cost: Should see ~50% reduction in OpenAI costs
- Processing Throughput: Batch processing should handle larger volumes
- Error Rates: Should remain below 1% for critical operations
- User Experience: No degradation in dashboard performance
🛠 Troubleshooting
Common Issues and Solutions
tRPC Endpoints Not Working
# Check if tRPC files exist
ls -la 'app/api/trpc/[trpc]/route.ts'
ls -la server/routers/_app.ts
# Verify tRPC router exports
pnpm exec tsx -e "import('./server/routers/_app').then(m => console.log(Object.keys(m)))"
# Test endpoints manually
curl -X POST http://localhost:3000/api/trpc/auth.getSession \
-H "Content-Type: application/json" \
-d '{"json": null}'
Batch Processing Issues
# Check batch processing components
pnpm exec tsx scripts/migration/batch-processing-tests.ts
# Verify OpenAI API access
curl -H "Authorization: Bearer $OPENAI_API_KEY" \
https://api.openai.com/v1/models
# Check batch job status
psql $DATABASE_URL -c "SELECT status, COUNT(*) FROM \"AIBatchRequest\" GROUP BY status;"
Database Issues
# Check database connection
pnpm db:check
# Verify schema integrity
pnpm migration:validate-db
# Check for missing indexes
psql $DATABASE_URL -c "
SELECT schemaname, tablename, indexname
FROM pg_indexes
WHERE tablename IN ('Session', 'AIProcessingRequest', 'AIBatchRequest')
ORDER BY tablename, indexname;
"
Environment Configuration Issues
# Validate environment variables
pnpm migration:validate-env
# Check for missing variables
env | grep -E "(TRPC|BATCH|RATE_LIMIT)" | sort
# Verify environment file syntax
node -e "require('dotenv').config({path: '.env.local'}); console.log('✅ Environment file is valid')"
Getting Help
Support Channels
- Check Migration Logs: Review logs/migration.log for detailed error information
- Run Diagnostics: Use the built-in health check and validation tools
- Documentation: Refer to component-specific documentation in docs/
- Emergency Rollback: Use rollback procedures if issues persist
Useful Commands
# Get detailed system information
pnpm migration:health-report
# Check all migration script availability
ls -la scripts/migration/
# Verify package integrity
pnpm install --frozen-lockfile
# Test database connectivity
pnpm prisma db pull --print
📝 Post-Migration Tasks
Immediate Tasks (First 24 Hours)
- Monitor application logs for errors
- Verify all tRPC endpoints are responding correctly
- Check batch processing job completion
- Validate AI cost reduction in OpenAI dashboard
- Run full test suite to ensure no regressions
- Update documentation and team knowledge
Medium-term Tasks (First Week)
- Optimize batch processing parameters based on usage
- Fine-tune rate limiting settings
- Set up monitoring alerts for new components
- Train team on new tRPC APIs
- Plan gradual feature adoption
Long-term Tasks (First Month)
- Analyze cost savings and performance improvements
- Consider additional tRPC endpoint implementations
- Optimize batch processing schedules
- Review and adjust security settings
- Plan next phase improvements
🔒 Security Considerations
New Security Features
- Enhanced Rate Limiting: Applied to all authentication endpoints
- Input Validation: Comprehensive Zod schemas prevent injection attacks
- Secure Headers: HTTPS enforcement in production
- Token Security: JWT with proper expiration and rotation
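Token expiration can be checked from the standard `exp` claim (seconds since epoch, per RFC 7519) without the signing secret; note this only inspects the payload and does not replace signature verification, which still requires the secret. A minimal decode sketch:

```typescript
// Illustrative check of a JWT's `exp` claim. This does NOT verify the
// signature; it only decodes the payload segment.
function isTokenExpired(jwt: string, nowMs = Date.now()): boolean {
  const parts = jwt.split(".");
  if (parts.length !== 3) return true; // malformed token: treat as expired
  const payload = JSON.parse(
    Buffer.from(parts[1], "base64url").toString("utf8")
  );
  if (typeof payload.exp !== "number") return true; // no exp claim: reject
  return payload.exp * 1000 <= nowMs;
}
```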
Security Checklist
- Verify rate limiting is working correctly
- Test input validation on all forms
- Ensure HTTPS is enforced in production
- Validate JWT token handling
- Check for proper error message sanitization
- Verify OpenAI API key is not exposed in logs
📈 Expected Improvements
Performance Improvements
- 50% reduction in OpenAI API costs through batch processing
- 30% improvement in API response times with tRPC
- 25% reduction in database query time with new indexes
- Enhanced scalability for processing larger session volumes
Developer Experience
- Type Safety: End-to-end TypeScript types from client to server
- Better APIs: Self-documenting tRPC procedures
- Improved Testing: More reliable test suite with better validation
- Enhanced Monitoring: Detailed health checks and reporting
Operational Benefits
- Automated Batch Processing: Reduced manual intervention
- Better Error Handling: Comprehensive retry mechanisms
- Improved Monitoring: Real-time health status and metrics
- Simplified Deployment: Automated migration and rollback procedures
📞 Support
For issues during migration:
- Check the logs: logs/migration.log
- Run health checks: pnpm migration:health-check
- Review troubleshooting section above
- Use rollback if needed: pnpm migration:rollback
Migration completed successfully? 🎉
Your LiveDash Node application is now running version 2.0.0 with tRPC and Batch API integration!
Migration Guide v2.0.0 - Updated January 2025