Mirror of https://github.com/kjanat/livedash-node.git (synced 2026-01-16 15:32:10 +01:00)
Migrate database from SQLite to PostgreSQL
- Replace SQLite with PostgreSQL using Neon as provider
- Add environment-based database URL configuration
- Create separate test database setup with DATABASE_URL_TEST
- Reset migration history and generate fresh PostgreSQL schema
- Add comprehensive migration documentation
- Include database unit tests for connection validation
docs/postgresql-migration.md (new file, 130 lines)
@@ -0,0 +1,130 @@
# PostgreSQL Migration Documentation

## Overview

Successfully migrated the livedash-node application from SQLite to PostgreSQL, using Neon as the database provider. This migration provides better scalability, performance, and production readiness.

## Migration Summary

### What Was Changed

1. **Database Provider**: Changed from SQLite to PostgreSQL in `prisma/schema.prisma`
2. **Environment Configuration**: Updated to use environment-based database URL selection
3. **Test Setup**: Configured a separate test database using `DATABASE_URL_TEST`
4. **Migration History**: Reset and created fresh PostgreSQL migrations
### Database Configuration

#### Production/Development

- **Provider**: PostgreSQL (Neon)
- **Environment Variable**: `DATABASE_URL`
- **Connection**: Neon PostgreSQL cluster

#### Testing

- **Provider**: PostgreSQL (Neon, separate database)
- **Environment Variable**: `DATABASE_URL_TEST`
- **Test Setup**: Automatically switches to the test database during test runs
### Files Modified

1. **`prisma/schema.prisma`**
   - Changed provider from `sqlite` to `postgresql`
   - Updated URL to use `env("DATABASE_URL")` (see the sketch after this list)

2. **`tests/setup.ts`**
   - Added logic to use `DATABASE_URL_TEST` when available
   - Ensures test isolation with a separate database

3. **`.env`** (created)
   - Contains `DATABASE_URL` for Prisma CLI operations

4. **`.env.local`** (existing)
   - Contains both `DATABASE_URL` and `DATABASE_URL_TEST`
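For reference, the resulting datasource block matches the `prisma/schema.prisma` diff included in this commit:

```prisma
datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}
```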
### Database Schema

All existing models and relationships were preserved:

- **Company**: Multi-tenant root entity
- **User**: Authentication and authorization
- **Session**: Processed session data
- **SessionImport**: Raw CSV import data
- **Message**: Individual conversation messages
- **Question**: Normalized question storage
- **SessionQuestion**: Session-question relationships
- **AIProcessingRequest**: AI cost tracking
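As a condensed illustration of what "preserved" means here, two of the models reconstructed from the migration SQL below (field lists abridged; the full definitions live in `prisma/schema.prisma`):

```prisma
// Abridged sketch reconstructed from the migration SQL in this commit.
model Company {
  id        String   @id
  name      String
  users     User[]
  createdAt DateTime @default(now())
}

model User {
  id        String   @id
  email     String   @unique
  role      UserRole @default(USER)
  companyId String
  company   Company  @relation(fields: [companyId], references: [id], onDelete: Cascade)
}
```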
### Migration Process

1. **Schema Update**: Changed provider to PostgreSQL
2. **Migration Reset**: Removed SQLite migration history
3. **Fresh Migration**: Created new PostgreSQL migration (commands sketched below)
4. **Client Generation**: Generated new Prisma client for PostgreSQL
5. **Database Seeding**: Applied initial seed data
6. **Testing**: Verified all functionality works with PostgreSQL
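The commit does not record the exact commands, but with the standard Prisma CLI the sequence would look roughly like this (a sketch, not a transcript; the migration name `init` is assumed):

```bash
npx prisma migrate reset            # drop the dev database and clear old migration state
npx prisma migrate dev --name init  # generate a fresh PostgreSQL migration
npx prisma generate                 # regenerate the Prisma client
npx prisma db seed                  # apply initial seed data
npx vitest run                      # run the test suite against DATABASE_URL_TEST
```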
### Benefits Achieved

✅ **Production-Ready**: PostgreSQL is enterprise-grade and scalable
✅ **Better Performance**: Superior query performance and optimization
✅ **Advanced Features**: Full JSON support, arrays, advanced indexing
✅ **Test Isolation**: Separate test database prevents data conflicts
✅ **Consistency**: Same database engine across all environments
✅ **Cloud-Native**: Neon provides managed PostgreSQL with excellent DX
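As a concrete illustration of the JSON support: `Company.dashboardOpts` is stored as `JSONB`, so dashboard settings can be filtered directly in SQL. The `theme` key below is hypothetical, purely for illustration:

```sql
-- Hypothetical query: companies whose dashboardOpts contains "theme": "dark"
SELECT "id", "name"
FROM "Company"
WHERE "dashboardOpts" ->> 'theme' = 'dark';
```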
### Environment Variables

```env
# Production/Development Database
DATABASE_URL="postgresql://user:pass@host/database?sslmode=require"

# Test Database (separate Neon database)
DATABASE_URL_TEST="postgresql://user:pass@test-host/test-database?sslmode=require"
```
### Test Configuration

Tests automatically use the test database when `DATABASE_URL_TEST` is set:

```typescript
// In tests/setup.ts
if (process.env.DATABASE_URL_TEST) {
  process.env.DATABASE_URL = process.env.DATABASE_URL_TEST;
}
```
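Because `PrismaClient` reads the connection string when it is constructed, this override must run before any client is created. Where that ordering is hard to guarantee, the URL can also be passed explicitly; a minimal sketch, assuming the datasource is named `db` as in `prisma/schema.prisma`:

```typescript
import { PrismaClient } from '@prisma/client';

// Explicitly pin the client to whatever URL the test setup resolved.
const prisma = new PrismaClient({
  datasources: { db: { url: process.env.DATABASE_URL } },
});
```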
### Verification

All tests pass successfully:

- ✅ Environment configuration tests
- ✅ Transcript fetcher tests
- ✅ Database connection tests
- ✅ Schema validation tests
- ✅ CRUD operation tests
### Next Steps

1. **Data Import**: Import production data if needed
2. **Performance Monitoring**: Monitor query performance in production
3. **Backup Strategy**: Configure automated backups via Neon
4. **Connection Pooling**: Consider connection pooling for high-traffic scenarios (see the sketch after this list)
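One common setup, sketched as an assumption rather than a tested configuration: point `DATABASE_URL` at Neon's pooled endpoint (the host with the `-pooler` suffix) and cap Prisma's pool via connection-string parameters. The host and numbers below are placeholders:

```env
# Hypothetical pooled connection string; host and limits are placeholders.
DATABASE_URL="postgresql://user:pass@host-pooler/database?sslmode=require&connection_limit=10&pool_timeout=20"
```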
### Rollback Plan

If rollback is needed:

1. Revert `prisma/schema.prisma` to the SQLite configuration (see the sketch after this list)
2. Restore SQLite migration files from git history
3. Update environment variables
4. Run `prisma migrate reset` and `prisma generate`
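For step 1, the reverted datasource block would match the configuration removed in this commit (shown in the `prisma/schema.prisma` diff below):

```prisma
datasource db {
  provider = "sqlite"
  url      = "file:./dev.db"
}
```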
## Conclusion

The PostgreSQL migration was successful and provides a solid foundation for production deployment. The application now benefits from PostgreSQL's advanced features while maintaining full test isolation and development workflow compatibility.
Deleted: previous SQLite migration (183 lines)
@@ -1,183 +0,0 @@
-- CreateTable
CREATE TABLE "Company" (
    "id" TEXT NOT NULL PRIMARY KEY,
    "name" TEXT NOT NULL,
    "csvUrl" TEXT NOT NULL,
    "csvUsername" TEXT,
    "csvPassword" TEXT,
    "sentimentAlert" REAL,
    "dashboardOpts" JSONB,
    "createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "updatedAt" DATETIME NOT NULL
);

-- CreateTable
CREATE TABLE "User" (
    "id" TEXT NOT NULL PRIMARY KEY,
    "email" TEXT NOT NULL,
    "password" TEXT NOT NULL,
    "role" TEXT NOT NULL DEFAULT 'USER',
    "companyId" TEXT NOT NULL,
    "resetToken" TEXT,
    "resetTokenExpiry" DATETIME,
    "createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "updatedAt" DATETIME NOT NULL,
    CONSTRAINT "User_companyId_fkey" FOREIGN KEY ("companyId") REFERENCES "Company" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);

-- CreateTable
CREATE TABLE "Session" (
    "id" TEXT NOT NULL PRIMARY KEY,
    "companyId" TEXT NOT NULL,
    "importId" TEXT,
    "startTime" DATETIME NOT NULL,
    "endTime" DATETIME NOT NULL,
    "ipAddress" TEXT,
    "country" TEXT,
    "fullTranscriptUrl" TEXT,
    "avgResponseTime" REAL,
    "initialMsg" TEXT,
    "language" TEXT,
    "messagesSent" INTEGER,
    "sentiment" TEXT,
    "escalated" BOOLEAN,
    "forwardedHr" BOOLEAN,
    "category" TEXT,
    "summary" TEXT,
    "processed" BOOLEAN NOT NULL DEFAULT false,
    "createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "updatedAt" DATETIME NOT NULL,
    CONSTRAINT "Session_companyId_fkey" FOREIGN KEY ("companyId") REFERENCES "Company" ("id") ON DELETE CASCADE ON UPDATE CASCADE,
    CONSTRAINT "Session_importId_fkey" FOREIGN KEY ("importId") REFERENCES "SessionImport" ("id") ON DELETE SET NULL ON UPDATE CASCADE
);

-- CreateTable
CREATE TABLE "SessionImport" (
    "id" TEXT NOT NULL PRIMARY KEY,
    "companyId" TEXT NOT NULL,
    "externalSessionId" TEXT NOT NULL,
    "startTimeRaw" TEXT NOT NULL,
    "endTimeRaw" TEXT NOT NULL,
    "ipAddress" TEXT,
    "countryCode" TEXT,
    "language" TEXT,
    "messagesSent" INTEGER,
    "sentimentRaw" TEXT,
    "escalatedRaw" TEXT,
    "forwardedHrRaw" TEXT,
    "fullTranscriptUrl" TEXT,
    "avgResponseTimeSeconds" REAL,
    "tokens" INTEGER,
    "tokensEur" REAL,
    "category" TEXT,
    "initialMessage" TEXT,
    "rawTranscriptContent" TEXT,
    "status" TEXT NOT NULL DEFAULT 'QUEUED',
    "errorMsg" TEXT,
    "processedAt" DATETIME,
    "createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    CONSTRAINT "SessionImport_companyId_fkey" FOREIGN KEY ("companyId") REFERENCES "Company" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);

-- CreateTable
CREATE TABLE "Message" (
    "id" TEXT NOT NULL PRIMARY KEY,
    "sessionId" TEXT NOT NULL,
    "timestamp" DATETIME,
    "role" TEXT NOT NULL,
    "content" TEXT NOT NULL,
    "order" INTEGER NOT NULL,
    "createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    CONSTRAINT "Message_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "Session" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);

-- CreateTable
CREATE TABLE "Question" (
    "id" TEXT NOT NULL PRIMARY KEY,
    "content" TEXT NOT NULL,
    "createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
);

-- CreateTable
CREATE TABLE "SessionQuestion" (
    "id" TEXT NOT NULL PRIMARY KEY,
    "sessionId" TEXT NOT NULL,
    "questionId" TEXT NOT NULL,
    "order" INTEGER NOT NULL,
    "createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    CONSTRAINT "SessionQuestion_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "Session" ("id") ON DELETE CASCADE ON UPDATE CASCADE,
    CONSTRAINT "SessionQuestion_questionId_fkey" FOREIGN KEY ("questionId") REFERENCES "Question" ("id") ON DELETE RESTRICT ON UPDATE CASCADE
);

-- CreateTable
CREATE TABLE "AIProcessingRequest" (
    "id" TEXT NOT NULL PRIMARY KEY,
    "sessionId" TEXT NOT NULL,
    "openaiRequestId" TEXT,
    "model" TEXT NOT NULL,
    "serviceTier" TEXT,
    "systemFingerprint" TEXT,
    "promptTokens" INTEGER NOT NULL,
    "completionTokens" INTEGER NOT NULL,
    "totalTokens" INTEGER NOT NULL,
    "cachedTokens" INTEGER,
    "audioTokensPrompt" INTEGER,
    "reasoningTokens" INTEGER,
    "audioTokensCompletion" INTEGER,
    "acceptedPredictionTokens" INTEGER,
    "rejectedPredictionTokens" INTEGER,
    "promptTokenCost" REAL NOT NULL,
    "completionTokenCost" REAL NOT NULL,
    "totalCostEur" REAL NOT NULL,
    "processingType" TEXT NOT NULL,
    "success" BOOLEAN NOT NULL,
    "errorMessage" TEXT,
    "requestedAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "completedAt" DATETIME,
    CONSTRAINT "AIProcessingRequest_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "Session" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);

-- CreateIndex
CREATE UNIQUE INDEX "User_email_key" ON "User"("email");

-- CreateIndex
CREATE UNIQUE INDEX "Session_importId_key" ON "Session"("importId");

-- CreateIndex
CREATE INDEX "Session_companyId_startTime_idx" ON "Session"("companyId", "startTime");

-- CreateIndex
CREATE UNIQUE INDEX "SessionImport_externalSessionId_key" ON "SessionImport"("externalSessionId");

-- CreateIndex
CREATE INDEX "SessionImport_status_idx" ON "SessionImport"("status");

-- CreateIndex
CREATE UNIQUE INDEX "SessionImport_companyId_externalSessionId_key" ON "SessionImport"("companyId", "externalSessionId");

-- CreateIndex
CREATE INDEX "Message_sessionId_order_idx" ON "Message"("sessionId", "order");

-- CreateIndex
CREATE UNIQUE INDEX "Message_sessionId_order_key" ON "Message"("sessionId", "order");

-- CreateIndex
CREATE UNIQUE INDEX "Question_content_key" ON "Question"("content");

-- CreateIndex
CREATE INDEX "SessionQuestion_sessionId_idx" ON "SessionQuestion"("sessionId");

-- CreateIndex
CREATE UNIQUE INDEX "SessionQuestion_sessionId_questionId_key" ON "SessionQuestion"("sessionId", "questionId");

-- CreateIndex
CREATE UNIQUE INDEX "SessionQuestion_sessionId_order_key" ON "SessionQuestion"("sessionId", "order");

-- CreateIndex
CREATE INDEX "AIProcessingRequest_sessionId_idx" ON "AIProcessingRequest"("sessionId");

-- CreateIndex
CREATE INDEX "AIProcessingRequest_requestedAt_idx" ON "AIProcessingRequest"("requestedAt");

-- CreateIndex
CREATE INDEX "AIProcessingRequest_model_idx" ON "AIProcessingRequest"("model");
Added: new PostgreSQL migration (227 lines)
@@ -0,0 +1,227 @@
-- CreateEnum
CREATE TYPE "UserRole" AS ENUM ('ADMIN', 'USER', 'AUDITOR');

-- CreateEnum
CREATE TYPE "SentimentCategory" AS ENUM ('POSITIVE', 'NEUTRAL', 'NEGATIVE');

-- CreateEnum
CREATE TYPE "SessionCategory" AS ENUM ('SCHEDULE_HOURS', 'LEAVE_VACATION', 'SICK_LEAVE_RECOVERY', 'SALARY_COMPENSATION', 'CONTRACT_HOURS', 'ONBOARDING', 'OFFBOARDING', 'WORKWEAR_STAFF_PASS', 'TEAM_CONTACTS', 'PERSONAL_QUESTIONS', 'ACCESS_LOGIN', 'SOCIAL_QUESTIONS', 'UNRECOGNIZED_OTHER');

-- CreateEnum
CREATE TYPE "ImportStatus" AS ENUM ('QUEUED', 'PROCESSING', 'DONE', 'ERROR');

-- CreateTable
CREATE TABLE "Company" (
    "id" TEXT NOT NULL,
    "name" TEXT NOT NULL,
    "csvUrl" TEXT NOT NULL,
    "csvUsername" TEXT,
    "csvPassword" TEXT,
    "sentimentAlert" DOUBLE PRECISION,
    "dashboardOpts" JSONB,
    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "updatedAt" TIMESTAMP(3) NOT NULL,

    CONSTRAINT "Company_pkey" PRIMARY KEY ("id")
);

-- CreateTable
CREATE TABLE "User" (
    "id" TEXT NOT NULL,
    "email" TEXT NOT NULL,
    "password" TEXT NOT NULL,
    "role" "UserRole" NOT NULL DEFAULT 'USER',
    "companyId" TEXT NOT NULL,
    "resetToken" TEXT,
    "resetTokenExpiry" TIMESTAMP(3),
    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "updatedAt" TIMESTAMP(3) NOT NULL,

    CONSTRAINT "User_pkey" PRIMARY KEY ("id")
);

-- CreateTable
CREATE TABLE "Session" (
    "id" TEXT NOT NULL,
    "companyId" TEXT NOT NULL,
    "importId" TEXT,
    "startTime" TIMESTAMP(3) NOT NULL,
    "endTime" TIMESTAMP(3) NOT NULL,
    "ipAddress" TEXT,
    "country" TEXT,
    "fullTranscriptUrl" TEXT,
    "avgResponseTime" DOUBLE PRECISION,
    "initialMsg" TEXT,
    "language" TEXT,
    "messagesSent" INTEGER,
    "sentiment" "SentimentCategory",
    "escalated" BOOLEAN,
    "forwardedHr" BOOLEAN,
    "category" "SessionCategory",
    "summary" TEXT,
    "processed" BOOLEAN NOT NULL DEFAULT false,
    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "updatedAt" TIMESTAMP(3) NOT NULL,

    CONSTRAINT "Session_pkey" PRIMARY KEY ("id")
);

-- CreateTable
CREATE TABLE "SessionImport" (
    "id" TEXT NOT NULL,
    "companyId" TEXT NOT NULL,
    "externalSessionId" TEXT NOT NULL,
    "startTimeRaw" TEXT NOT NULL,
    "endTimeRaw" TEXT NOT NULL,
    "ipAddress" TEXT,
    "countryCode" TEXT,
    "language" TEXT,
    "messagesSent" INTEGER,
    "sentimentRaw" TEXT,
    "escalatedRaw" TEXT,
    "forwardedHrRaw" TEXT,
    "fullTranscriptUrl" TEXT,
    "avgResponseTimeSeconds" DOUBLE PRECISION,
    "tokens" INTEGER,
    "tokensEur" DOUBLE PRECISION,
    "category" TEXT,
    "initialMessage" TEXT,
    "rawTranscriptContent" TEXT,
    "status" "ImportStatus" NOT NULL DEFAULT 'QUEUED',
    "errorMsg" TEXT,
    "processedAt" TIMESTAMP(3),
    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,

    CONSTRAINT "SessionImport_pkey" PRIMARY KEY ("id")
);

-- CreateTable
CREATE TABLE "Message" (
    "id" TEXT NOT NULL,
    "sessionId" TEXT NOT NULL,
    "timestamp" TIMESTAMP(3),
    "role" TEXT NOT NULL,
    "content" TEXT NOT NULL,
    "order" INTEGER NOT NULL,
    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,

    CONSTRAINT "Message_pkey" PRIMARY KEY ("id")
);

-- CreateTable
CREATE TABLE "Question" (
    "id" TEXT NOT NULL,
    "content" TEXT NOT NULL,
    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,

    CONSTRAINT "Question_pkey" PRIMARY KEY ("id")
);

-- CreateTable
CREATE TABLE "SessionQuestion" (
    "id" TEXT NOT NULL,
    "sessionId" TEXT NOT NULL,
    "questionId" TEXT NOT NULL,
    "order" INTEGER NOT NULL,
    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,

    CONSTRAINT "SessionQuestion_pkey" PRIMARY KEY ("id")
);

-- CreateTable
CREATE TABLE "AIProcessingRequest" (
    "id" TEXT NOT NULL,
    "sessionId" TEXT NOT NULL,
    "openaiRequestId" TEXT,
    "model" TEXT NOT NULL,
    "serviceTier" TEXT,
    "systemFingerprint" TEXT,
    "promptTokens" INTEGER NOT NULL,
    "completionTokens" INTEGER NOT NULL,
    "totalTokens" INTEGER NOT NULL,
    "cachedTokens" INTEGER,
    "audioTokensPrompt" INTEGER,
    "reasoningTokens" INTEGER,
    "audioTokensCompletion" INTEGER,
    "acceptedPredictionTokens" INTEGER,
    "rejectedPredictionTokens" INTEGER,
    "promptTokenCost" DOUBLE PRECISION NOT NULL,
    "completionTokenCost" DOUBLE PRECISION NOT NULL,
    "totalCostEur" DOUBLE PRECISION NOT NULL,
    "processingType" TEXT NOT NULL,
    "success" BOOLEAN NOT NULL,
    "errorMessage" TEXT,
    "requestedAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "completedAt" TIMESTAMP(3),

    CONSTRAINT "AIProcessingRequest_pkey" PRIMARY KEY ("id")
);

-- CreateIndex
CREATE UNIQUE INDEX "User_email_key" ON "User"("email");

-- CreateIndex
CREATE UNIQUE INDEX "Session_importId_key" ON "Session"("importId");

-- CreateIndex
CREATE INDEX "Session_companyId_startTime_idx" ON "Session"("companyId", "startTime");

-- CreateIndex
CREATE UNIQUE INDEX "SessionImport_externalSessionId_key" ON "SessionImport"("externalSessionId");

-- CreateIndex
CREATE INDEX "SessionImport_status_idx" ON "SessionImport"("status");

-- CreateIndex
CREATE UNIQUE INDEX "SessionImport_companyId_externalSessionId_key" ON "SessionImport"("companyId", "externalSessionId");

-- CreateIndex
CREATE INDEX "Message_sessionId_order_idx" ON "Message"("sessionId", "order");

-- CreateIndex
CREATE UNIQUE INDEX "Message_sessionId_order_key" ON "Message"("sessionId", "order");

-- CreateIndex
CREATE UNIQUE INDEX "Question_content_key" ON "Question"("content");

-- CreateIndex
CREATE INDEX "SessionQuestion_sessionId_idx" ON "SessionQuestion"("sessionId");

-- CreateIndex
CREATE UNIQUE INDEX "SessionQuestion_sessionId_questionId_key" ON "SessionQuestion"("sessionId", "questionId");

-- CreateIndex
CREATE UNIQUE INDEX "SessionQuestion_sessionId_order_key" ON "SessionQuestion"("sessionId", "order");

-- CreateIndex
CREATE INDEX "AIProcessingRequest_sessionId_idx" ON "AIProcessingRequest"("sessionId");

-- CreateIndex
CREATE INDEX "AIProcessingRequest_requestedAt_idx" ON "AIProcessingRequest"("requestedAt");

-- CreateIndex
CREATE INDEX "AIProcessingRequest_model_idx" ON "AIProcessingRequest"("model");

-- AddForeignKey
ALTER TABLE "User" ADD CONSTRAINT "User_companyId_fkey" FOREIGN KEY ("companyId") REFERENCES "Company"("id") ON DELETE CASCADE ON UPDATE CASCADE;

-- AddForeignKey
ALTER TABLE "Session" ADD CONSTRAINT "Session_companyId_fkey" FOREIGN KEY ("companyId") REFERENCES "Company"("id") ON DELETE CASCADE ON UPDATE CASCADE;

-- AddForeignKey
ALTER TABLE "Session" ADD CONSTRAINT "Session_importId_fkey" FOREIGN KEY ("importId") REFERENCES "SessionImport"("id") ON DELETE SET NULL ON UPDATE CASCADE;

-- AddForeignKey
ALTER TABLE "SessionImport" ADD CONSTRAINT "SessionImport_companyId_fkey" FOREIGN KEY ("companyId") REFERENCES "Company"("id") ON DELETE CASCADE ON UPDATE CASCADE;

-- AddForeignKey
ALTER TABLE "Message" ADD CONSTRAINT "Message_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "Session"("id") ON DELETE CASCADE ON UPDATE CASCADE;

-- AddForeignKey
ALTER TABLE "SessionQuestion" ADD CONSTRAINT "SessionQuestion_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "Session"("id") ON DELETE CASCADE ON UPDATE CASCADE;

-- AddForeignKey
ALTER TABLE "SessionQuestion" ADD CONSTRAINT "SessionQuestion_questionId_fkey" FOREIGN KEY ("questionId") REFERENCES "Question"("id") ON DELETE RESTRICT ON UPDATE CASCADE;

-- AddForeignKey
ALTER TABLE "AIProcessingRequest" ADD CONSTRAINT "AIProcessingRequest_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "Session"("id") ON DELETE CASCADE ON UPDATE CASCADE;
@@ -1,3 +1,3 @@
 # Please do not edit this file manually
 # It should be added in your version-control system (e.g., Git)
-provider = "sqlite"
+provider = "postgresql"
@@ -3,8 +3,8 @@ generator client {
 }
 
 datasource db {
-  provider = "sqlite" // still OK for local/dev; use Postgres in prod
-  url      = "file:./dev.db"
+  provider = "postgresql"
+  url      = env("DATABASE_URL")
 }
 
 /**
@@ -13,6 +13,11 @@ global.console = {
 process.env.NEXTAUTH_SECRET = 'test-secret';
 process.env.NEXTAUTH_URL = 'http://localhost:3000';
 
+// Use test database for all database operations during tests
+if (process.env.DATABASE_URL_TEST) {
+  process.env.DATABASE_URL = process.env.DATABASE_URL_TEST;
+}
+
 // Mock node-fetch for transcript fetcher tests
 vi.mock('node-fetch', () => ({
   default: vi.fn(),
tests/unit/database.test.ts (new file, 77 lines)
@@ -0,0 +1,77 @@
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import { PrismaClient } from '@prisma/client';

describe('Database Configuration', () => {
  let prisma: PrismaClient;

  beforeAll(() => {
    prisma = new PrismaClient();
  });

  afterAll(async () => {
    await prisma.$disconnect();
  });

  it('should connect to the test database', async () => {
    // Verify we can connect to the database
    const result = await prisma.$queryRaw`SELECT 1 as test`;
    expect(result).toBeDefined();
  });

  it('should use PostgreSQL as the database provider', async () => {
    // Query the database to verify it's PostgreSQL
    const result = await prisma.$queryRaw<{ version: string }[]>`SELECT version()`;
    expect(result[0].version).toContain('PostgreSQL');
  });

  it('should be using the test database URL', () => {
    // Verify that DATABASE_URL is set to the test database
    expect(process.env.DATABASE_URL).toBeDefined();
    expect(process.env.DATABASE_URL).toContain('postgresql://');

    // If DATABASE_URL_TEST is set, DATABASE_URL should match it (from our test setup)
    if (process.env.DATABASE_URL_TEST) {
      expect(process.env.DATABASE_URL).toBe(process.env.DATABASE_URL_TEST);
    }
  });

  it('should have all required tables', async () => {
    // Verify all our tables exist
    const tables = await prisma.$queryRaw<{ table_name: string }[]>`
      SELECT table_name
      FROM information_schema.tables
      WHERE table_schema = 'public'
        AND table_type = 'BASE TABLE'
      ORDER BY table_name
    `;

    const tableNames = tables.map((t) => t.table_name);

    expect(tableNames).toContain('Company');
    expect(tableNames).toContain('User');
    expect(tableNames).toContain('Session');
    expect(tableNames).toContain('SessionImport');
    expect(tableNames).toContain('Message');
    expect(tableNames).toContain('Question');
    expect(tableNames).toContain('SessionQuestion');
    expect(tableNames).toContain('AIProcessingRequest');
  });

  it('should be able to create and query data', async () => {
    // Test basic CRUD operations
    const company = await prisma.company.create({
      data: {
        name: 'Test Company',
        csvUrl: 'https://example.com/test.csv',
      },
    });

    expect(company.id).toBeDefined();
    expect(company.name).toBe('Test Company');

    // Clean up
    await prisma.company.delete({
      where: { id: company.id },
    });
  });
});