feat: comprehensive security and architecture improvements

- Add Zod validation schemas with strong password requirements (12+ chars, complexity)
- Implement rate limiting for authentication endpoints (registration, password reset)
- Remove duplicate MetricCard component, consolidate to ui/metric-card.tsx
- Update README.md to use pnpm commands consistently
- Enhance authentication security with 12-round bcrypt hashing
- Add comprehensive input validation for all API endpoints
- Fix security vulnerabilities in user registration and password reset flows

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
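The `lib/validation` schemas referenced in the bullets above are not shown in this diff. As a rough sketch of what "12+ chars, complexity" could mean in practice (the function name and the exact character-class rules here are assumptions, not the actual `registerSchema`):

```typescript
// Hypothetical sketch of the password policy the commit message describes.
// The real rules live in lib/validation.ts (as Zod refinements) and may differ.
function isStrongPassword(password: string): boolean {
  return (
    password.length >= 12 && // minimum length per the commit message
    /[a-z]/.test(password) && // at least one lowercase letter
    /[A-Z]/.test(password) && // at least one uppercase letter
    /\d/.test(password) && // at least one digit
    /[^a-zA-Z0-9]/.test(password) // at least one symbol
  );
}
```

A passphrase such as `Correct-Horse-Battery-1` satisfies all five checks, while anything under 12 characters is rejected regardless of complexity.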
commit 7f48a085bf (parent 192f9497b4)
Date: 2025-06-28 01:52:53 +02:00
68 changed files with 8045 additions and 4542 deletions

View File

@@ -5,18 +5,21 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co

## Development Commands

**Core Development:**

- `pnpm dev` - Start development server (runs custom server.ts with schedulers)
- `pnpm dev:next-only` - Start Next.js only with Turbopack (no schedulers)
- `pnpm build` - Build production application
- `pnpm start` - Run production server

**Code Quality:**

- `pnpm lint` - Run ESLint
- `pnpm lint:fix` - Fix ESLint issues automatically
- `pnpm format` - Format code with Prettier
- `pnpm format:check` - Check formatting without fixing

**Database:**

- `pnpm prisma:generate` - Generate Prisma client
- `pnpm prisma:migrate` - Run database migrations
- `pnpm prisma:push` - Push schema changes to database

@@ -25,11 +28,13 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co

- `pnpm prisma:studio` - Open Prisma Studio database viewer

**Testing:**

- `pnpm test` - Run tests once
- `pnpm test:watch` - Run tests in watch mode
- `pnpm test:coverage` - Run tests with coverage report

**Markdown:**

- `pnpm lint:md` - Lint Markdown files
- `pnpm lint:md:fix` - Fix Markdown linting issues

@@ -38,6 +43,7 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co

**LiveDash-Node** is a real-time analytics dashboard for monitoring user sessions with AI-powered analysis and processing pipeline.

### Tech Stack

- **Frontend:** Next.js 15 + React 19 + TailwindCSS 4
- **Backend:** Next.js API Routes + Custom Node.js server
- **Database:** PostgreSQL with Prisma ORM

@@ -50,6 +56,7 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co

**1. Multi-Stage Processing Pipeline**

The system processes user sessions through distinct stages tracked in `SessionProcessingStatus`:

- `CSV_IMPORT` - Import raw CSV data into `SessionImport`
- `TRANSCRIPT_FETCH` - Fetch transcript content from URLs
- `SESSION_CREATION` - Create normalized `Session` and `Message` records

@@ -57,6 +64,7 @@ The system processes user sessions through distinct stages tracked in `SessionPr

- `QUESTION_EXTRACTION` - Extract questions from conversations

**2. Database Architecture**

- **Multi-tenant design** with `Company` as root entity
- **Dual storage pattern**: Raw CSV data in `SessionImport`, processed data in `Session`
- **1-to-1 relationship** between `SessionImport` and `Session` via `importId`

@@ -65,11 +73,13 @@ The system processes user sessions through distinct stages tracked in `SessionPr

- **Flexible AI model management** through `AIModel`, `AIModelPricing`, and `CompanyAIModel`

**3. Custom Server Architecture**

- `server.ts` - Custom Next.js server with configurable scheduler initialization
- Three main schedulers: CSV import, import processing, and session processing
- Environment-based configuration via `lib/env.ts`

**4. Key Processing Libraries**

- `lib/scheduler.ts` - CSV import scheduling
- `lib/importProcessor.ts` - Raw data to Session conversion
- `lib/processingScheduler.ts` - AI analysis pipeline

@@ -80,18 +90,21 @@ The system processes user sessions through distinct stages tracked in `SessionPr

**Environment Configuration:**

Environment variables are managed through `lib/env.ts` with .env.local file support:

- Database: PostgreSQL via `DATABASE_URL` and `DATABASE_URL_DIRECT`
- Authentication: `NEXTAUTH_SECRET`, `NEXTAUTH_URL`
- AI Processing: `OPENAI_API_KEY`
- Schedulers: `SCHEDULER_ENABLED`, various interval configurations

**Key Files to Understand:**

- `prisma/schema.prisma` - Complete database schema with enums and relationships
- `server.ts` - Custom server entry point
- `lib/env.ts` - Environment variable management and validation
- `app/` - Next.js App Router structure

**Testing:**

- Uses Vitest for unit testing
- Playwright for E2E testing
- Test files in `tests/` directory

@@ -99,16 +112,19 @@ Environment variables are managed through `lib/env.ts` with .env.local file supp

### Important Notes

**Scheduler System:**

- Schedulers are optional and controlled by `SCHEDULER_ENABLED` environment variable
- Use `pnpm dev:next-only` to run without schedulers for pure frontend development
- Three separate schedulers handle different pipeline stages

**Database Migrations:**

- Always run `pnpm prisma:generate` after schema changes
- Use `pnpm prisma:migrate` for production-ready migrations
- Use `pnpm prisma:push` for development schema changes

**AI Processing:**

- All AI requests are tracked for cost analysis
- Support for multiple AI models per company
- Time-based pricing management for accurate cost calculation

View File

@@ -2,11 +2,11 @@

A real-time analytics dashboard for monitoring user sessions and interactions with interactive data visualizations and detailed metrics.

-![Next.js](https://img.shields.io/badge/dynamic/regex?url=https%3A%2F%2Fraw.githubusercontent.com%2Fkjanat%2Flivedash-node%2Fmaster%2Fpackage.json&search=%22next%22%5Cs*%3A%5Cs*%22%5C%5E(%3F%3Cversion%3E%5Cd%2B%5C.%5Cd*).*%22&replace=%24%3Cversion%3E&logo=nextdotjs&label=Nextjs&color=%23000000)
-![React](https://img.shields.io/badge/dynamic/regex?url=https%3A%2F%2Fraw.githubusercontent.com%2Fkjanat%2Flivedash-node%2Fmaster%2Fpackage.json&search=%22react%22%5Cs*%3A%5Cs*%22%5C%5E(%3F%3Cversion%3E%5Cd%2B%5C.%5Cd*).*%22&replace=%24%3Cversion%3E&logo=react&label=React&color=%2361DAFB)
-![TypeScript](https://img.shields.io/badge/dynamic/regex?url=https%3A%2F%2Fraw.githubusercontent.com%2Fkjanat%2Flivedash-node%2Fmaster%2Fpackage.json&search=%22typescript%22%5Cs*%3A%5Cs*%22%5C%5E(%3F%3Cversion%3E%5Cd%2B%5C.%5Cd*).*%22&replace=%24%3Cversion%3E&logo=typescript&label=TypeScript&color=%233178C6)
-![Prisma](https://img.shields.io/badge/dynamic/regex?url=https%3A%2F%2Fraw.githubusercontent.com%2Fkjanat%2Flivedash-node%2Fmaster%2Fpackage.json&search=%22prisma%22%5Cs*%3A%5Cs*%22%5C%5E(%3F%3Cversion%3E%5Cd%2B%5C.%5Cd*).*%22&replace=%24%3Cversion%3E&logo=prisma&label=Prisma&color=%232D3748)
-![TailwindCSS](https://img.shields.io/badge/dynamic/regex?url=https%3A%2F%2Fraw.githubusercontent.com%2Fkjanat%2Flivedash-node%2Fmaster%2Fpackage.json&search=%22tailwindcss%22%5Cs*%3A%5Cs*%22%5C%5E(%3F%3Cversion%3E%5Cd%2B%5C.%5Cd*).*%22&replace=%24%3Cversion%3E&logo=tailwindcss&label=TailwindCSS&color=%2306B6D4)
+![Next.js](<https://img.shields.io/badge/dynamic/regex?url=https%3A%2F%2Fraw.githubusercontent.com%2Fkjanat%2Flivedash-node%2Fmaster%2Fpackage.json&search=%22next%22%5Cs*%3A%5Cs*%22%5C%5E(%3F%3Cversion%3E%5Cd%2B%5C.%5Cd*).*%22&replace=%24%3Cversion%3E&logo=nextdotjs&label=Nextjs&color=%23000000>)
+![React](<https://img.shields.io/badge/dynamic/regex?url=https%3A%2F%2Fraw.githubusercontent.com%2Fkjanat%2Flivedash-node%2Fmaster%2Fpackage.json&search=%22react%22%5Cs*%3A%5Cs*%22%5C%5E(%3F%3Cversion%3E%5Cd%2B%5C.%5Cd*).*%22&replace=%24%3Cversion%3E&logo=react&label=React&color=%2361DAFB>)
+![TypeScript](<https://img.shields.io/badge/dynamic/regex?url=https%3A%2F%2Fraw.githubusercontent.com%2Fkjanat%2Flivedash-node%2Fmaster%2Fpackage.json&search=%22typescript%22%5Cs*%3A%5Cs*%22%5C%5E(%3F%3Cversion%3E%5Cd%2B%5C.%5Cd*).*%22&replace=%24%3Cversion%3E&logo=typescript&label=TypeScript&color=%233178C6>)
+![Prisma](<https://img.shields.io/badge/dynamic/regex?url=https%3A%2F%2Fraw.githubusercontent.com%2Fkjanat%2Flivedash-node%2Fmaster%2Fpackage.json&search=%22prisma%22%5Cs*%3A%5Cs*%22%5C%5E(%3F%3Cversion%3E%5Cd%2B%5C.%5Cd*).*%22&replace=%24%3Cversion%3E&logo=prisma&label=Prisma&color=%232D3748>)
+![TailwindCSS](<https://img.shields.io/badge/dynamic/regex?url=https%3A%2F%2Fraw.githubusercontent.com%2Fkjanat%2Flivedash-node%2Fmaster%2Fpackage.json&search=%22tailwindcss%22%5Cs*%3A%5Cs*%22%5C%5E(%3F%3Cversion%3E%5Cd%2B%5C.%5Cd*).*%22&replace=%24%3Cversion%3E&logo=tailwindcss&label=TailwindCSS&color=%2306B6D4>)

## Features

@@ -31,36 +31,36 @@ A real-time analytics dashboard for monitoring user sessions and interactions wi

### Prerequisites

- Node.js (LTS version recommended)
-- npm or yarn
+- pnpm (recommended package manager)

### Installation

1. Clone this repository:

```bash
git clone https://github.com/kjanat/livedash-node.git
cd livedash-node
```

2. Install dependencies:

```bash
-npm install
+pnpm install
```

3. Set up the database:

```bash
-npm run prisma:generate
-npm run prisma:migrate
-npm run prisma:seed
+pnpm run prisma:generate
+pnpm run prisma:migrate
+pnpm run prisma:seed
```

4. Start the development server:

```bash
-npm run dev
+pnpm run dev
```

5. Open your browser and navigate to <http://localhost:3000>

@@ -86,12 +86,12 @@ NEXTAUTH_SECRET=your-secret-here

## Available Scripts

-- `npm run dev`: Start the development server
-- `npm run build`: Build the application for production
-- `npm run start`: Run the production build
-- `npm run lint`: Run ESLint
-- `npm run format`: Format code with Prettier
-- `npm run prisma:studio`: Open Prisma Studio to view database
+- `pnpm run dev`: Start the development server
+- `pnpm run build`: Build the application for production
+- `pnpm run start`: Run the production build
+- `pnpm run lint`: Run ESLint
+- `pnpm run format`: Format code with Prettier
+- `pnpm run prisma:studio`: Open Prisma Studio to view database

## Contributing

View File

@@ -37,12 +37,11 @@ export async function POST(request: NextRequest) {
    );
  }

  const company = await prisma.company.findUnique({
    where: { id: companyId },
  });

  if (!company) {
    return NextResponse.json({ error: "Company not found" }, { status: 404 });
  }

  const rawSessionData = await fetchAndParseCsv(
@@ -114,12 +113,12 @@ export async function POST(request: NextRequest) {
  }

  // Immediately process the queued imports to create Session records
  console.log("[Refresh API] Processing queued imports...");
  await processQueuedImports(100); // Process up to 100 imports immediately

  // Count how many sessions were created
  const sessionCount = await prisma.session.count({
    where: { companyId: company.id },
  });

  return NextResponse.json({
@@ -127,7 +126,7 @@ export async function POST(request: NextRequest) {
    imported: importedCount,
    total: rawSessionData.length,
    sessions: sessionCount,
    message: `Successfully imported ${importedCount} records and processed them into sessions. Total sessions: ${sessionCount}`,
  });
} catch (e) {
  const error = e instanceof Error ? e.message : "An unknown error occurred";

View File

@@ -45,18 +45,21 @@ export async function POST(request: NextRequest) {
  const { batchSize, maxConcurrency } = body;

  // Validate parameters
  const validatedBatchSize =
    batchSize && batchSize > 0 ? parseInt(batchSize) : null;
  const validatedMaxConcurrency =
    maxConcurrency && maxConcurrency > 0 ? parseInt(maxConcurrency) : 5;

  // Check how many sessions need AI processing using the new status system
  const sessionsNeedingAI =
    await ProcessingStatusManager.getSessionsNeedingProcessing(
      ProcessingStage.AI_ANALYSIS,
      1000 // Get count only
    );

  // Filter to sessions for this company
  const companySessionsNeedingAI = sessionsNeedingAI.filter(
    (statusRecord) => statusRecord.session.companyId === user.companyId
  );

  const unprocessedCount = companySessionsNeedingAI.length;
@@ -77,10 +80,15 @@ export async function POST(request: NextRequest) {
  // The processing will continue in the background
  processUnprocessedSessions(validatedBatchSize, validatedMaxConcurrency)
    .then(() => {
      console.log(
        `[Manual Trigger] Processing completed for company ${user.companyId}`
      );
    })
    .catch((error) => {
      console.error(
        `[Manual Trigger] Processing failed for company ${user.companyId}:`,
        error
      );
    });

  return NextResponse.json({
@@ -91,7 +99,6 @@ export async function POST(request: NextRequest) {
    maxConcurrency: validatedMaxConcurrency,
    startedAt: new Date().toISOString(),
  });
} catch (error) {
  console.error("[Manual Trigger] Error:", error);
  return NextResponse.json(

View File

@@ -42,7 +42,7 @@ export async function GET(request: NextRequest) {
  if (startDate && endDate) {
    whereClause.startTime = {
      gte: new Date(startDate),
      lte: new Date(endDate + "T23:59:59.999Z"), // Include full end date
    };
  }
@@ -94,10 +94,12 @@ export async function GET(request: NextRequest) {
  // Calculate date range from sessions
  let dateRange: { minDate: string; maxDate: string } | null = null;
  if (prismaSessions.length > 0) {
    const dates = prismaSessions
      .map((s) => new Date(s.startTime))
      .sort((a, b) => a.getTime() - b.getTime());
    dateRange = {
      minDate: dates[0].toISOString().split("T")[0], // First session date
      maxDate: dates[dates.length - 1].toISOString().split("T")[0], // Last session date
    };
  }
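The date-range calculation in this hunk can be isolated into a small pure function. A sketch (the name `dateRangeOf` is illustrative, not an export of the route):

```typescript
// Reduce a list of ISO start times to the YYYY-MM-DD dates of the earliest
// and latest sessions, mirroring the dateRange logic in the metrics route.
function dateRangeOf(
  startTimes: string[]
): { minDate: string; maxDate: string } | null {
  if (startTimes.length === 0) return null;
  const dates = startTimes
    .map((s) => new Date(s))
    .sort((a, b) => a.getTime() - b.getTime());
  return {
    minDate: dates[0].toISOString().split("T")[0],
    maxDate: dates[dates.length - 1].toISOString().split("T")[0],
  };
}
```

Sorting by epoch milliseconds first means the extremes are simply the first and last elements, regardless of input order.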

View File

@@ -55,7 +55,7 @@ export async function GET(request: NextRequest) {
    return NextResponse.json({
      categories: distinctCategories,
      languages: distinctLanguages,
    });
  } catch (error) {
    const errorMessage =

View File

@@ -26,10 +26,7 @@ export async function GET(
  });

  if (!prismaSession) {
    return NextResponse.json({ error: "Session not found" }, { status: 404 });
  }

  // Map Prisma session object to ChatSession type

View File

@@ -87,9 +87,7 @@ export async function GET(request: NextRequest) {
    | Prisma.SessionOrderByWithRelationInput[];

  const primarySortField =
    sortKey && validSortKeys[sortKey] ? validSortKeys[sortKey] : "startTime"; // Default to startTime field if sortKey is invalid/missing

  const primarySortOrder =
    sortOrder === "asc" || sortOrder === "desc" ? sortOrder : "desc"; // Default to desc order

View File

@@ -1,28 +1,92 @@
 import { NextRequest, NextResponse } from "next/server";
 import { prisma } from "../../../lib/prisma";
 import { sendEmail } from "../../../lib/sendEmail";
+import { forgotPasswordSchema, validateInput } from "../../../lib/validation";
 import crypto from "crypto";

+// In-memory rate limiting for password reset requests
+const resetAttempts = new Map<string, { count: number; resetTime: number }>();
+
+function checkRateLimit(ip: string): boolean {
+  const now = Date.now();
+  const attempts = resetAttempts.get(ip);
+
+  if (!attempts || now > attempts.resetTime) {
+    resetAttempts.set(ip, { count: 1, resetTime: now + 15 * 60 * 1000 }); // 15 minute window
+    return true;
+  }
+
+  if (attempts.count >= 5) {
+    // Max 5 reset requests per 15 minutes per IP
+    return false;
+  }
+
+  attempts.count++;
+  return true;
+}
+
 export async function POST(request: NextRequest) {
-  const body = await request.json();
-  const { email } = body as { email: string };
-  const user = await prisma.user.findUnique({ where: { email } });
-  if (!user) {
-    // Always return 200 for privacy (don't reveal if email exists)
-    return NextResponse.json({ success: true }, { status: 200 });
-  }
-  const token = crypto.randomBytes(32).toString("hex");
-  const expiry = new Date(Date.now() + 1000 * 60 * 30); // 30 min expiry
-  await prisma.user.update({
-    where: { email },
-    data: { resetToken: token, resetTokenExpiry: expiry },
-  });
-  const resetUrl = `${process.env.NEXTAUTH_URL || "http://localhost:3000"}/reset-password?token=${token}`;
-  await sendEmail(email, "Password Reset", `Reset your password: ${resetUrl}`);
-  return NextResponse.json({ success: true }, { status: 200 });
+  try {
+    // Rate limiting check
+    const ip =
+      request.ip || request.headers.get("x-forwarded-for") || "unknown";
+    if (!checkRateLimit(ip)) {
+      return NextResponse.json(
+        {
+          success: false,
+          error: "Too many password reset attempts. Please try again later.",
+        },
+        { status: 429 }
+      );
+    }
+
+    const body = await request.json();
+
+    // Validate input
+    const validation = validateInput(forgotPasswordSchema, body);
+    if (!validation.success) {
+      return NextResponse.json(
+        {
+          success: false,
+          error: "Invalid email format",
+        },
+        { status: 400 }
+      );
+    }
+
+    const { email } = validation.data;
+    const user = await prisma.user.findUnique({ where: { email } });
+
+    // Always return success for privacy (don't reveal if email exists)
+    // But only send email if user exists
+    if (user) {
+      const token = crypto.randomBytes(32).toString("hex");
+      const tokenHash = crypto.createHash("sha256").update(token).digest("hex");
+      const expiry = new Date(Date.now() + 1000 * 60 * 30); // 30 min expiry
+      await prisma.user.update({
+        where: { email },
+        data: { resetToken: tokenHash, resetTokenExpiry: expiry },
+      });
+
+      const resetUrl = `${process.env.NEXTAUTH_URL || "http://localhost:3000"}/reset-password?token=${token}`;
+      await sendEmail(
+        email,
+        "Password Reset",
+        `Reset your password: ${resetUrl}`
+      );
+    }
+
+    return NextResponse.json({ success: true }, { status: 200 });
+  } catch (error) {
+    console.error("Forgot password error:", error);
+    return NextResponse.json(
+      {
+        success: false,
+        error: "Internal server error",
+      },
+      { status: 500 }
+    );
+  }
 }
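The fixed-window limiter added in this file is easy to exercise in isolation. A standalone sketch of the same scheme (5 attempts per IP per 15-minute window); unlike the route's version, which calls `Date.now()` internally, this one takes `now` as a parameter so the window can be tested deterministically:

```typescript
// Fixed-window, per-IP rate limiting kept in process memory
// (as the register route's comment notes, production would use Redis).
const WINDOW_MS = 15 * 60 * 1000;
const MAX_ATTEMPTS = 5;
const attemptsByIp = new Map<string, { count: number; resetTime: number }>();

function allow(ip: string, now: number = Date.now()): boolean {
  const entry = attemptsByIp.get(ip);

  // First request from this IP, or the previous window has expired:
  // start a fresh window with one attempt counted.
  if (!entry || now > entry.resetTime) {
    attemptsByIp.set(ip, { count: 1, resetTime: now + WINDOW_MS });
    return true;
  }

  // Window still open and budget exhausted: reject.
  if (entry.count >= MAX_ATTEMPTS) return false;

  entry.count++;
  return true;
}
```

Note that a Map keyed by IP grows without bound until windows expire, which is acceptable for a sketch but one reason the code comments point at Redis for production.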

View File

@@ -1,34 +1,70 @@
 import { NextRequest, NextResponse } from "next/server";
 import { prisma } from "../../../lib/prisma";
+import { registerSchema, validateInput } from "../../../lib/validation";
 import bcrypt from "bcryptjs";

-interface RegisterRequestBody {
-  email: string;
-  password: string;
-  company: string;
-  csvUrl?: string;
-}
+// In-memory rate limiting (for production, use Redis or similar)
+const registrationAttempts = new Map<
+  string,
+  { count: number; resetTime: number }
+>();
+
+function checkRateLimit(ip: string): boolean {
+  const now = Date.now();
+  const attempts = registrationAttempts.get(ip);
+
+  if (!attempts || now > attempts.resetTime) {
+    registrationAttempts.set(ip, { count: 1, resetTime: now + 60 * 60 * 1000 }); // 1 hour window
+    return true;
+  }
+
+  if (attempts.count >= 3) {
+    // Max 3 registrations per hour per IP
+    return false;
+  }
+
+  attempts.count++;
+  return true;
+}

 export async function POST(request: NextRequest) {
-  const body = await request.json();
-  const { email, password, company, csvUrl } = body as RegisterRequestBody;
-
-  if (!email || !password || !company) {
+  try {
+    // Rate limiting check
+    const ip =
+      request.ip || request.headers.get("x-forwarded-for") || "unknown";
+    if (!checkRateLimit(ip)) {
+      return NextResponse.json(
+        {
+          success: false,
+          error: "Too many registration attempts. Please try again later.",
+        },
+        { status: 429 }
+      );
+    }
+
+    const body = await request.json();
+
+    // Validate input with Zod schema
+    const validation = validateInput(registerSchema, body);
+    if (!validation.success) {
       return NextResponse.json(
         {
           success: false,
-          error: "Missing required fields",
+          error: "Validation failed",
+          details: validation.errors,
         },
         { status: 400 }
       );
     }

+    const { email, password, company } = validation.data;
+
     // Check if email exists
-    const exists = await prisma.user.findUnique({
+    const existingUser = await prisma.user.findUnique({
       where: { email },
     });
-    if (exists) {
+    if (existingUser) {
       return NextResponse.json(
         {
           success: false,
@@ -38,26 +74,63 @@ export async function POST(request: NextRequest) {
       );
     }

-    const newCompany = await prisma.company.create({
-      data: { name: company, csvUrl: csvUrl || "" },
-    });
-    const hashed = await bcrypt.hash(password, 10);
+    // Check if company name already exists
+    const existingCompany = await prisma.company.findFirst({
+      where: { name: company },
+    });
+    if (existingCompany) {
+      return NextResponse.json(
+        {
+          success: false,
+          error: "Company name already exists",
+        },
+        { status: 409 }
+      );
+    }

-    await prisma.user.create({
-      data: {
-        email,
-        password: hashed,
-        companyId: newCompany.id,
-        role: "ADMIN",
-      },
-    });
+    // Create company and user in a transaction
+    const result = await prisma.$transaction(async (tx) => {
+      const newCompany = await tx.company.create({
+        data: {
+          name: company,
+          csvUrl: "", // Empty by default, can be set later in settings
+        },
+      });
+
+      const hashedPassword = await bcrypt.hash(password, 12); // Increased rounds for better security
+
+      const newUser = await tx.user.create({
+        data: {
+          email,
+          password: hashedPassword,
+          companyId: newCompany.id,
+          role: "USER", // Changed from ADMIN - users should be promoted by existing admins
+        },
+      });
+
+      return { company: newCompany, user: newUser };
+    });
+
     return NextResponse.json(
       {
         success: true,
-        data: { success: true },
+        data: {
+          message: "Registration successful",
+          userId: result.user.id,
+          companyId: result.company.id,
+        },
       },
       { status: 201 }
     );
+  } catch (error) {
+    console.error("Registration error:", error);
+    return NextResponse.json(
+      {
+        success: false,
+        error: "Internal server error",
+      },
+      { status: 500 }
+    );
+  }
 }

View File

@@ -1,29 +1,34 @@
 import { NextRequest, NextResponse } from "next/server";
 import { prisma } from "../../../lib/prisma";
+import { resetPasswordSchema, validateInput } from "../../../lib/validation";
 import bcrypt from "bcryptjs";
+import crypto from "crypto";

 export async function POST(request: NextRequest) {
-  const body = await request.json();
-  const { token, password } = body as { token?: string; password?: string };
-
-  if (!token || !password) {
-    return NextResponse.json(
-      { error: "Token and password are required." },
-      { status: 400 }
-    );
-  }
-
-  if (password.length < 8) {
-    return NextResponse.json(
-      { error: "Password must be at least 8 characters long." },
-      { status: 400 }
-    );
-  }
-
   try {
+    const body = await request.json();
+
+    // Validate input with strong password requirements
+    const validation = validateInput(resetPasswordSchema, body);
+    if (!validation.success) {
+      return NextResponse.json(
+        {
+          success: false,
+          error: "Validation failed",
+          details: validation.errors,
+        },
+        { status: 400 }
+      );
+    }
+
+    const { token, password } = validation.data;
+
+    // Hash the token to compare with stored hash
+    const tokenHash = crypto.createHash("sha256").update(token).digest("hex");
+
     const user = await prisma.user.findFirst({
       where: {
-        resetToken: token,
+        resetToken: tokenHash,
         resetTokenExpiry: { gte: new Date() },
       },
     });
@@ -31,30 +36,38 @@ export async function POST(request: NextRequest) {
     if (!user) {
       return NextResponse.json(
         {
-          error: "Invalid or expired token. Please request a new password reset.",
+          success: false,
+          error:
+            "Invalid or expired token. Please request a new password reset.",
         },
         { status: 400 }
       );
     }

-    const hash = await bcrypt.hash(password, 10);
+    // Hash password with higher rounds for better security
+    const hashedPassword = await bcrypt.hash(password, 12);
     await prisma.user.update({
       where: { id: user.id },
       data: {
-        password: hash,
+        password: hashedPassword,
         resetToken: null,
         resetTokenExpiry: null,
       },
     });

     return NextResponse.json(
-      { message: "Password has been reset successfully." },
+      {
+        success: true,
+        message: "Password has been reset successfully.",
+      },
       { status: 200 }
     );
   } catch (error) {
     console.error("Reset password error:", error);
     return NextResponse.json(
       {
+        success: false,
         error: "An internal server error occurred. Please try again later.",
       },
       { status: 500 }
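The two password-reset routes now share a token scheme: the raw token goes in the e-mail link, while only its SHA-256 hash is stored in `resetToken`, so a database leak does not yield usable reset links. A minimal sketch of that round trip (the helper names `issueToken` and `matches` are illustrative, not from the codebase):

```typescript
import crypto from "crypto";

// Issue a reset token: the raw value is e-mailed to the user,
// the SHA-256 hash is what gets persisted on the User record.
function issueToken(): { raw: string; stored: string } {
  const raw = crypto.randomBytes(32).toString("hex"); // 32 bytes -> 64 hex chars
  const stored = crypto.createHash("sha256").update(raw).digest("hex");
  return { raw, stored };
}

// Verify a presented token by re-hashing it and comparing to the stored hash,
// mirroring the lookup in the reset-password route.
function matches(presented: string, stored: string): boolean {
  const hash = crypto.createHash("sha256").update(presented).digest("hex");
  return hash === stored;
}
```

The expiry check (`resetTokenExpiry: { gte: new Date() }`) and the null-out after a successful reset are what make each link single-use and time-limited.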

View File

@@ -48,7 +48,10 @@ function DashboardContent() {
   const [company, setCompany] = useState<Company | null>(null);
   const [loading, setLoading] = useState<boolean>(false);
   const [refreshing, setRefreshing] = useState<boolean>(false);
-  const [dateRange, setDateRange] = useState<{ minDate: string; maxDate: string } | null>(null);
+  const [dateRange, setDateRange] = useState<{
+    minDate: string;
+    maxDate: string;
+  } | null>(null);
   const [selectedStartDate, setSelectedStartDate] = useState<string>("");
   const [selectedEndDate, setSelectedEndDate] = useState<string>("");
   const [isInitialLoad, setIsInitialLoad] = useState<boolean>(true);
@@ -56,7 +59,11 @@ function DashboardContent() {
   const isAuditor = session?.user?.role === "AUDITOR";
   // Function to fetch metrics with optional date range
-  const fetchMetrics = async (startDate?: string, endDate?: string, isInitial = false) => {
+  const fetchMetrics = async (
+    startDate?: string,
+    endDate?: string,
+    isInitial = false
+  ) => {
     setLoading(true);
     try {
       let url = "/api/dashboard/metrics";
@@ -85,11 +92,14 @@ function DashboardContent() {
   };
   // Handle date range changes
-  const handleDateRangeChange = useCallback((startDate: string, endDate: string) => {
-    setSelectedStartDate(startDate);
-    setSelectedEndDate(endDate);
-    fetchMetrics(startDate, endDate);
-  }, []);
+  const handleDateRangeChange = useCallback(
+    (startDate: string, endDate: string) => {
+      setSelectedStartDate(startDate);
+      setSelectedEndDate(endDate);
+      fetchMetrics(startDate, endDate);
+    },
+    []
+  );
   useEffect(() => {
     // Redirect if not authenticated
@@ -216,9 +226,21 @@ function DashboardContent() {
    };
    return [
-      { name: "Positive", value: sentimentData.positive, color: "hsl(var(--chart-1))" },
-      { name: "Neutral", value: sentimentData.neutral, color: "hsl(var(--chart-2))" },
-      { name: "Negative", value: sentimentData.negative, color: "hsl(var(--chart-3))" },
+      {
+        name: "Positive",
+        value: sentimentData.positive,
+        color: "hsl(var(--chart-1))",
+      },
+      {
+        name: "Neutral",
+        value: sentimentData.neutral,
+        color: "hsl(var(--chart-2))",
+      },
+      {
+        name: "Negative",
+        value: sentimentData.negative,
+        color: "hsl(var(--chart-3))",
+      },
    ];
  };
@@ -226,7 +248,10 @@ function DashboardContent() {
    if (!metrics?.days) return [];
    return Object.entries(metrics.days).map(([date, value]) => ({
-      date: new Date(date).toLocaleDateString('en-US', { month: 'short', day: 'numeric' }),
+      date: new Date(date).toLocaleDateString("en-US", {
+        month: "short",
+        day: "numeric",
+      }),
      value: value as number,
    }));
  };
@@ -235,7 +260,7 @@ function DashboardContent() {
    if (!metrics?.categories) return [];
    return Object.entries(metrics.categories).map(([name, value]) => ({
-      name: name.length > 15 ? name.substring(0, 15) + '...' : name,
+      name: name.length > 15 ? name.substring(0, 15) + "..." : name,
      value: value as number,
    }));
  };
@@ -287,7 +312,9 @@ function DashboardContent() {
      <div className="flex flex-col sm:flex-row justify-between items-start sm:items-center gap-4">
        <div className="space-y-2">
          <div className="flex items-center gap-3">
-            <h1 className="text-3xl font-bold tracking-tight">{company.name}</h1>
+            <h1 className="text-3xl font-bold tracking-tight">
+              {company.name}
+            </h1>
            <Badge variant="secondary" className="text-xs">
              Analytics Dashboard
            </Badge>
@@ -307,7 +334,9 @@ function DashboardContent() {
            size="sm"
            className="gap-2"
          >
-            <RefreshCw className={`h-4 w-4 ${refreshing ? 'animate-spin' : ''}`} />
+            <RefreshCw
+              className={`h-4 w-4 ${refreshing ? "animate-spin" : ""}`}
+            />
            {refreshing ? "Refreshing..." : "Refresh"}
          </Button>
@@ -318,7 +347,9 @@ function DashboardContent() {
          </Button>
        </DropdownMenuTrigger>
        <DropdownMenuContent align="end">
-          <DropdownMenuItem onClick={() => signOut({ callbackUrl: "/login" })}>
+          <DropdownMenuItem
+            onClick={() => signOut({ callbackUrl: "/login" })}
+          >
            <LogOut className="h-4 w-4 mr-2" />
            Sign out
          </DropdownMenuItem>
@@ -387,27 +418,32 @@ function DashboardContent() {
        <MetricCard
          title="Daily Costs"
-          value={`${metrics.avgDailyCosts?.toFixed(4) || '0.0000'}`}
+          value={`${metrics.avgDailyCosts?.toFixed(4) || "0.0000"}`}
          icon={<Euro className="h-5 w-5" />}
          description="Average per day"
        />
        <MetricCard
          title="Peak Usage"
-          value={metrics.peakUsageTime || 'N/A'}
+          value={metrics.peakUsageTime || "N/A"}
          icon={<TrendingUp className="h-5 w-5" />}
          description="Busiest hour"
        />
        <MetricCard
          title="Resolution Rate"
-          value={`${metrics.resolvedChatsPercentage?.toFixed(1) || '0.0'}%`}
+          value={`${metrics.resolvedChatsPercentage?.toFixed(1) || "0.0"}%`}
          icon={<CheckCircle className="h-5 w-5" />}
          trend={{
            value: metrics.resolvedChatsPercentage ?? 0,
            isPositive: (metrics.resolvedChatsPercentage ?? 0) >= 80,
          }}
-          variant={metrics.resolvedChatsPercentage && metrics.resolvedChatsPercentage >= 80 ? "success" : "warning"}
+          variant={
+            metrics.resolvedChatsPercentage &&
+            metrics.resolvedChatsPercentage >= 80
+              ? "success"
+              : "warning"
+          }
        />
        <MetricCard
@@ -508,8 +544,8 @@ function DashboardContent() {
          {metrics.totalTokens?.toLocaleString() || 0}
        </Badge>
        <Badge variant="outline" className="gap-1">
-          <span className="font-semibold">Total Cost:</span>
-          {metrics.totalTokensEur?.toFixed(4) || 0}
+          <span className="font-semibold">Total Cost:</span>
+          {metrics.totalTokensEur?.toFixed(4) || 0}
        </Badge>
      </div>
    </div>

View File

@@ -46,7 +46,8 @@ const DashboardPage: FC = () => {
   const navigationCards = [
     {
       title: "Analytics Overview",
-      description: "View comprehensive metrics, charts, and insights from your chat sessions",
+      description:
+        "View comprehensive metrics, charts, and insights from your chat sessions",
       icon: <BarChart3 className="h-6 w-6" />,
       href: "/dashboard/overview",
       variant: "primary" as const,
@@ -54,7 +55,8 @@ const DashboardPage: FC = () => {
     },
     {
       title: "Session Browser",
-      description: "Browse, search, and analyze individual conversation sessions",
+      description:
+        "Browse, search, and analyze individual conversation sessions",
       icon: <MessageSquare className="h-6 w-6" />,
       href: "/dashboard/sessions",
       variant: "success" as const,
@@ -64,16 +66,22 @@ const DashboardPage: FC = () => {
       ? [
           {
             title: "Company Settings",
-            description: "Configure company settings, integrations, and API connections",
+            description:
+              "Configure company settings, integrations, and API connections",
             icon: <Settings className="h-6 w-6" />,
             href: "/dashboard/company",
             variant: "warning" as const,
-            features: ["API configuration", "Integration settings", "Data management"],
+            features: [
+              "API configuration",
+              "Integration settings",
+              "Data management",
+            ],
            adminOnly: true,
          },
          {
            title: "User Management",
-            description: "Invite team members and manage user accounts and permissions",
+            description:
+              "Invite team members and manage user accounts and permissions",
            icon: <Users className="h-6 w-6" />,
            href: "/dashboard/users",
            variant: "default" as const,
@@ -186,7 +194,10 @@ const DashboardPage: FC = () => {
            {/* Features List */}
            <div className="space-y-2">
              {card.features.map((feature, featureIndex) => (
-                <div key={featureIndex} className="flex items-center gap-2 text-sm">
+                <div
+                  key={featureIndex}
+                  className="flex items-center gap-2 text-sm"
+                >
                  <div className="h-1.5 w-1.5 rounded-full bg-current opacity-60" />
                  <span className="text-muted-foreground">{feature}</span>
                </div>

View File

@@ -1,11 +1,11 @@
-import { PrismaClient } from '@prisma/client';
-import { ProcessingStatusManager } from './lib/processingStatusManager';
+import { PrismaClient } from "@prisma/client";
+import { ProcessingStatusManager } from "./lib/processingStatusManager";
 const prisma = new PrismaClient();
 async function checkRefactoredPipelineStatus() {
   try {
-    console.log('=== REFACTORED PIPELINE STATUS ===\n');
+    console.log("=== REFACTORED PIPELINE STATUS ===\n");
     // Get pipeline status using the new system
     const pipelineStatus = await ProcessingStatusManager.getPipelineStatus();
@@ -13,7 +13,13 @@ async function checkRefactoredPipelineStatus() {
     console.log(`Total Sessions: ${pipelineStatus.totalSessions}\n`);
     // Display status for each stage
-    const stages = ['CSV_IMPORT', 'TRANSCRIPT_FETCH', 'SESSION_CREATION', 'AI_ANALYSIS', 'QUESTION_EXTRACTION'];
+    const stages = [
+      "CSV_IMPORT",
+      "TRANSCRIPT_FETCH",
+      "SESSION_CREATION",
+      "AI_ANALYSIS",
+      "QUESTION_EXTRACTION",
+    ];
     for (const stage of stages) {
       console.log(`${stage}:`);
@@ -30,11 +36,11 @@ async function checkRefactoredPipelineStatus() {
       console.log(`  COMPLETED: ${completed}`);
       console.log(`  FAILED: ${failed}`);
       console.log(`  SKIPPED: ${skipped}`);
-      console.log('');
+      console.log("");
     }
     // Show what needs processing
-    console.log('=== WHAT NEEDS PROCESSING ===');
+    console.log("=== WHAT NEEDS PROCESSING ===");
     for (const stage of stages) {
       const stageData = pipelineStatus.pipeline[stage] || {};
@@ -49,27 +55,36 @@ async function checkRefactoredPipelineStatus() {
     // Show failed sessions if any
     const failedSessions = await ProcessingStatusManager.getFailedSessions();
     if (failedSessions.length > 0) {
-      console.log('\n=== FAILED SESSIONS ===');
-      failedSessions.slice(0, 5).forEach(failure => {
-        console.log(`  ${failure.session.import?.externalSessionId || failure.sessionId}: ${failure.stage} - ${failure.errorMessage}`);
+      console.log("\n=== FAILED SESSIONS ===");
+      failedSessions.slice(0, 5).forEach((failure) => {
+        console.log(
+          `  ${failure.session.import?.externalSessionId || failure.sessionId}: ${failure.stage} - ${failure.errorMessage}`
+        );
       });
       if (failedSessions.length > 5) {
-        console.log(`  ... and ${failedSessions.length - 5} more failed sessions`);
+        console.log(
+          `  ... and ${failedSessions.length - 5} more failed sessions`
+        );
       }
     }
     // Show sessions ready for AI processing
-    const readyForAI = await ProcessingStatusManager.getSessionsNeedingProcessing('AI_ANALYSIS', 5);
+    const readyForAI =
+      await ProcessingStatusManager.getSessionsNeedingProcessing(
+        "AI_ANALYSIS",
+        5
+      );
     if (readyForAI.length > 0) {
-      console.log('\n=== SESSIONS READY FOR AI PROCESSING ===');
-      readyForAI.forEach(status => {
-        console.log(`  ${status.session.import?.externalSessionId || status.sessionId} (created: ${status.session.createdAt})`);
+      console.log("\n=== SESSIONS READY FOR AI PROCESSING ===");
+      readyForAI.forEach((status) => {
+        console.log(
+          `  ${status.session.import?.externalSessionId || status.sessionId} (created: ${status.session.createdAt})`
+        );
      });
    }
  } catch (error) {
-    console.error('Error checking pipeline status:', error);
+    console.error("Error checking pipeline status:", error);
  } finally {
    await prisma.$disconnect();
  }

View File

@@ -63,10 +63,11 @@ export default function DateRangePicker({
   const setLast30Days = () => {
     const thirtyDaysAgo = new Date();
     thirtyDaysAgo.setDate(thirtyDaysAgo.getDate() - 30);
-    const thirtyDaysAgoStr = thirtyDaysAgo.toISOString().split('T')[0];
+    const thirtyDaysAgoStr = thirtyDaysAgo.toISOString().split("T")[0];
     // Use the later of 30 days ago or minDate
-    const newStartDate = thirtyDaysAgoStr > minDate ? thirtyDaysAgoStr : minDate;
+    const newStartDate =
+      thirtyDaysAgoStr > minDate ? thirtyDaysAgoStr : minDate;
     setStartDate(newStartDate);
     setEndDate(maxDate);
   };
@@ -74,7 +75,7 @@ export default function DateRangePicker({
   const setLast7Days = () => {
     const sevenDaysAgo = new Date();
     sevenDaysAgo.setDate(sevenDaysAgo.getDate() - 7);
-    const sevenDaysAgoStr = sevenDaysAgo.toISOString().split('T')[0];
+    const sevenDaysAgoStr = sevenDaysAgo.toISOString().split("T")[0];
     // Use the later of 7 days ago or minDate
     const newStartDate = sevenDaysAgoStr > minDate ? sevenDaysAgoStr : minDate;
@@ -146,7 +147,8 @@ export default function DateRangePicker({
       </div>
       <div className="mt-2 text-xs text-gray-500">
-        Available data: {new Date(minDate).toLocaleDateString()} - {new Date(maxDate).toLocaleDateString()}
+        Available data: {new Date(minDate).toLocaleDateString()} -{" "}
+        {new Date(maxDate).toLocaleDateString()}
       </div>
     </div>
   );
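The quick-range buttons above compute "last N days, but never earlier than the dataset's `minDate`", relying on the fact that ISO `yyyy-mm-dd` strings compare correctly as plain strings. A deterministic sketch of that clamp logic (the helper name is illustrative; it uses UTC where the component uses local time via `setDate()`):

```typescript
// Pick the date N days before `now`, clamped so it never precedes minDate.
// ISO date strings sort lexicographically in chronological order, so a
// plain string comparison is sufficient.
export function clampedStartDate(
  daysAgo: number,
  minDate: string,
  now: Date = new Date()
): string {
  const d = new Date(now);
  d.setUTCDate(d.getUTCDate() - daysAgo); // UTC keeps the sketch deterministic
  const iso = d.toISOString().split("T")[0];
  return iso > minDate ? iso : minDate;
}
```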

View File

@@ -48,7 +48,7 @@ const getCountryCoordinates = (): Record<string, [number, number]> => {
     BG: [42.7339, 25.4858],
     HR: [45.1, 15.2],
     SK: [48.669, 19.699],
-    SI: [46.1512, 14.9955]
+    SI: [46.1512, 14.9955],
   };
   // This function now primarily returns fallbacks.
   // The actual fetching using @rapideditor/country-coder will be in the component's useEffect.

View File

@@ -49,7 +49,9 @@ export default function MessageViewer({ messages }: MessageViewerProps) {
               {message.role}
             </span>
             <span className="text-xs opacity-75 ml-2">
-              {message.timestamp ? new Date(message.timestamp).toLocaleTimeString() : 'No timestamp'}
+              {message.timestamp
+                ? new Date(message.timestamp).toLocaleTimeString()
+                : "No timestamp"}
             </span>
           </div>
           <div className="text-sm whitespace-pre-wrap">
@@ -63,13 +65,18 @@ export default function MessageViewer({ messages }: MessageViewerProps) {
       <div className="mt-4 pt-3 border-t text-sm text-gray-500">
         <div className="flex justify-between">
           <span>
-            First message: {messages[0].timestamp ? new Date(messages[0].timestamp).toLocaleString() : 'No timestamp'}
+            First message:{" "}
+            {messages[0].timestamp
+              ? new Date(messages[0].timestamp).toLocaleString()
+              : "No timestamp"}
           </span>
           <span>
             Last message:{" "}
             {(() => {
               const lastMessage = messages[messages.length - 1];
-              return lastMessage.timestamp ? new Date(lastMessage.timestamp).toLocaleString() : 'No timestamp';
+              return lastMessage.timestamp
+                ? new Date(lastMessage.timestamp).toLocaleString()
+                : "No timestamp";
             })()}
           </span>
         </div>

View File

@@ -1,88 +0,0 @@
-"use client";
-interface MetricCardProps {
-  title: string;
-  value: string | number | null | undefined;
-  description?: string;
-  icon?: React.ReactNode;
-  trend?: {
-    value: number;
-    label?: string;
-    isPositive?: boolean;
-  };
-  variant?: "default" | "primary" | "success" | "warning" | "danger";
-}
-export default function MetricCard({
-  title,
-  value,
-  description,
-  icon,
-  trend,
-  variant = "default",
-}: MetricCardProps) {
-  // Determine background and text colors based on variant
-  const getVariantClasses = () => {
-    switch (variant) {
-      case "primary":
-        return "bg-blue-50 border-blue-200";
-      case "success":
-        return "bg-green-50 border-green-200";
-      case "warning":
-        return "bg-amber-50 border-amber-200";
-      case "danger":
-        return "bg-red-50 border-red-200";
-      default:
-        return "bg-white border-gray-200";
-    }
-  };
-  const getIconClasses = () => {
-    switch (variant) {
-      case "primary":
-        return "bg-blue-100 text-blue-600";
-      case "success":
-        return "bg-green-100 text-green-600";
-      case "warning":
-        return "bg-amber-100 text-amber-600";
-      case "danger":
-        return "bg-red-100 text-red-600";
-      default:
-        return "bg-gray-100 text-gray-600";
-    }
-  };
-  return (
-    <div className={`rounded-xl border shadow-sm p-6 ${getVariantClasses()}`}>
-      <div className="flex items-start justify-between">
-        <div>
-          <p className="text-sm font-medium text-gray-500">{title}</p>
-          <div className="mt-2 flex items-baseline">
-            <p className="text-2xl font-semibold">{value ?? "-"}</p>
-            {trend && (
-              <span
-                className={`ml-2 text-sm font-medium ${
-                  trend.isPositive !== false ? "text-green-600" : "text-red-600"
-                }`}
-              >
-                {trend.isPositive !== false ? "↑" : "↓"}{" "}
-                {Math.abs(trend.value).toFixed(1)}%
-              </span>
-            )}
-          </div>
-          {description && (
-            <p className="mt-1 text-xs text-gray-500">{description}</p>
-          )}
-        </div>
-        {icon && (
-          <div
-            className={`flex h-12 w-12 rounded-full ${getIconClasses()} items-center justify-center`}
-          >
-            <span className="text-xl">{icon}</span>
-          </div>
-        )}
-      </div>
-    </div>
-  );
-}
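The file above is the duplicate MetricCard that this commit removes in favor of the consolidated `ui/metric-card.tsx`. Its variant styling reduces to a lookup table, sketched here as plain data so the mapping survives the consolidation (names are illustrative; the consolidated component may differ):

```typescript
// Variant-to-class mapping distilled from the deleted component; a single
// record lookup replaces the switch statements.
type MetricVariant = "default" | "primary" | "success" | "warning" | "danger";

const cardClasses: Record<MetricVariant, string> = {
  default: "bg-white border-gray-200",
  primary: "bg-blue-50 border-blue-200",
  success: "bg-green-50 border-green-200",
  warning: "bg-amber-50 border-amber-200",
  danger: "bg-red-50 border-red-200",
};

export function getVariantClasses(variant: MetricVariant = "default"): string {
  return cardClasses[variant];
}
```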

View File

@@ -68,8 +68,10 @@ export default function ResponseTimeDistribution({
     // Determine color based on response time
     let color;
-    if (i <= 2) color = "hsl(var(--chart-1))"; // Green for fast
-    else if (i <= 5) color = "hsl(var(--chart-4))"; // Yellow for medium
+    if (i <= 2)
+      color = "hsl(var(--chart-1))"; // Green for fast
+    else if (i <= 5)
+      color = "hsl(var(--chart-4))"; // Yellow for medium
     else color = "hsl(var(--chart-3))"; // Red for slow
     return {
@@ -82,7 +84,10 @@ export default function ResponseTimeDistribution({
   return (
     <div className="h-64">
       <ResponsiveContainer width="100%" height="100%">
-        <BarChart data={chartData} margin={{ top: 20, right: 30, left: 20, bottom: 5 }}>
+        <BarChart
+          data={chartData}
+          margin={{ top: 20, right: 30, left: 20, bottom: 5 }}
+        >
           <CartesianGrid
             strokeDasharray="3 3"
             stroke="hsl(var(--border))"
@@ -101,19 +106,15 @@ export default function ResponseTimeDistribution({
             tickLine={false}
             axisLine={false}
             label={{
-              value: 'Number of Responses',
+              value: "Number of Responses",
               angle: -90,
-              position: 'insideLeft',
-              style: { textAnchor: 'middle' }
+              position: "insideLeft",
+              style: { textAnchor: "middle" },
             }}
           />
           <Tooltip content={<CustomTooltip />} />
-          <Bar
-            dataKey="value"
-            radius={[4, 4, 0, 0]}
-            fill="hsl(var(--chart-1))"
-          >
+          <Bar dataKey="value" radius={[4, 4, 0, 0]} fill="hsl(var(--chart-1))">
             {chartData.map((entry, index) => (
               <Bar key={`cell-${index}`} fill={entry.color} />
             ))}
@@ -131,8 +132,8 @@ export default function ResponseTimeDistribution({
               style: {
                 fill: "hsl(var(--primary))",
                 fontSize: "12px",
-                fontWeight: "500"
-              }
+                fontWeight: "500",
+              },
             }}
           />
@@ -149,8 +150,8 @@ export default function ResponseTimeDistribution({
               style: {
                 fill: "hsl(var(--chart-2))",
                 fontSize: "12px",
-                fontWeight: "500"
-              }
+                fontWeight: "500",
+              },
             }}
           />
         )}

View File

@@ -90,7 +90,6 @@ export default function SessionDetails({ session }: SessionDetailsProps) {
         <span className="font-medium">{session.messagesSent || 0}</span>
       </div>
-
       {session.avgResponseTime !== null &&
         session.avgResponseTime !== undefined && (
           <div className="flex justify-between border-b pb-2">
@@ -132,7 +131,6 @@ export default function SessionDetails({ session }: SessionDetailsProps) {
         </div>
       )}
-
       {session.initialMsg && (
         <div className="border-b pb-2">
           <span className="text-gray-600 block mb-1">Initial Message:</span>
@@ -151,7 +149,6 @@ export default function SessionDetails({ session }: SessionDetailsProps) {
         </div>
       )}
-
       {session.fullTranscriptUrl && (
         <div className="flex justify-between pt-2">
           <span className="text-gray-600">Transcript:</span>

View File

@@ -1,14 +1,17 @@
-'use client';
-import React from 'react';
-import { TopQuestion } from '../lib/types';
+"use client";
+import React from "react";
+import { TopQuestion } from "../lib/types";
 interface TopQuestionsChartProps {
   data: TopQuestion[];
   title?: string;
 }
-export default function TopQuestionsChart({ data, title = "Top 5 Asked Questions" }: TopQuestionsChartProps) {
+export default function TopQuestionsChart({
+  data,
+  title = "Top 5 Asked Questions",
+}: TopQuestionsChartProps) {
   if (!data || data.length === 0) {
     return (
       <div className="bg-white p-6 rounded-lg shadow-sm border border-gray-200">
@@ -21,7 +24,7 @@ export default function TopQuestionsChart({ data, title = "Top 5 Asked Questions
   }
   // Find the maximum count to calculate relative bar widths
-  const maxCount = Math.max(...data.map(q => q.count));
+  const maxCount = Math.max(...data.map((q) => q.count));
   return (
     <div className="bg-white p-6 rounded-lg shadow-sm border border-gray-200">
@@ -29,7 +32,8 @@ export default function TopQuestionsChart({ data, title = "Top 5 Asked Questions
       <div className="space-y-4">
         {data.map((question, index) => {
-          const percentage = maxCount > 0 ? (question.count / maxCount) * 100 : 0;
+          const percentage =
+            maxCount > 0 ? (question.count / maxCount) * 100 : 0;
           return (
             <div key={index} className="relative">

View File

@@ -61,7 +61,10 @@ export default function ModernBarChart({
       )}
       <CardContent>
         <ResponsiveContainer width="100%" height={height}>
-          <BarChart data={data} margin={{ top: 5, right: 30, left: 20, bottom: 5 }}>
+          <BarChart
+            data={data}
+            margin={{ top: 5, right: 30, left: 20, bottom: 5 }}
+          >
             <CartesianGrid
               strokeDasharray="3 3"
               stroke="hsl(var(--border))"

View File

@@ -1,6 +1,13 @@
 "use client";
-import { PieChart, Pie, Cell, ResponsiveContainer, Tooltip, Legend } from "recharts";
+import {
+  PieChart,
+  Pie,
+  Cell,
+  ResponsiveContainer,
+  Tooltip,
+  Legend,
+} from "recharts";
 import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
 interface DonutChartProps {
@@ -22,9 +29,7 @@ const CustomTooltip = ({ active, payload }: any) => {
     <div className="rounded-lg border bg-background p-3 shadow-md">
       <p className="text-sm font-medium">{data.name}</p>
       <p className="text-sm text-muted-foreground">
-        <span className="font-medium text-foreground">
-          {data.value}
-        </span>{" "}
+        <span className="font-medium text-foreground">{data.value}</span>{" "}
         sessions ({((data.value / data.payload.total) * 100).toFixed(1)}%)
       </p>
     </div>
@@ -77,7 +82,7 @@ export default function ModernDonutChart({
   className,
 }: DonutChartProps) {
   const total = data.reduce((sum, item) => sum + item.value, 0);
-  const dataWithTotal = data.map(item => ({ ...item, total }));
+  const dataWithTotal = data.map((item) => ({ ...item, total }));
   return (
     <Card className={className}>

View File

@@ -60,7 +60,10 @@ export default function ModernLineChart({
       )}
       <CardContent>
         <ResponsiveContainer width="100%" height={height}>
-          <ChartComponent data={data} margin={{ top: 5, right: 30, left: 20, bottom: 5 }}>
+          <ChartComponent
+            data={data}
+            margin={{ top: 5, right: 30, left: 20, bottom: 5 }}
+          >
             <defs>
               {gradient && (
                 <linearGradient id="colorGradient" x1="0" y1="0" x2="0" y2="1">

View File

@@ -1,8 +1,8 @@
-import * as React from "react"
-import { Slot } from "@radix-ui/react-slot"
-import { cva, type VariantProps } from "class-variance-authority"
-import { cn } from "@/lib/utils"
+import * as React from "react";
+import { Slot } from "@radix-ui/react-slot";
+import { cva, type VariantProps } from "class-variance-authority";
+import { cn } from "@/lib/utils";
 const badgeVariants = cva(
   "inline-flex items-center justify-center rounded-md border px-2 py-0.5 text-xs font-medium w-fit whitespace-nowrap shrink-0 [&>svg]:size-3 gap-1 [&>svg]:pointer-events-none focus-visible:border-ring focus-visible:ring-ring/50 focus-visible:ring-[3px] aria-invalid:ring-destructive/20 dark:aria-invalid:ring-destructive/40 aria-invalid:border-destructive transition-[color,box-shadow] overflow-hidden",
@@ -23,7 +23,7 @@ const badgeVariants = cva(
       variant: "default",
     },
   }
-)
+);
 function Badge({
   className,
@@ -32,7 +32,7 @@ function Badge({
   ...props
 }: React.ComponentProps<"span"> &
   VariantProps<typeof badgeVariants> & { asChild?: boolean }) {
-  const Comp = asChild ? Slot : "span"
+  const Comp = asChild ? Slot : "span";
   return (
     <Comp
@@ -40,7 +40,7 @@ function Badge({
       className={cn(badgeVariants({ variant }), className)}
       {...props}
     />
-  )
+  );
 }
-export { Badge, badgeVariants }
+export { Badge, badgeVariants };

View File

@@ -1,8 +1,8 @@
-import * as React from "react"
-import { Slot } from "@radix-ui/react-slot"
-import { cva, type VariantProps } from "class-variance-authority"
-import { cn } from "@/lib/utils"
+import * as React from "react";
+import { Slot } from "@radix-ui/react-slot";
+import { cva, type VariantProps } from "class-variance-authority";
+import { cn } from "@/lib/utils";
 const buttonVariants = cva(
   "inline-flex items-center justify-center gap-2 whitespace-nowrap rounded-md text-sm font-medium transition-all disabled:pointer-events-none disabled:opacity-50 [&_svg]:pointer-events-none [&_svg:not([class*='size-'])]:size-4 shrink-0 [&_svg]:shrink-0 outline-none focus-visible:border-ring focus-visible:ring-ring/50 focus-visible:ring-[3px] aria-invalid:ring-destructive/20 dark:aria-invalid:ring-destructive/40 aria-invalid:border-destructive",
@@ -33,7 +33,7 @@ const buttonVariants = cva(
       size: "default",
     },
   }
-)
+);
 function Button({
   className,
@@ -43,9 +43,9 @@ function Button({
   ...props
 }: React.ComponentProps<"button"> &
   VariantProps<typeof buttonVariants> & {
-    asChild?: boolean
+    asChild?: boolean;
   }) {
-  const Comp = asChild ? Slot : "button"
+  const Comp = asChild ? Slot : "button";
   return (
     <Comp
@@ -53,7 +53,7 @@ function Button({
       className={cn(buttonVariants({ variant, size, className }))}
       {...props}
     />
-  )
+  );
 }
-export { Button, buttonVariants }
+export { Button, buttonVariants };

View File

@@ -1,6 +1,6 @@
-import * as React from "react"
+import * as React from "react";

-import { cn } from "@/lib/utils"
+import { cn } from "@/lib/utils";

 function Card({ className, ...props }: React.ComponentProps<"div">) {
   return (
@@ -12,7 +12,7 @@ function Card({ className, ...props }: React.ComponentProps<"div">) {
       )}
       {...props}
     />
-  )
+  );
 }

 function CardHeader({ className, ...props }: React.ComponentProps<"div">) {
@@ -25,7 +25,7 @@ function CardHeader({ className, ...props }: React.ComponentProps<"div">) {
       )}
       {...props}
     />
-  )
+  );
 }

 function CardTitle({ className, ...props }: React.ComponentProps<"div">) {
@@ -35,7 +35,7 @@ function CardTitle({ className, ...props }: React.ComponentProps<"div">) {
       className={cn("leading-none font-semibold", className)}
       {...props}
     />
-  )
+  );
 }

 function CardDescription({ className, ...props }: React.ComponentProps<"div">) {
@@ -45,7 +45,7 @@ function CardDescription({ className, ...props }: React.ComponentProps<"div">) {
       className={cn("text-muted-foreground text-sm", className)}
       {...props}
     />
-  )
+  );
 }

 function CardAction({ className, ...props }: React.ComponentProps<"div">) {
@@ -58,7 +58,7 @@ function CardAction({ className, ...props }: React.ComponentProps<"div">) {
       )}
       {...props}
     />
-  )
+  );
 }

 function CardContent({ className, ...props }: React.ComponentProps<"div">) {
@@ -68,7 +68,7 @@ function CardContent({ className, ...props }: React.ComponentProps<"div">) {
       className={cn("px-6", className)}
       {...props}
     />
-  )
+  );
 }

 function CardFooter({ className, ...props }: React.ComponentProps<"div">) {
@@ -78,7 +78,7 @@ function CardFooter({ className, ...props }: React.ComponentProps<"div">) {
       className={cn("flex items-center px-6 [.border-t]:pt-6", className)}
       {...props}
     />
-  )
+  );
 }

 export {
@@ -89,4 +89,4 @@ export {
   CardAction,
   CardDescription,
   CardContent,
-}
+};


@@ -1,15 +1,15 @@
-"use client"
+"use client";

-import * as React from "react"
+import * as React from "react";

-import * as DropdownMenuPrimitive from "@radix-ui/react-dropdown-menu"
-import { CheckIcon, ChevronRightIcon, CircleIcon } from "lucide-react"
+import * as DropdownMenuPrimitive from "@radix-ui/react-dropdown-menu";
+import { CheckIcon, ChevronRightIcon, CircleIcon } from "lucide-react";

-import { cn } from "@/lib/utils"
+import { cn } from "@/lib/utils";

 function DropdownMenu({
   ...props
 }: React.ComponentProps<typeof DropdownMenuPrimitive.Root>) {
-  return <DropdownMenuPrimitive.Root data-slot="dropdown-menu" {...props} />
+  return <DropdownMenuPrimitive.Root data-slot="dropdown-menu" {...props} />;
 }

 function DropdownMenuPortal({
@@ -17,7 +17,7 @@ function DropdownMenuPortal({
 }: React.ComponentProps<typeof DropdownMenuPrimitive.Portal>) {
   return (
     <DropdownMenuPrimitive.Portal data-slot="dropdown-menu-portal" {...props} />
-  )
+  );
 }

 function DropdownMenuTrigger({
@@ -28,7 +28,7 @@ function DropdownMenuTrigger({
       data-slot="dropdown-menu-trigger"
       {...props}
     />
-  )
+  );
 }

 function DropdownMenuContent({
@@ -48,7 +48,7 @@ function DropdownMenuContent({
         {...props}
       />
     </DropdownMenuPrimitive.Portal>
-  )
+  );
 }

 function DropdownMenuGroup({
@@ -56,7 +56,7 @@ function DropdownMenuGroup({
 }: React.ComponentProps<typeof DropdownMenuPrimitive.Group>) {
   return (
     <DropdownMenuPrimitive.Group data-slot="dropdown-menu-group" {...props} />
-  )
+  );
 }

 function DropdownMenuItem({
@@ -65,8 +65,8 @@ function DropdownMenuItem({
   variant = "default",
   ...props
 }: React.ComponentProps<typeof DropdownMenuPrimitive.Item> & {
-  inset?: boolean
-  variant?: "default" | "destructive"
+  inset?: boolean;
+  variant?: "default" | "destructive";
 }) {
   return (
     <DropdownMenuPrimitive.Item
@@ -79,7 +79,7 @@ function DropdownMenuItem({
       )}
       {...props}
     />
-  )
+  );
 }

 function DropdownMenuCheckboxItem({
@@ -105,7 +105,7 @@ function DropdownMenuCheckboxItem({
       </span>
       {children}
     </DropdownMenuPrimitive.CheckboxItem>
-  )
+  );
 }

 function DropdownMenuRadioGroup({
@@ -116,7 +116,7 @@ function DropdownMenuRadioGroup({
       data-slot="dropdown-menu-radio-group"
       {...props}
     />
-  )
+  );
 }

 function DropdownMenuRadioItem({
@@ -140,7 +140,7 @@ function DropdownMenuRadioItem({
       </span>
       {children}
     </DropdownMenuPrimitive.RadioItem>
-  )
+  );
 }

 function DropdownMenuLabel({
@@ -148,7 +148,7 @@ function DropdownMenuLabel({
   inset,
   ...props
 }: React.ComponentProps<typeof DropdownMenuPrimitive.Label> & {
-  inset?: boolean
+  inset?: boolean;
 }) {
   return (
     <DropdownMenuPrimitive.Label
@@ -160,7 +160,7 @@ function DropdownMenuLabel({
       )}
       {...props}
     />
-  )
+  );
 }

 function DropdownMenuSeparator({
@@ -173,7 +173,7 @@ function DropdownMenuSeparator({
       className={cn("bg-border -mx-1 my-1 h-px", className)}
       {...props}
     />
-  )
+  );
 }

 function DropdownMenuShortcut({
@@ -189,13 +189,13 @@ function DropdownMenuShortcut({
       )}
       {...props}
     />
-  )
+  );
 }

 function DropdownMenuSub({
   ...props
 }: React.ComponentProps<typeof DropdownMenuPrimitive.Sub>) {
-  return <DropdownMenuPrimitive.Sub data-slot="dropdown-menu-sub" {...props} />
+  return <DropdownMenuPrimitive.Sub data-slot="dropdown-menu-sub" {...props} />;
 }

 function DropdownMenuSubTrigger({
@@ -204,7 +204,7 @@ function DropdownMenuSubTrigger({
   children,
   ...props
 }: React.ComponentProps<typeof DropdownMenuPrimitive.SubTrigger> & {
-  inset?: boolean
+  inset?: boolean;
 }) {
   return (
     <DropdownMenuPrimitive.SubTrigger
@@ -219,7 +219,7 @@ function DropdownMenuSubTrigger({
       {children}
       <ChevronRightIcon className="ml-auto size-4" />
     </DropdownMenuPrimitive.SubTrigger>
-  )
+  );
 }

 function DropdownMenuSubContent({
@@ -235,7 +235,7 @@ function DropdownMenuSubContent({
       )}
       {...props}
     />
-  )
+  );
 }

 export {
@@ -254,4 +254,4 @@ export {
   DropdownMenuSub,
   DropdownMenuSubTrigger,
   DropdownMenuSubContent,
-}
+};


@@ -94,7 +94,9 @@ export default function MetricCard({
   const getTrendColor = () => {
     if (!trend || trend.value === 0) return "text-muted-foreground";
-    return trend.isPositive !== false ? "text-green-600 dark:text-green-400" : "text-red-600 dark:text-red-400";
+    return trend.isPositive !== false
+      ? "text-green-600 dark:text-green-400"
+      : "text-red-600 dark:text-red-400";
   };

   return (
@@ -115,9 +117,7 @@ export default function MetricCard({
             {title}
           </p>
           {description && (
-            <p className="text-xs text-muted-foreground/80">
-              {description}
-            </p>
+            <p className="text-xs text-muted-foreground/80">{description}</p>
           )}
         </div>
@@ -137,9 +137,7 @@ export default function MetricCard({
       <CardContent className="relative">
         <div className="flex items-end justify-between">
           <div className="space-y-1">
-            <p className="text-2xl font-bold tracking-tight">
-              {value ?? "—"}
-            </p>
+            <p className="text-2xl font-bold tracking-tight">{value ?? "—"}</p>
             {trend && (
               <Badge


@@ -1,9 +1,9 @@
-"use client"
+"use client";

-import * as React from "react"
+import * as React from "react";

-import * as SeparatorPrimitive from "@radix-ui/react-separator"
+import * as SeparatorPrimitive from "@radix-ui/react-separator";

-import { cn } from "@/lib/utils"
+import { cn } from "@/lib/utils";

 function Separator({
   className,
@@ -22,7 +22,7 @@ function Separator({
       )}
       {...props}
     />
-  )
+  );
 }

-export { Separator }
+export { Separator };


@@ -1,4 +1,4 @@
-import { cn } from "@/lib/utils"
+import { cn } from "@/lib/utils";

 function Skeleton({ className, ...props }: React.ComponentProps<"div">) {
   return (
@@ -7,7 +7,7 @@ function Skeleton({ className, ...props }: React.ComponentProps<"div">) {
       className={cn("bg-accent animate-pulse rounded-md", className)}
       {...props}
     />
-  )
+  );
 }

-export { Skeleton }
+export { Skeleton };


@@ -1,9 +1,9 @@
-"use client"
+"use client";

-import * as React from "react"
+import * as React from "react";

-import * as TooltipPrimitive from "@radix-ui/react-tooltip"
+import * as TooltipPrimitive from "@radix-ui/react-tooltip";

-import { cn } from "@/lib/utils"
+import { cn } from "@/lib/utils";

 function TooltipProvider({
   delayDuration = 0,
@@ -15,7 +15,7 @@ function TooltipProvider({
       delayDuration={delayDuration}
       {...props}
     />
-  )
+  );
 }

 function Tooltip({
@@ -25,13 +25,13 @@ function Tooltip({
     <TooltipProvider>
       <TooltipPrimitive.Root data-slot="tooltip" {...props} />
     </TooltipProvider>
-  )
+  );
 }

 function TooltipTrigger({
   ...props
 }: React.ComponentProps<typeof TooltipPrimitive.Trigger>) {
-  return <TooltipPrimitive.Trigger data-slot="tooltip-trigger" {...props} />
+  return <TooltipPrimitive.Trigger data-slot="tooltip-trigger" {...props} />;
 }

 function TooltipContent({
@@ -55,7 +55,7 @@ function TooltipContent({
       <TooltipPrimitive.Arrow className="bg-primary fill-primary z-50 size-2.5 translate-y-[calc(-50%-2px)] rotate-45 rounded-[2px]" />
     </TooltipPrimitive.Content>
   </TooltipPrimitive.Portal>
-  )
+  );
 }

-export { Tooltip, TooltipTrigger, TooltipContent, TooltipProvider }
+export { Tooltip, TooltipTrigger, TooltipContent, TooltipProvider };


@@ -1,11 +1,11 @@
-import { PrismaClient } from '@prisma/client';
-import { ProcessingStatusManager } from './lib/processingStatusManager';
+import { PrismaClient } from "@prisma/client";
+import { ProcessingStatusManager } from "./lib/processingStatusManager";

 const prisma = new PrismaClient();

 async function debugImportStatus() {
   try {
-    console.log('=== DEBUGGING PROCESSING STATUS (REFACTORED SYSTEM) ===\n');
+    console.log("=== DEBUGGING PROCESSING STATUS (REFACTORED SYSTEM) ===\n");

     // Get pipeline status using the new system
     const pipelineStatus = await ProcessingStatusManager.getPipelineStatus();
@@ -13,7 +13,13 @@ async function debugImportStatus() {
     console.log(`Total Sessions: ${pipelineStatus.totalSessions}\n`);

     // Display status for each stage
-    const stages = ['CSV_IMPORT', 'TRANSCRIPT_FETCH', 'SESSION_CREATION', 'AI_ANALYSIS', 'QUESTION_EXTRACTION'];
+    const stages = [
+      "CSV_IMPORT",
+      "TRANSCRIPT_FETCH",
+      "SESSION_CREATION",
+      "AI_ANALYSIS",
+      "QUESTION_EXTRACTION",
+    ];

     for (const stage of stages) {
       console.log(`${stage}:`);
@@ -30,13 +36,13 @@ async function debugImportStatus() {
       console.log(`  COMPLETED: ${completed}`);
       console.log(`  FAILED: ${failed}`);
       console.log(`  SKIPPED: ${skipped}`);
-      console.log('');
+      console.log("");
     }

     // Check Sessions vs SessionImports
-    console.log('=== SESSION IMPORT RELATIONSHIP ===');
+    console.log("=== SESSION IMPORT RELATIONSHIP ===");
     const sessionsWithImports = await prisma.session.count({
-      where: { importId: { not: null } }
+      where: { importId: { not: null } },
     });

     const totalSessions = await prisma.session.count();
@@ -46,20 +52,24 @@ async function debugImportStatus() {
     // Show failed sessions if any
     const failedSessions = await ProcessingStatusManager.getFailedSessions();
     if (failedSessions.length > 0) {
-      console.log('\n=== FAILED SESSIONS ===');
-      failedSessions.slice(0, 10).forEach(failure => {
-        console.log(`  ${failure.session.import?.externalSessionId || failure.sessionId}: ${failure.stage} - ${failure.errorMessage}`);
+      console.log("\n=== FAILED SESSIONS ===");
+      failedSessions.slice(0, 10).forEach((failure) => {
+        console.log(
+          `  ${failure.session.import?.externalSessionId || failure.sessionId}: ${failure.stage} - ${failure.errorMessage}`
+        );
       });
       if (failedSessions.length > 10) {
-        console.log(`  ... and ${failedSessions.length - 10} more failed sessions`);
+        console.log(
+          `  ... and ${failedSessions.length - 10} more failed sessions`
+        );
       }
     } else {
-      console.log('\n✓ No failed sessions found');
+      console.log("\n✓ No failed sessions found");
     }

     // Show what needs processing
-    console.log('\n=== WHAT NEEDS PROCESSING ===');
+    console.log("\n=== WHAT NEEDS PROCESSING ===");

     for (const stage of stages) {
       const stageData = pipelineStatus.pipeline[stage] || {};
@@ -70,9 +80,8 @@ async function debugImportStatus() {
         console.log(`${stage}: ${pending} pending, ${failed} failed`);
       }
     }
-
   } catch (error) {
-    console.error('Error debugging processing status:', error);
+    console.error("Error debugging processing status:", error);
   } finally {
     await prisma.$disconnect();
   }


@@ -51,13 +51,14 @@ After cleanup, only valid companies remain:
 1. **lib/csvFetcher.js**
    - Added company URL validation
    - Improved transcript fetching error handling
    - Reduced error log verbosity

 2. **fix_companies.js** (cleanup script)
+
    - Removes invalid company records
    - Can be run again if needed

 ## Monitoring


@@ -1,11 +1,15 @@
-import { PrismaClient, ProcessingStage, ProcessingStatus } from '@prisma/client';
-import { ProcessingStatusManager } from './lib/processingStatusManager';
+import {
+  PrismaClient,
+  ProcessingStage,
+  ProcessingStatus,
+} from "@prisma/client";
+import { ProcessingStatusManager } from "./lib/processingStatusManager";

 const prisma = new PrismaClient();

 async function fixProcessingStatus() {
   try {
-    console.log('=== FIXING PROCESSING STATUS (REFACTORED SYSTEM) ===\n');
+    console.log("=== FIXING PROCESSING STATUS (REFACTORED SYSTEM) ===\n");

     // Check for any failed processing stages that might need retry
     const failedSessions = await ProcessingStatusManager.getFailedSessions();
@@ -13,11 +17,12 @@ async function fixProcessingStatus() {
     console.log(`Found ${failedSessions.length} failed processing stages`);

     if (failedSessions.length > 0) {
-      console.log('\nFailed sessions by stage:');
+      console.log("\nFailed sessions by stage:");
       const failuresByStage: Record<string, number> = {};
-      failedSessions.forEach(failure => {
-        failuresByStage[failure.stage] = (failuresByStage[failure.stage] || 0) + 1;
+      failedSessions.forEach((failure) => {
+        failuresByStage[failure.stage] =
+          (failuresByStage[failure.stage] || 0) + 1;
       });

       Object.entries(failuresByStage).forEach(([stage, count]) => {
@@ -25,14 +30,18 @@ async function fixProcessingStatus() {
       });

       // Show sample failed sessions
-      console.log('\nSample failed sessions:');
-      failedSessions.slice(0, 5).forEach(failure => {
-        console.log(`  ${failure.session.import?.externalSessionId || failure.sessionId}: ${failure.stage} - ${failure.errorMessage}`);
+      console.log("\nSample failed sessions:");
+      failedSessions.slice(0, 5).forEach((failure) => {
+        console.log(
+          `  ${failure.session.import?.externalSessionId || failure.sessionId}: ${failure.stage} - ${failure.errorMessage}`
+        );
       });

       // Ask if user wants to reset failed stages for retry
-      console.log('\nTo reset failed stages for retry, you can use:');
-      console.log('ProcessingStatusManager.resetStageForRetry(sessionId, stage)');
+      console.log("\nTo reset failed stages for retry, you can use:");
+      console.log(
+        "ProcessingStatusManager.resetStageForRetry(sessionId, stage)"
+      );
     }

     // Check for sessions that might be stuck in IN_PROGRESS
@@ -40,32 +49,44 @@ async function fixProcessingStatus() {
       where: {
         status: ProcessingStatus.IN_PROGRESS,
         startedAt: {
-          lt: new Date(Date.now() - 30 * 60 * 1000) // Started more than 30 minutes ago
-        }
+          lt: new Date(Date.now() - 30 * 60 * 1000), // Started more than 30 minutes ago
+        },
       },
       include: {
         session: {
           include: {
-            import: true
-          }
-        }
-      }
+            import: true,
+          },
+        },
+      },
     });

     if (stuckSessions.length > 0) {
-      console.log(`\nFound ${stuckSessions.length} sessions stuck in IN_PROGRESS state:`);
-      stuckSessions.forEach(stuck => {
-        console.log(`  ${stuck.session.import?.externalSessionId || stuck.sessionId}: ${stuck.stage} (started: ${stuck.startedAt})`);
+      console.log(
+        `\nFound ${stuckSessions.length} sessions stuck in IN_PROGRESS state:`
+      );
+      stuckSessions.forEach((stuck) => {
+        console.log(
+          `  ${stuck.session.import?.externalSessionId || stuck.sessionId}: ${stuck.stage} (started: ${stuck.startedAt})`
+        );
       });
-      console.log('\nThese sessions may need to be reset to PENDING status for retry.');
+      console.log(
+        "\nThese sessions may need to be reset to PENDING status for retry."
+      );
     }

     // Show current pipeline status
-    console.log('\n=== CURRENT PIPELINE STATUS ===');
+    console.log("\n=== CURRENT PIPELINE STATUS ===");
     const pipelineStatus = await ProcessingStatusManager.getPipelineStatus();

-    const stages = ['CSV_IMPORT', 'TRANSCRIPT_FETCH', 'SESSION_CREATION', 'AI_ANALYSIS', 'QUESTION_EXTRACTION'];
+    const stages = [
+      "CSV_IMPORT",
+      "TRANSCRIPT_FETCH",
+      "SESSION_CREATION",
+      "AI_ANALYSIS",
+      "QUESTION_EXTRACTION",
+    ];

     for (const stage of stages) {
       const stageData = pipelineStatus.pipeline[stage] || {};
@@ -75,11 +96,12 @@ async function fixProcessingStatus() {
       const failed = stageData.FAILED || 0;
       const skipped = stageData.SKIPPED || 0;

-      console.log(`${stage}: ${completed} completed, ${pending} pending, ${inProgress} in progress, ${failed} failed, ${skipped} skipped`);
+      console.log(
+        `${stage}: ${completed} completed, ${pending} pending, ${inProgress} in progress, ${failed} failed, ${skipped} skipped`
+      );
     }
-
   } catch (error) {
-    console.error('Error fixing processing status:', error);
+    console.error("Error fixing processing status:", error);
   } finally {
     await prisma.$disconnect();
   }
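The per-stage failure tally in the diff above folds a list of failure records into a `Record<string, number>`. A minimal standalone sketch of that aggregation (the names `FailureLike` and `countByStage` are ours, not from the codebase):

```typescript
// Sketch of the failuresByStage aggregation: count records per stage.
interface FailureLike {
  stage: string;
}

function countByStage(failures: FailureLike[]): Record<string, number> {
  const failuresByStage: Record<string, number> = {};
  failures.forEach((failure) => {
    // Missing keys start at 0, so every stage seen at least once is counted.
    failuresByStage[failure.stage] = (failuresByStage[failure.stage] || 0) + 1;
  });
  return failuresByStage;
}

const tally = countByStage([
  { stage: "AI_ANALYSIS" },
  { stage: "TRANSCRIPT_FETCH" },
  { stage: "AI_ANALYSIS" },
]);
console.log(tally); // { AI_ANALYSIS: 2, TRANSCRIPT_FETCH: 1 }
```

The `|| 0` fallback is what lets the script avoid pre-seeding the record with every known stage name.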


@@ -7,20 +7,22 @@ import { dirname, join } from "path";
  * Parse environment variable value by removing quotes, comments, and trimming whitespace
  */
 function parseEnvValue(value: string | undefined): string {
-  if (!value) return '';
+  if (!value) return "";

   // Trim whitespace
   let cleaned = value.trim();

   // Remove inline comments (everything after #)
-  const commentIndex = cleaned.indexOf('#');
+  const commentIndex = cleaned.indexOf("#");
   if (commentIndex !== -1) {
     cleaned = cleaned.substring(0, commentIndex).trim();
   }

   // Remove surrounding quotes (both single and double)
-  if ((cleaned.startsWith('"') && cleaned.endsWith('"')) ||
-      (cleaned.startsWith("'") && cleaned.endsWith("'"))) {
+  if (
+    (cleaned.startsWith('"') && cleaned.endsWith('"')) ||
+    (cleaned.startsWith("'") && cleaned.endsWith("'"))
+  ) {
     cleaned = cleaned.slice(1, -1);
   }

@@ -30,7 +32,10 @@ function parseEnvValue(value: string | undefined): string {
 /**
  * Parse integer with fallback to default value
  */
-function parseIntWithDefault(value: string | undefined, defaultValue: number): number {
+function parseIntWithDefault(
+  value: string | undefined,
+  defaultValue: number
+): number {
   const cleaned = parseEnvValue(value);
   if (!cleaned) return defaultValue;
@@ -41,17 +46,19 @@ function parseIntWithDefault(value: string | undefined, defaultValue: number): n
 // Load environment variables from .env.local
 const __filename = fileURLToPath(import.meta.url);
 const __dirname = dirname(__filename);
-const envPath = join(__dirname, '..', '.env.local');
+const envPath = join(__dirname, "..", ".env.local");

 // Load .env.local if it exists
 try {
-  const envFile = readFileSync(envPath, 'utf8');
-  const envVars = envFile.split('\n').filter(line => line.trim() && !line.startsWith('#'));
+  const envFile = readFileSync(envPath, "utf8");
+  const envVars = envFile
+    .split("\n")
+    .filter((line) => line.trim() && !line.startsWith("#"));

-  envVars.forEach(line => {
-    const [key, ...valueParts] = line.split('=');
+  envVars.forEach((line) => {
+    const [key, ...valueParts] = line.split("=");
     if (key && valueParts.length > 0) {
-      const rawValue = valueParts.join('=');
+      const rawValue = valueParts.join("=");
       const cleanedValue = parseEnvValue(rawValue);
       if (!process.env[key.trim()]) {
         process.env[key.trim()] = cleanedValue;
@@ -67,21 +74,34 @@ try {
  */
 export const env = {
   // NextAuth
-  NEXTAUTH_URL: parseEnvValue(process.env.NEXTAUTH_URL) || 'http://localhost:3000',
-  NEXTAUTH_SECRET: parseEnvValue(process.env.NEXTAUTH_SECRET) || '',
-  NODE_ENV: parseEnvValue(process.env.NODE_ENV) || 'development',
+  NEXTAUTH_URL:
+    parseEnvValue(process.env.NEXTAUTH_URL) || "http://localhost:3000",
+  NEXTAUTH_SECRET: parseEnvValue(process.env.NEXTAUTH_SECRET) || "",
+  NODE_ENV: parseEnvValue(process.env.NODE_ENV) || "development",

   // OpenAI
-  OPENAI_API_KEY: parseEnvValue(process.env.OPENAI_API_KEY) || '',
+  OPENAI_API_KEY: parseEnvValue(process.env.OPENAI_API_KEY) || "",

   // Scheduler Configuration
-  SCHEDULER_ENABLED: parseEnvValue(process.env.SCHEDULER_ENABLED) === 'true',
-  CSV_IMPORT_INTERVAL: parseEnvValue(process.env.CSV_IMPORT_INTERVAL) || '*/15 * * * *',
-  IMPORT_PROCESSING_INTERVAL: parseEnvValue(process.env.IMPORT_PROCESSING_INTERVAL) || '*/5 * * * *',
-  IMPORT_PROCESSING_BATCH_SIZE: parseIntWithDefault(process.env.IMPORT_PROCESSING_BATCH_SIZE, 50),
-  SESSION_PROCESSING_INTERVAL: parseEnvValue(process.env.SESSION_PROCESSING_INTERVAL) || '0 * * * *',
-  SESSION_PROCESSING_BATCH_SIZE: parseIntWithDefault(process.env.SESSION_PROCESSING_BATCH_SIZE, 0),
-  SESSION_PROCESSING_CONCURRENCY: parseIntWithDefault(process.env.SESSION_PROCESSING_CONCURRENCY, 5),
+  SCHEDULER_ENABLED: parseEnvValue(process.env.SCHEDULER_ENABLED) === "true",
+  CSV_IMPORT_INTERVAL:
+    parseEnvValue(process.env.CSV_IMPORT_INTERVAL) || "*/15 * * * *",
+  IMPORT_PROCESSING_INTERVAL:
+    parseEnvValue(process.env.IMPORT_PROCESSING_INTERVAL) || "*/5 * * * *",
+  IMPORT_PROCESSING_BATCH_SIZE: parseIntWithDefault(
+    process.env.IMPORT_PROCESSING_BATCH_SIZE,
+    50
+  ),
+  SESSION_PROCESSING_INTERVAL:
+    parseEnvValue(process.env.SESSION_PROCESSING_INTERVAL) || "0 * * * *",
+  SESSION_PROCESSING_BATCH_SIZE: parseIntWithDefault(
+    process.env.SESSION_PROCESSING_BATCH_SIZE,
+    0
+  ),
+  SESSION_PROCESSING_CONCURRENCY: parseIntWithDefault(
+    process.env.SESSION_PROCESSING_CONCURRENCY,
+    5
+  ),

   // Server
   PORT: parseIntWithDefault(process.env.PORT, 3000),
@@ -94,11 +114,11 @@ export function validateEnv(): { valid: boolean; errors: string[] } {
   const errors: string[] = [];

   if (!env.NEXTAUTH_SECRET) {
-    errors.push('NEXTAUTH_SECRET is required');
+    errors.push("NEXTAUTH_SECRET is required");
   }

-  if (!env.OPENAI_API_KEY && env.NODE_ENV === 'production') {
-    errors.push('OPENAI_API_KEY is required in production');
+  if (!env.OPENAI_API_KEY && env.NODE_ENV === "production") {
+    errors.push("OPENAI_API_KEY is required in production");
   }

   return {
@@ -132,14 +152,14 @@ export function getSchedulerConfig() {
  * Log environment configuration (safe for production)
  */
 export function logEnvConfig(): void {
-  console.log('[Environment] Configuration:');
+  console.log("[Environment] Configuration:");
   console.log(`  NODE_ENV: ${env.NODE_ENV}`);
   console.log(`  NEXTAUTH_URL: ${env.NEXTAUTH_URL}`);
   console.log(`  SCHEDULER_ENABLED: ${env.SCHEDULER_ENABLED}`);
   console.log(`  PORT: ${env.PORT}`);

   if (env.SCHEDULER_ENABLED) {
-    console.log('  Scheduler intervals:');
+    console.log("  Scheduler intervals:");
     console.log(`  CSV Import: ${env.CSV_IMPORT_INTERVAL}`);
     console.log(`  Import Processing: ${env.IMPORT_PROCESSING_INTERVAL}`);
     console.log(`  Session Processing: ${env.SESSION_PROCESSING_INTERVAL}`);
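The `parseEnvValue` cleanup above (trim, drop inline `#` comments, strip one layer of matching quotes) can be exercised in isolation. Below is a standalone sketch of the same logic; the name `parseEnvValueSketch` is ours, introduced only so the snippet does not collide with the real module:

```typescript
// Sketch of the env-value cleanup from lib/env.ts: trim whitespace,
// cut everything after "#" (inline comment), then remove one pair of
// surrounding single or double quotes.
function parseEnvValueSketch(value: string | undefined): string {
  if (!value) return "";

  let cleaned = value.trim();

  // Everything after "#" is treated as an inline comment.
  const commentIndex = cleaned.indexOf("#");
  if (commentIndex !== -1) {
    cleaned = cleaned.substring(0, commentIndex).trim();
  }

  // Strip one layer of matching quotes.
  if (
    (cleaned.startsWith('"') && cleaned.endsWith('"')) ||
    (cleaned.startsWith("'") && cleaned.endsWith("'"))
  ) {
    cleaned = cleaned.slice(1, -1);
  }

  return cleaned;
}

console.log(parseEnvValueSketch('  "secret-value"  # used in production  '));
```

Note one consequence of this ordering: a literal `#` inside a quoted value would also be cut, since the comment check runs before the quotes are stripped.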


@ -1,7 +1,15 @@
// SessionImport to Session processor // SessionImport to Session processor
import { PrismaClient, SentimentCategory, SessionCategory, ProcessingStage } from "@prisma/client"; import {
PrismaClient,
SentimentCategory,
SessionCategory,
ProcessingStage,
} from "@prisma/client";
import { getSchedulerConfig } from "./env"; import { getSchedulerConfig } from "./env";
import { fetchTranscriptContent, isValidTranscriptUrl } from "./transcriptFetcher"; import {
fetchTranscriptContent,
isValidTranscriptUrl,
} from "./transcriptFetcher";
import { ProcessingStatusManager } from "./processingStatusManager"; import { ProcessingStatusManager } from "./processingStatusManager";
import cron from "node-cron"; import cron from "node-cron";
@ -11,25 +19,29 @@ const prisma = new PrismaClient();
* Parse European date format (DD.MM.YYYY HH:mm:ss) to JavaScript Date * Parse European date format (DD.MM.YYYY HH:mm:ss) to JavaScript Date
*/ */
function parseEuropeanDate(dateStr: string): Date { function parseEuropeanDate(dateStr: string): Date {
if (!dateStr || typeof dateStr !== 'string') { if (!dateStr || typeof dateStr !== "string") {
throw new Error(`Invalid date string: ${dateStr}`); throw new Error(`Invalid date string: ${dateStr}`);
} }
// Handle format: "DD.MM.YYYY HH:mm:ss" // Handle format: "DD.MM.YYYY HH:mm:ss"
const [datePart, timePart] = dateStr.trim().split(' '); const [datePart, timePart] = dateStr.trim().split(" ");
if (!datePart || !timePart) { if (!datePart || !timePart) {
throw new Error(`Invalid date format: ${dateStr}. Expected format: DD.MM.YYYY HH:mm:ss`); throw new Error(
`Invalid date format: ${dateStr}. Expected format: DD.MM.YYYY HH:mm:ss`
);
} }
const [day, month, year] = datePart.split('.'); const [day, month, year] = datePart.split(".");
if (!day || !month || !year) { if (!day || !month || !year) {
throw new Error(`Invalid date part: ${datePart}. Expected format: DD.MM.YYYY`); throw new Error(
`Invalid date part: ${datePart}. Expected format: DD.MM.YYYY`
);
} }
// Convert to ISO format: YYYY-MM-DD HH:mm:ss // Convert to ISO format: YYYY-MM-DD HH:mm:ss
const isoDateStr = `${year}-${month.padStart(2, '0')}-${day.padStart(2, '0')} ${timePart}`; const isoDateStr = `${year}-${month.padStart(2, "0")}-${day.padStart(2, "0")} ${timePart}`;
const date = new Date(isoDateStr); const date = new Date(isoDateStr);
if (isNaN(date.getTime())) { if (isNaN(date.getTime())) {
@@ -42,13 +54,15 @@ function parseEuropeanDate(dateStr: string): Date {

/**
 * Helper function to parse sentiment from raw string (fallback only)
 */
function parseFallbackSentiment(
  sentimentRaw: string | null
): SentimentCategory | null {
  if (!sentimentRaw) return null;

  const sentimentStr = sentimentRaw.toLowerCase();
  if (sentimentStr.includes("positive")) {
    return SentimentCategory.POSITIVE;
  } else if (sentimentStr.includes("negative")) {
    return SentimentCategory.NEGATIVE;
  } else {
    return SentimentCategory.NEUTRAL;

@@ -60,20 +74,25 @@ function parseFallbackSentiment(sentimentRaw: string | null): SentimentCategory

 */
function parseFallbackBoolean(rawValue: string | null): boolean | null {
  if (!rawValue) return null;

  return ["true", "1", "yes", "escalated", "forwarded"].includes(
    rawValue.toLowerCase()
  );
}
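The two fallback parsers behave as in this sketch, with Prisma's `SentimentCategory` enum replaced by string literals so it runs standalone (the literals are hypothetical stand-ins):

```typescript
// Sketch of the fallback parsers above; enum swapped for string literals.
type Sentiment = "POSITIVE" | "NEGATIVE" | "NEUTRAL";

function parseFallbackSentimentSketch(raw: string | null): Sentiment | null {
  if (!raw) return null;
  const s = raw.toLowerCase();
  if (s.includes("positive")) return "POSITIVE";
  if (s.includes("negative")) return "NEGATIVE";
  return "NEUTRAL"; // any non-empty string that matches neither word
}

function parseFallbackBooleanSketch(raw: string | null): boolean | null {
  if (!raw) return null;
  // Truthy vocabulary mirrors the list in parseFallbackBoolean.
  return ["true", "1", "yes", "escalated", "forwarded"].includes(
    raw.toLowerCase()
  );
}

console.log(parseFallbackSentimentSketch("Mostly Positive")); // "POSITIVE"
console.log(parseFallbackBooleanSketch("Escalated")); // true
console.log(parseFallbackBooleanSketch(null)); // null
```

Note the asymmetry: an unrecognized non-empty sentiment falls back to `NEUTRAL`, while an unrecognized boolean string becomes `false`, not `null`.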
/**
 * Parse transcript content into Message records
 */
async function parseTranscriptIntoMessages(
  sessionId: string,
  transcriptContent: string
): Promise<void> {
  // Clear existing messages for this session
  await prisma.message.deleteMany({
    where: { sessionId },
  });

  // Split transcript into lines and parse each message
  const lines = transcriptContent.split("\n").filter((line) => line.trim());
  let order = 0;

  for (const line of lines) {

@@ -84,7 +103,7 @@ async function parseTranscriptIntoMessages(sessionId: string, transcriptContent:

    // Format 1: "User: message" or "Assistant: message"
    // Format 2: "[timestamp] User: message" or "[timestamp] Assistant: message"
    let role = "unknown";
    let content = trimmedLine;
    let timestamp: Date | null = null;

@@ -107,7 +126,7 @@ async function parseTranscriptIntoMessages(sessionId: string, transcriptContent:

      content = roleMatch[2].trim();
    } else {
      // If no role prefix found, try to infer from context or use 'unknown'
      role = "unknown";
    }

    // Skip empty content

@@ -127,14 +146,18 @@ async function parseTranscriptIntoMessages(sessionId: string, transcriptContent:

    order++;
  }

  console.log(
    `[Import Processor] ✓ Parsed ${order} messages for session ${sessionId}`
  );
}
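The role-prefix matching the hunks above describe can be sketched as follows. The actual regex lives in lines elided from this diff, so the pattern here is an assumption covering the two documented formats, `"User: hi"` and `"[timestamp] Assistant: hello"`:

```typescript
// Sketch of per-line transcript parsing (assumed regex, not the module's own).
function parseTranscriptLine(line: string): { role: string; content: string } {
  const trimmed = line.trim().replace(/^\[[^\]]*\]\s*/, ""); // drop optional [timestamp]
  const roleMatch = trimmed.match(/^(User|Assistant)\s*:\s*(.*)$/i);
  if (roleMatch) {
    return { role: roleMatch[1], content: roleMatch[2].trim() };
  }
  // No role prefix found: fall through to "unknown", as the module does.
  return { role: "unknown", content: trimmed };
}

console.log(parseTranscriptLine("User: How do I book leave?"));
console.log(parseTranscriptLine("[28.06.2025 01:52:53] Assistant: Like this."));
```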
/**
 * Process a single SessionImport record into a Session record
 * Uses new unified processing status tracking
 */
async function processSingleImport(
  importRecord: any
): Promise<{ success: boolean; error?: string }> {
  let sessionId: string | null = null;

  try {

@@ -142,7 +165,9 @@ async function processSingleImport(importRecord: any): Promise<{ success: boolea

    const startTime = parseEuropeanDate(importRecord.startTimeRaw);
    const endTime = parseEuropeanDate(importRecord.endTimeRaw);

    console.log(
      `[Import Processor] Processing ${importRecord.externalSessionId}: ${startTime.toISOString()} - ${endTime.toISOString()}`
    );

    // Create or update Session record with MINIMAL processing
    const session = await prisma.session.upsert({

@@ -179,15 +204,27 @@ async function processSingleImport(importRecord: any): Promise<{ success: boolea

    await ProcessingStatusManager.initializeSession(sessionId);

    // Mark CSV_IMPORT as completed
    await ProcessingStatusManager.completeStage(
      sessionId,
      ProcessingStage.CSV_IMPORT
    );

    // Handle transcript fetching
    let transcriptContent = importRecord.rawTranscriptContent;

    if (
      !transcriptContent &&
      importRecord.fullTranscriptUrl &&
      isValidTranscriptUrl(importRecord.fullTranscriptUrl)
    ) {
      await ProcessingStatusManager.startStage(
        sessionId,
        ProcessingStage.TRANSCRIPT_FETCH
      );

      console.log(
        `[Import Processor] Fetching transcript for ${importRecord.externalSessionId}...`
      );

      // Get company credentials for transcript fetching
      const company = await prisma.company.findUnique({

@@ -203,7 +240,9 @@ async function processSingleImport(importRecord: any): Promise<{ success: boolea

      if (transcriptResult.success) {
        transcriptContent = transcriptResult.content;
        console.log(
          `[Import Processor] ✓ Fetched transcript for ${importRecord.externalSessionId} (${transcriptContent?.length} chars)`
        );

        // Update the import record with the fetched content
        await prisma.sessionImport.update({
@@ -211,36 +250,61 @@ async function processSingleImport(importRecord: any): Promise<{ success: boolea

          data: { rawTranscriptContent: transcriptContent },
        });

        await ProcessingStatusManager.completeStage(
          sessionId,
          ProcessingStage.TRANSCRIPT_FETCH,
          {
            contentLength: transcriptContent?.length || 0,
            url: importRecord.fullTranscriptUrl,
          }
        );
      } else {
        console.log(
          `[Import Processor] ⚠️ Failed to fetch transcript for ${importRecord.externalSessionId}: ${transcriptResult.error}`
        );
        await ProcessingStatusManager.failStage(
          sessionId,
          ProcessingStage.TRANSCRIPT_FETCH,
          transcriptResult.error || "Unknown error"
        );
      }
    } else if (!importRecord.fullTranscriptUrl) {
      // No transcript URL available - skip this stage
      await ProcessingStatusManager.skipStage(
        sessionId,
        ProcessingStage.TRANSCRIPT_FETCH,
        "No transcript URL provided"
      );
    } else {
      // Transcript already fetched
      await ProcessingStatusManager.completeStage(
        sessionId,
        ProcessingStage.TRANSCRIPT_FETCH,
        {
          contentLength: transcriptContent?.length || 0,
          source: "already_fetched",
        }
      );
    }

    // Handle session creation (parse messages)
    await ProcessingStatusManager.startStage(
      sessionId,
      ProcessingStage.SESSION_CREATION
    );

    if (transcriptContent) {
      await parseTranscriptIntoMessages(sessionId, transcriptContent);
    }

    await ProcessingStatusManager.completeStage(
      sessionId,
      ProcessingStage.SESSION_CREATION,
      {
        hasTranscript: !!transcriptContent,
        transcriptLength: transcriptContent?.length || 0,
      }
    );

    return { success: true };
  } catch (error) {
@@ -249,13 +313,31 @@ async function processSingleImport(importRecord: any): Promise<{ success: boolea

    // Mark the current stage as failed if we have a sessionId
    if (sessionId) {
      // Determine which stage failed based on the error
      if (
        errorMessage.includes("transcript") ||
        errorMessage.includes("fetch")
      ) {
        await ProcessingStatusManager.failStage(
          sessionId,
          ProcessingStage.TRANSCRIPT_FETCH,
          errorMessage
        );
      } else if (
        errorMessage.includes("message") ||
        errorMessage.includes("parse")
      ) {
        await ProcessingStatusManager.failStage(
          sessionId,
          ProcessingStage.SESSION_CREATION,
          errorMessage
        );
      } else {
        // General failure - mark CSV_IMPORT as failed
        await ProcessingStatusManager.failStage(
          sessionId,
          ProcessingStage.CSV_IMPORT,
          errorMessage
        );
      }
    }
@@ -270,8 +352,10 @@ async function processSingleImport(importRecord: any): Promise<{ success: boolea

/**
 * Process unprocessed SessionImport records into Session records
 * Uses new processing status system to find imports that need processing
 */
export async function processQueuedImports(
  batchSize: number = 50
): Promise<void> {
  console.log("[Import Processor] Starting to process unprocessed imports...");

  let totalSuccessCount = 0;
  let totalErrorCount = 0;

@@ -285,20 +369,24 @@ export async function processQueuedImports(batchSize: number = 50): Promise<void

      },
      take: batchSize,
      orderBy: {
        createdAt: "asc", // Process oldest first
      },
    });

    if (unprocessedImports.length === 0) {
      if (batchNumber === 1) {
        console.log("[Import Processor] No unprocessed imports found");
      } else {
        console.log(
          `[Import Processor] All batches completed. Total: ${totalSuccessCount} successful, ${totalErrorCount} failed`
        );
      }
      return;
    }

    console.log(
      `[Import Processor] Processing batch ${batchNumber}: ${unprocessedImports.length} imports...`
    );

    let batchSuccessCount = 0;
    let batchErrorCount = 0;

@@ -310,20 +398,28 @@ export async function processQueuedImports(batchSize: number = 50): Promise<void

      if (result.success) {
        batchSuccessCount++;
        totalSuccessCount++;
        console.log(
          `[Import Processor] ✓ Processed import ${importRecord.externalSessionId}`
        );
      } else {
        batchErrorCount++;
        totalErrorCount++;
        console.log(
          `[Import Processor] ✗ Failed to process import ${importRecord.externalSessionId}: ${result.error}`
        );
      }
    }

    console.log(
      `[Import Processor] Batch ${batchNumber} completed: ${batchSuccessCount} successful, ${batchErrorCount} failed`
    );

    batchNumber++;

    // If this batch was smaller than the batch size, we're done
    if (unprocessedImports.length < batchSize) {
      console.log(
        `[Import Processor] All batches completed. Total: ${totalSuccessCount} successful, ${totalErrorCount} failed`
      );
      return;
    }
  }
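The batching strategy above — fetch up to `batchSize`, process, and stop as soon as a batch comes back short — can be sketched with the Prisma query swapped for an in-memory queue (all names and data here are hypothetical):

```typescript
// Sketch of the batch-until-short-batch loop used by processQueuedImports.
async function processInBatches<T>(
  fetchBatch: (take: number) => Promise<T[]>,
  handle: (item: T) => Promise<boolean>,
  batchSize = 50
): Promise<{ ok: number; failed: number }> {
  let ok = 0;
  let failed = 0;
  for (;;) {
    const batch = await fetchBatch(batchSize);
    if (batch.length === 0) break;
    for (const item of batch) {
      (await handle(item)) ? ok++ : failed++;
    }
    // A short batch means the queue is drained; stop without another query.
    if (batch.length < batchSize) break;
  }
  return { ok, failed };
}

const queue = Array.from({ length: 7 }, (_, i) => i);
processInBatches(
  async (take) => queue.splice(0, take), // stand-in for the Prisma findMany
  async (n) => n % 2 === 0,
  3
).then((r) => console.log(r)); // { ok: 4, failed: 3 }
```

The short-batch check saves one empty round trip per run compared to looping until an empty result.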
@@ -336,15 +432,20 @@ export function startImportProcessingScheduler(): void {

  const config = getSchedulerConfig();

  if (!config.enabled) {
    console.log("[Import Processing Scheduler] Disabled via configuration");
    return;
  }

  // Use a more frequent interval for import processing (every 5 minutes by default)
  const interval = process.env.IMPORT_PROCESSING_INTERVAL || "*/5 * * * *";
  const batchSize = parseInt(
    process.env.IMPORT_PROCESSING_BATCH_SIZE || "50",
    10
  );

  console.log(
    `[Import Processing Scheduler] Starting with interval: ${interval}`
  );
  console.log(`[Import Processing Scheduler] Batch size: ${batchSize}`);

  cron.schedule(interval, async () => {
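The env-driven configuration above reads a cron expression and a batch size with defaults. One sketch of that pattern, hardened with a NaN guard the hunk does not show (so treat the guard as a suggestion, not the module's behavior):

```typescript
// Sketch of the scheduler's env handling; variable names mirror the ones used
// in startImportProcessingScheduler.
function readSchedulerEnv(env: Record<string, string | undefined>) {
  const interval = env.IMPORT_PROCESSING_INTERVAL || "*/5 * * * *";
  const parsed = parseInt(env.IMPORT_PROCESSING_BATCH_SIZE || "50", 10);
  // Guard against NaN / non-positive values from a malformed env var.
  const batchSize = Number.isFinite(parsed) && parsed > 0 ? parsed : 50;
  return { interval, batchSize };
}

console.log(readSchedulerEnv({})); // { interval: "*/5 * * * *", batchSize: 50 }
console.log(readSchedulerEnv({ IMPORT_PROCESSING_BATCH_SIZE: "abc" })); // batchSize falls back to 50
```

Without the guard, `IMPORT_PROCESSING_BATCH_SIZE="abc"` would flow `NaN` into `take:` on the Prisma query.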


@@ -359,7 +359,7 @@ export function sessionMetrics(

    // Track hourly usage for peak time calculation
    if (session.startTime) {
      const hour = new Date(session.startTime).getHours();
      const hourKey = `${hour.toString().padStart(2, "0")}:00`;
      hourlySessionCounts[hourKey] = (hourlySessionCounts[hourKey] || 0) + 1;
    }

@@ -493,12 +493,16 @@ export function sessionMetrics(

    // 1. Extract questions from user messages (if available)
    if (session.messages) {
      session.messages
        .filter((msg) => msg.role === "User")
        .forEach((msg) => {
          const content = msg.content.trim();
          // Simple heuristic: if message ends with ? or contains question words, treat as question
          if (
            content.endsWith("?") ||
            /\b(what|when|where|why|how|who|which|can|could|would|will|is|are|do|does|did)\b/i.test(
              content
            )
          ) {
            questionCounts[content] = (questionCounts[content] || 0) + 1;
          }
        });
@@ -507,8 +511,12 @@ export function sessionMetrics(

    // 3. Extract questions from initial message as fallback
    if (session.initialMsg) {
      const content = session.initialMsg.trim();
      if (
        content.endsWith("?") ||
        /\b(what|when|where|why|how|who|which|can|could|would|will|is|are|do|does|did)\b/i.test(
          content
        )
      ) {
        questionCounts[content] = (questionCounts[content] || 0) + 1;
      }
    }
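The question heuristic used in both branches above reduces to a small predicate: a message counts as a question if it ends with `?` or contains a common interrogative word.

```typescript
// Sketch of the question heuristic from sessionMetrics.
const QUESTION_WORDS =
  /\b(what|when|where|why|how|who|which|can|could|would|will|is|are|do|does|did)\b/i;

function looksLikeQuestion(content: string): boolean {
  const trimmed = content.trim();
  return trimmed.endsWith("?") || QUESTION_WORDS.test(trimmed);
}

console.log(looksLikeQuestion("How do I reset my password?")); // true
console.log(looksLikeQuestion("Thanks, bye")); // false
console.log(looksLikeQuestion("tell me what the policy says")); // true
```

The word list is deliberately loose: statements containing "is" or "are" also match, so some non-questions will be counted.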
@@ -580,20 +588,23 @@ export function sessionMetrics(

  // Calculate new metrics

  // 1. Average Daily Costs (euros)
  const avgDailyCosts =
    numDaysWithSessions > 0 ? totalTokensEur / numDaysWithSessions : 0;

  // 2. Peak Usage Time
  let peakUsageTime = "N/A";
  if (Object.keys(hourlySessionCounts).length > 0) {
    const peakHour = Object.entries(hourlySessionCounts).sort(
      ([, a], [, b]) => b - a
    )[0][0];
    const peakHourNum = parseInt(peakHour.split(":")[0]);
    const endHour = (peakHourNum + 1) % 24;
    peakUsageTime = `${peakHour}-${endHour.toString().padStart(2, "0")}:00`;
  }
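The peak-usage computation above — bucket sessions by hour, pick the busiest bucket, and render a one-hour window — works like this standalone sketch:

```typescript
// Sketch of the peak-usage-window calculation from sessionMetrics.
function peakUsageWindow(hourlyCounts: Record<string, number>): string {
  const entries = Object.entries(hourlyCounts);
  if (entries.length === 0) return "N/A";
  // Busiest "HH:00" bucket first.
  const [peakHour] = entries.sort(([, a], [, b]) => b - a)[0];
  const start = parseInt(peakHour.split(":")[0], 10);
  const end = (start + 1) % 24; // wraps 23:00 around to 00:00
  return `${peakHour}-${end.toString().padStart(2, "0")}:00`;
}

console.log(peakUsageWindow({ "09:00": 4, "14:00": 11, "23:00": 2 })); // "14:00-15:00"
console.log(peakUsageWindow({ "23:00": 5 })); // "23:00-00:00"
console.log(peakUsageWindow({})); // "N/A"
```

The `% 24` keeps the late-night edge case sensible: the 23:00 peak renders as "23:00-00:00" rather than "23:00-24:00".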
  // 3. Resolved Chats Percentage
  const resolvedChatsPercentage =
    totalSessions > 0 ? (resolvedChatsCount / totalSessions) * 100 : 0;

  // 4. Top 5 Asked Questions
  const topQuestions: TopQuestion[] = Object.entries(questionCounts)


@@ -1,6 +1,11 @@

// Enhanced session processing scheduler with AI cost tracking and question management
import cron from "node-cron";
import {
  PrismaClient,
  SentimentCategory,
  SessionCategory,
  ProcessingStage,
} from "@prisma/client";
import fetch from "node-fetch";
import { getSchedulerConfig } from "./schedulerConfig";
import { ProcessingStatusManager } from "./processingStatusManager";
@@ -44,10 +49,10 @@ async function getCurrentModelPricing(modelName: string): Promise<{

        effectiveFrom: { lte: new Date() },
        OR: [
          { effectiveUntil: null },
          { effectiveUntil: { gte: new Date() } },
        ],
      },
      orderBy: { effectiveFrom: "desc" },
      take: 1,
    },
  },
@@ -69,7 +74,20 @@ interface ProcessedData {

  sentiment: "POSITIVE" | "NEUTRAL" | "NEGATIVE";
  escalated: boolean;
  forwarded_hr: boolean;
  category:
    | "SCHEDULE_HOURS"
    | "LEAVE_VACATION"
    | "SICK_LEAVE_RECOVERY"
    | "SALARY_COMPENSATION"
    | "CONTRACT_HOURS"
    | "ONBOARDING"
    | "OFFBOARDING"
    | "WORKWEAR_STAFF_PASS"
    | "TEAM_CONTACTS"
    | "PERSONAL_QUESTIONS"
    | "ACCESS_LOGIN"
    | "SOCIAL_QUESTIONS"
    | "UNRECOGNIZED_OTHER";
  questions: string[];
  summary: string;
  session_id: string;
@@ -87,7 +105,7 @@ interface ProcessingResult {

async function recordAIProcessingRequest(
  sessionId: string,
  openaiResponse: any,
  processingType: string = "session_analysis"
): Promise<void> {
  const usage = openaiResponse.usage;
  const model = openaiResponse.model;

@@ -104,7 +122,8 @@ async function recordAIProcessingRequest(

  const finalPricing = pricing || fallbackPricing;

  const promptCost = usage.prompt_tokens * finalPricing.promptTokenCost;
  const completionCost =
    usage.completion_tokens * finalPricing.completionTokenCost;
  const totalCostUsd = promptCost + completionCost;
  const totalCostEur = totalCostUsd * USD_TO_EUR_RATE;
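The per-request cost arithmetic above is simple token-times-rate accounting. A standalone sketch, where the per-token rates and the USD→EUR rate are made-up sample values rather than the real pricing table:

```typescript
// Sketch of the cost calculation in recordAIProcessingRequest.
interface Usage {
  prompt_tokens: number;
  completion_tokens: number;
}

function computeCost(
  usage: Usage,
  promptTokenCost: number, // USD per prompt token
  completionTokenCost: number, // USD per completion token
  usdToEur: number
): { usd: number; eur: number } {
  const usd =
    usage.prompt_tokens * promptTokenCost +
    usage.completion_tokens * completionTokenCost;
  return { usd, eur: usd * usdToEur };
}

const cost = computeCost(
  { prompt_tokens: 1000, completion_tokens: 500 },
  0.0000025, // sample: $2.50 per 1M prompt tokens
  0.00001, // sample: $10 per 1M completion tokens
  0.92 // sample exchange rate
);
console.log(cost.usd.toFixed(6)); // "0.007500"
```

Note the rates are per token, so published per-million prices must be divided by 1,000,000 before storage.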
@@ -123,10 +142,14 @@ async function recordAIProcessingRequest(

      // Detailed breakdown
      cachedTokens: usage.prompt_tokens_details?.cached_tokens || null,
      audioTokensPrompt: usage.prompt_tokens_details?.audio_tokens || null,
      reasoningTokens:
        usage.completion_tokens_details?.reasoning_tokens || null,
      audioTokensCompletion:
        usage.completion_tokens_details?.audio_tokens || null,
      acceptedPredictionTokens:
        usage.completion_tokens_details?.accepted_prediction_tokens || null,
      rejectedPredictionTokens:
        usage.completion_tokens_details?.rejected_prediction_tokens || null,

      promptTokenCost: finalPricing.promptTokenCost,
      completionTokenCost: finalPricing.completionTokenCost,

@@ -135,7 +158,7 @@ async function recordAIProcessingRequest(

      processingType,
      success: true,
      completedAt: new Date(),
    },
  });
}

@@ -150,7 +173,7 @@ async function recordFailedAIProcessingRequest(

  await prisma.aIProcessingRequest.create({
    data: {
      sessionId,
      model: "unknown",
      promptTokens: 0,
      completionTokens: 0,
      totalTokens: 0,

@@ -161,17 +184,20 @@ async function recordFailedAIProcessingRequest(

      success: false,
      errorMessage,
      completedAt: new Date(),
    },
  });
}
/**
 * Process questions into separate Question and SessionQuestion tables
 */
async function processQuestions(
  sessionId: string,
  questions: string[]
): Promise<void> {
  // Clear existing questions for this session
  await prisma.sessionQuestion.deleteMany({
    where: { sessionId },
  });

  // Process each question

@@ -183,7 +209,7 @@ async function processQuestions(sessionId: string, questions: string[]): Promise

    const question = await prisma.question.upsert({
      where: { content: questionText.trim() },
      create: { content: questionText.trim() },
      update: {},
    });

    // Link to session

@@ -191,8 +217,8 @@ async function processQuestions(sessionId: string, questions: string[]): Promise

      data: {
        sessionId,
        questionId: question.id,
        order: index,
      },
    });
  }
}
@@ -204,8 +230,8 @@ async function calculateMessagesSent(sessionId: string): Promise<number> {

  const userMessageCount = await prisma.message.count({
    where: {
      sessionId,
      role: { in: ["user", "User"] }, // Handle both cases
    },
  });
  return userMessageCount;
}

@@ -213,10 +239,13 @@ async function calculateMessagesSent(sessionId: string): Promise<number> {

/**
 * Calculate endTime from latest Message timestamp
 */
async function calculateEndTime(
  sessionId: string,
  fallbackEndTime: Date
): Promise<Date> {
  const latestMessage = await prisma.message.findFirst({
    where: { sessionId },
    orderBy: { timestamp: "desc" },
  });

  return latestMessage?.timestamp || fallbackEndTime;
@@ -225,7 +254,11 @@ async function calculateEndTime(sessionId: string, fallbackEndTime: Date): Promi

/**
 * Processes a session transcript using OpenAI API
 */
async function processTranscriptWithOpenAI(
  sessionId: string,
  transcript: string,
  companyId: string
): Promise<ProcessedData> {
  if (!OPENAI_API_KEY) {
    throw new Error("OPENAI_API_KEY environment variable is not set");
  }

@@ -293,7 +326,11 @@ async function processTranscriptWithOpenAI(sessionId: string, transcript: string

  const openaiResponse: any = await response.json();

  // Record the AI processing request for cost tracking
  await recordAIProcessingRequest(
    sessionId,
    openaiResponse,
    "session_analysis"
  );

  const processedData = JSON.parse(openaiResponse.choices[0].message.content);

@@ -305,7 +342,7 @@ async function processTranscriptWithOpenAI(sessionId: string, transcript: string

    // Record failed request
    await recordFailedAIProcessingRequest(
      sessionId,
      "session_analysis",
      error instanceof Error ? error.message : String(error)
    );
@@ -319,8 +356,14 @@ async function processTranscriptWithOpenAI(sessionId: string, transcript: string

 */
function validateOpenAIResponse(data: any): void {
  const requiredFields = [
    "language",
    "sentiment",
    "escalated",
    "forwarded_hr",
    "category",
    "questions",
    "summary",
    "session_id",
  ];

  for (const field of requiredFields) {

@@ -331,11 +374,15 @@ function validateOpenAIResponse(data: any): void {

  // Validate field types and values
  if (typeof data.language !== "string" || !/^[a-z]{2}$/.test(data.language)) {
    throw new Error(
      "Invalid language format. Expected ISO 639-1 code (e.g., 'en')"
    );
  }

  if (!["POSITIVE", "NEUTRAL", "NEGATIVE"].includes(data.sentiment)) {
    throw new Error(
      "Invalid sentiment. Expected 'POSITIVE', 'NEUTRAL', or 'NEGATIVE'"
    );
  }

  if (typeof data.escalated !== "boolean") {
@@ -347,22 +394,39 @@ function validateOpenAIResponse(data: any): void {

  }

  const validCategories = [
    "SCHEDULE_HOURS",
    "LEAVE_VACATION",
    "SICK_LEAVE_RECOVERY",
    "SALARY_COMPENSATION",
    "CONTRACT_HOURS",
    "ONBOARDING",
    "OFFBOARDING",
    "WORKWEAR_STAFF_PASS",
    "TEAM_CONTACTS",
    "PERSONAL_QUESTIONS",
    "ACCESS_LOGIN",
    "SOCIAL_QUESTIONS",
    "UNRECOGNIZED_OTHER",
  ];

  if (!validCategories.includes(data.category)) {
    throw new Error(
      `Invalid category. Expected one of: ${validCategories.join(", ")}`
    );
  }

  if (!Array.isArray(data.questions)) {
    throw new Error("Invalid questions. Expected array of strings");
  }

  if (
    typeof data.summary !== "string" ||
    data.summary.length < 10 ||
    data.summary.length > 300
  ) {
    throw new Error(
      "Invalid summary. Expected string between 10-300 characters"
    );
  }

  if (typeof data.session_id !== "string") {
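The hand-rolled validation above can be condensed to a sketch covering three representative checks (language, sentiment, summary length); the sample payloads are made up. Since this commit introduces Zod elsewhere, a `z.object()` schema would be a natural replacement for this function, but the sketch stays dependency-free:

```typescript
// Sketch of validateOpenAIResponse, returning errors instead of throwing.
function validateResponseSketch(data: any): string[] {
  const errors: string[] = [];
  if (typeof data.language !== "string" || !/^[a-z]{2}$/.test(data.language)) {
    errors.push("language must be an ISO 639-1 code, e.g. 'en'");
  }
  if (!["POSITIVE", "NEUTRAL", "NEGATIVE"].includes(data.sentiment)) {
    errors.push("sentiment must be POSITIVE, NEUTRAL, or NEGATIVE");
  }
  if (
    typeof data.summary !== "string" ||
    data.summary.length < 10 ||
    data.summary.length > 300
  ) {
    errors.push("summary must be 10-300 characters");
  }
  return errors;
}

console.log(
  validateResponseSketch({
    language: "en",
    sentiment: "NEUTRAL",
    summary: "Employee asked about leave.",
  })
); // []
console.log(
  validateResponseSketch({ language: "EN", sentiment: "ok", summary: "short" })
    .length
); // 3
```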
@ -384,11 +448,15 @@ async function processSingleSession(session: any): Promise<ProcessingResult> {
  try {
    // Mark AI analysis as started
    await ProcessingStatusManager.startStage(
      session.id,
      ProcessingStage.AI_ANALYSIS
    );

    // Convert messages back to transcript format for OpenAI processing
    const transcript = session.messages
      .map(
        (msg: any) =>
          `[${new Date(msg.timestamp)
            .toLocaleString("en-GB", {
              day: "2-digit",
@ -402,13 +470,20 @@ async function processSingleSession(session: any): Promise<ProcessingResult> {
      )
      .join("\n");

    const processedData = await processTranscriptWithOpenAI(
      session.id,
      transcript,
      session.companyId
    );

    // Calculate messagesSent from actual Message records
    const messagesSent = await calculateMessagesSent(session.id);

    // Calculate endTime from latest Message timestamp
    const calculatedEndTime = await calculateEndTime(
      session.id,
      session.endTime
    );

    // Update the session with processed data
    await prisma.session.update({
@ -426,23 +501,34 @@ async function processSingleSession(session: any): Promise<ProcessingResult> {
    });

    // Mark AI analysis as completed
    await ProcessingStatusManager.completeStage(
      session.id,
      ProcessingStage.AI_ANALYSIS,
      {
        language: processedData.language,
        sentiment: processedData.sentiment,
        category: processedData.category,
        questionsCount: processedData.questions.length,
      }
    );

    // Start question extraction stage
    await ProcessingStatusManager.startStage(
      session.id,
      ProcessingStage.QUESTION_EXTRACTION
    );

    // Process questions into separate tables
    await processQuestions(session.id, processedData.questions);

    // Mark question extraction as completed
    await ProcessingStatusManager.completeStage(
      session.id,
      ProcessingStage.QUESTION_EXTRACTION,
      {
        questionsProcessed: processedData.questions.length,
      }
    );

    return {
      sessionId: session.id,
@ -467,7 +553,10 @@ async function processSingleSession(session: any): Promise<ProcessingResult> {
/**
 * Process sessions in parallel with concurrency limit
 */
async function processSessionsInParallel(
  sessions: any[],
  maxConcurrency: number = 5
): Promise<ProcessingResult[]> {
  const results: Promise<ProcessingResult>[] = [];
  const executing: Promise<ProcessingResult>[] = [];
@ -486,7 +575,7 @@ async function processSessionsInParallel(sessions: any[], maxConcurrency: number
    if (executing.length >= maxConcurrency) {
      await Promise.race(executing);
      const completedIndex = executing.findIndex((p) => p === promise);
      if (completedIndex !== -1) {
        executing.splice(completedIndex, 1);
      }
@ -499,27 +588,37 @@ async function processSessionsInParallel(sessions: any[], maxConcurrency: number
/**
 * Process unprocessed sessions using the new processing status system
 */
export async function processUnprocessedSessions(
  batchSize: number | null = null,
  maxConcurrency: number = 5
): Promise<void> {
  process.stdout.write(
    "[ProcessingScheduler] Starting to process sessions needing AI analysis...\n"
  );

  // Get sessions that need AI processing using the new status system
  const sessionsNeedingAI =
    await ProcessingStatusManager.getSessionsNeedingProcessing(
      ProcessingStage.AI_ANALYSIS,
      batchSize || 50
    );

  if (sessionsNeedingAI.length === 0) {
    process.stdout.write(
      "[ProcessingScheduler] No sessions found requiring AI processing.\n"
    );
    return;
  }

  // Get session IDs that need processing
  const sessionIds = sessionsNeedingAI.map(
    (statusRecord) => statusRecord.sessionId
  );

  // Fetch full session data with messages
  const sessionsToProcess = await prisma.session.findMany({
    where: {
      id: { in: sessionIds },
    },
    include: {
      messages: {
@ -534,7 +633,9 @@ export async function processUnprocessedSessions(batchSize: number | null = null
  );

  if (sessionsWithMessages.length === 0) {
    process.stdout.write(
      "[ProcessingScheduler] No sessions with messages found requiring processing.\n"
    );
    return;
  }
@ -543,16 +644,25 @@ export async function processUnprocessedSessions(batchSize: number | null = null
  );

  const startTime = Date.now();
  const results = await processSessionsInParallel(
    sessionsWithMessages,
    maxConcurrency
  );
  const endTime = Date.now();

  const successCount = results.filter((r) => r.success).length;
  const errorCount = results.filter((r) => !r.success).length;

  process.stdout.write("[ProcessingScheduler] Session processing complete.\n");
  process.stdout.write(
    `[ProcessingScheduler] Successfully processed: ${successCount} sessions.\n`
  );
  process.stdout.write(
    `[ProcessingScheduler] Failed to process: ${errorCount} sessions.\n`
  );
  process.stdout.write(
    `[ProcessingScheduler] Total processing time: ${((endTime - startTime) / 1000).toFixed(2)}s\n`
  );
}

/**
@ -576,11 +686,11 @@ export async function getAIProcessingCosts(): Promise<{
  });

  const successfulRequests = await prisma.aIProcessingRequest.count({
    where: { success: true },
  });

  const failedRequests = await prisma.aIProcessingRequest.count({
    where: { success: false },
  });

  return {
@ -599,22 +709,32 @@ export function startProcessingScheduler(): void {
  const config = getSchedulerConfig();

  if (!config.enabled) {
    console.log("[Processing Scheduler] Disabled via configuration");
    return;
  }

  console.log(
    `[Processing Scheduler] Starting with interval: ${config.sessionProcessing.interval}`
  );
  console.log(
    `[Processing Scheduler] Batch size: ${config.sessionProcessing.batchSize === 0 ? "unlimited" : config.sessionProcessing.batchSize}`
  );
  console.log(
    `[Processing Scheduler] Concurrency: ${config.sessionProcessing.concurrency}`
  );

  cron.schedule(config.sessionProcessing.interval, async () => {
    try {
      await processUnprocessedSessions(
        config.sessionProcessing.batchSize === 0
          ? null
          : config.sessionProcessing.batchSize,
        config.sessionProcessing.concurrency
      );
    } catch (error) {
      process.stderr.write(
        `[ProcessingScheduler] Error in scheduler: ${error}\n`
      );
    }
  });
}


@ -1,4 +1,8 @@
import {
  PrismaClient,
  ProcessingStage,
  ProcessingStatus,
} from "@prisma/client";

const prisma = new PrismaClient();
@ -6,7 +10,6 @@ const prisma = new PrismaClient();
 * Centralized processing status management
 */
export class ProcessingStatusManager {
  /**
   * Initialize processing status for a session with all stages set to PENDING
   */
@ -21,7 +24,7 @@ export class ProcessingStatusManager {
    // Create all processing status records for this session
    await prisma.sessionProcessingStatus.createMany({
      data: stages.map((stage) => ({
        sessionId,
        stage,
        status: ProcessingStatus.PENDING,
@ -40,7 +43,7 @@ export class ProcessingStatusManager {
  ): Promise<void> {
    await prisma.sessionProcessingStatus.upsert({
      where: {
        sessionId_stage: { sessionId, stage },
      },
      update: {
        status: ProcessingStatus.IN_PROGRESS,
@ -68,7 +71,7 @@ export class ProcessingStatusManager {
  ): Promise<void> {
    await prisma.sessionProcessingStatus.upsert({
      where: {
        sessionId_stage: { sessionId, stage },
      },
      update: {
        status: ProcessingStatus.COMPLETED,
@ -98,7 +101,7 @@ export class ProcessingStatusManager {
  ): Promise<void> {
    await prisma.sessionProcessingStatus.upsert({
      where: {
        sessionId_stage: { sessionId, stage },
      },
      update: {
        status: ProcessingStatus.FAILED,
@ -130,7 +133,7 @@ export class ProcessingStatusManager {
  ): Promise<void> {
    await prisma.sessionProcessingStatus.upsert({
      where: {
        sessionId_stage: { sessionId, stage },
      },
      update: {
        status: ProcessingStatus.SKIPPED,
@ -154,7 +157,7 @@ export class ProcessingStatusManager {
  static async getSessionStatus(sessionId: string) {
    return await prisma.sessionProcessingStatus.findMany({
      where: { sessionId },
      orderBy: { stage: "asc" },
    });
  }
@ -179,7 +182,7 @@ export class ProcessingStatusManager {
        },
      },
      take: limit,
      orderBy: { session: { createdAt: "asc" } },
    });
  }
@ -189,7 +192,7 @@ export class ProcessingStatusManager {
  static async getPipelineStatus() {
    // Get counts by stage and status
    const statusCounts = await prisma.sessionProcessingStatus.groupBy({
      by: ["stage", "status"],
      _count: { id: true },
    });
@ -233,17 +236,20 @@ export class ProcessingStatusManager {
          },
        },
      },
      orderBy: { completedAt: "desc" },
    });
  }

  /**
   * Reset a failed stage for retry
   */
  static async resetStageForRetry(
    sessionId: string,
    stage: ProcessingStage
  ): Promise<void> {
    await prisma.sessionProcessingStatus.update({
      where: {
        sessionId_stage: { sessionId, stage },
      },
      data: {
        status: ProcessingStatus.PENDING,
@ -257,10 +263,13 @@ export class ProcessingStatusManager {
  /**
   * Check if a session has completed a specific stage
   */
  static async hasCompletedStage(
    sessionId: string,
    stage: ProcessingStage
  ): Promise<boolean> {
    const status = await prisma.sessionProcessingStatus.findUnique({
      where: {
        sessionId_stage: { sessionId, stage },
      },
    });
@ -270,7 +279,10 @@ export class ProcessingStatusManager {
  /**
   * Check if a session is ready for a specific stage (previous stages completed)
   */
  static async isReadyForStage(
    sessionId: string,
    stage: ProcessingStage
  ): Promise<boolean> {
    const stageOrder = [
      ProcessingStage.CSV_IMPORT,
      ProcessingStage.TRANSCRIPT_FETCH,


@ -8,11 +8,13 @@ export function startCsvImportScheduler() {
  const config = getSchedulerConfig();

  if (!config.enabled) {
    console.log("[CSV Import Scheduler] Disabled via configuration");
    return;
  }

  console.log(
    `[CSV Import Scheduler] Starting with interval: ${config.csvImport.interval}`
  );

  cron.schedule(config.csvImport.interval, async () => {
    const companies = await prisma.company.findMany();


@ -1,7 +1,10 @@
// Legacy scheduler configuration - now uses centralized env management
// This file is kept for backward compatibility but delegates to lib/env.ts
import {
  getSchedulerConfig as getEnvSchedulerConfig,
  logEnvConfig,
} from "./env";

export interface SchedulerConfig {
  enabled: boolean;


@ -23,7 +23,7 @@ export async function fetchTranscriptContent(
  if (!url || !url.trim()) {
    return {
      success: false,
      error: "No transcript URL provided",
    };
  }
@ -34,7 +34,7 @@ export async function fetchTranscriptContent(
      : undefined;

    const headers: Record<string, string> = {
      "User-Agent": "LiveDash-Transcript-Fetcher/1.0",
    };

    if (authHeader) {
@ -46,7 +46,7 @@ export async function fetchTranscriptContent(
    const timeoutId = setTimeout(() => controller.abort(), 30000); // 30 second timeout

    const response = await fetch(url, {
      method: "GET",
      headers,
      signal: controller.signal,
    });
@ -65,7 +65,7 @@ export async function fetchTranscriptContent(
    if (!content || content.trim().length === 0) {
      return {
        success: false,
        error: "Empty transcript content",
      };
    }
@ -73,29 +73,28 @@ export async function fetchTranscriptContent(
      success: true,
      content: content.trim(),
    };
  } catch (error) {
    const errorMessage = error instanceof Error ? error.message : String(error);

    // Handle common network errors
    if (errorMessage.includes("ENOTFOUND")) {
      return {
        success: false,
        error: "Domain not found",
      };
    }

    if (errorMessage.includes("ECONNREFUSED")) {
      return {
        success: false,
        error: "Connection refused",
      };
    }

    if (errorMessage.includes("timeout")) {
      return {
        success: false,
        error: "Request timeout",
      };
    }
@ -112,13 +111,13 @@ export async function fetchTranscriptContent(
 * @returns boolean indicating if URL appears valid
 */
export function isValidTranscriptUrl(url: string): boolean {
  if (!url || typeof url !== "string") {
    return false;
  }

  try {
    const parsedUrl = new URL(url);
    return parsedUrl.protocol === "http:" || parsedUrl.protocol === "https:";
  } catch {
    return false;
  }


@ -1,5 +1,5 @@
// Transcript parsing utility for converting raw transcript content into structured messages
import { prisma } from "./prisma.js";

export interface ParsedMessage {
  sessionId: string;
@ -19,7 +19,9 @@ export interface TranscriptParseResult {
 * Parse European date format (DD.MM.YYYY HH:mm:ss) to Date object
 */
function parseEuropeanDate(dateStr: string): Date {
  const match = dateStr.match(
    /(\d{2})\.(\d{2})\.(\d{4}) (\d{2}):(\d{2}):(\d{2})/
  );
  if (!match) {
    throw new Error(`Invalid date format: ${dateStr}`);
  }
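The helper above captures six groups from a `DD.MM.YYYY HH:mm:ss` string and feeds them to the `Date` constructor. A self-contained sketch of the same idea (anchored regex and local-time semantics assumed; the original's extraction of the capture groups is not shown in this hunk):

```typescript
// Parse "DD.MM.YYYY HH:mm:ss" into a local-time Date; throws on any
// deviation from the expected shape.
function parseEuropeanDate(dateStr: string): Date {
  const match = dateStr.match(
    /^(\d{2})\.(\d{2})\.(\d{4}) (\d{2}):(\d{2}):(\d{2})$/
  );
  if (!match) {
    throw new Error(`Invalid date format: ${dateStr}`);
  }
  const [, day, month, year, hours, minutes, seconds] = match;
  // JS Date months are zero-based, hence the `- 1`.
  return new Date(
    Number(year),
    Number(month) - 1,
    Number(day),
    Number(hours),
    Number(minutes),
    Number(seconds)
  );
}
```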
@ -51,13 +53,17 @@ export function parseTranscriptToMessages(
    if (!content || !content.trim()) {
      return {
        success: false,
        error: "Empty transcript content",
      };
    }

    const messages: ParsedMessage[] = [];
    const lines = content.split("\n");
    let currentMessage: {
      role: string;
      content: string;
      timestamp?: string;
    } | null = null;
    let order = 0;

    for (const line of lines) {
@ -69,56 +75,64 @@ export function parseTranscriptToMessages(
      }

      // Check if line starts with a timestamp and role [DD.MM.YYYY HH:MM:SS] Role: content
      const timestampRoleMatch = trimmedLine.match(
        /^\[(\d{2}\.\d{2}\.\d{4} \d{2}:\d{2}:\d{2})\]\s+(User|Assistant|System|user|assistant|system):\s*(.*)$/i
      );

      // Check if line starts with just a role (User:, Assistant:, System:, etc.)
      const roleMatch = trimmedLine.match(
        /^(User|Assistant|System|user|assistant|system):\s*(.*)$/i
      );

      if (timestampRoleMatch) {
        // Save previous message if exists
        if (currentMessage) {
          messages.push({
            sessionId: "", // Will be set by caller
            timestamp: new Date(), // Will be calculated below
            role: currentMessage.role,
            content: currentMessage.content.trim(),
            order: order++,
          });
        }

        // Start new message with timestamp
        const timestamp = timestampRoleMatch[1];
        const role =
          timestampRoleMatch[2].charAt(0).toUpperCase() +
          timestampRoleMatch[2].slice(1).toLowerCase();
        const content = timestampRoleMatch[3] || "";

        currentMessage = {
          role,
          content,
          timestamp, // Store the timestamp for later parsing
        };
      } else if (roleMatch) {
        // Save previous message if exists
        if (currentMessage) {
          messages.push({
            sessionId: "", // Will be set by caller
            timestamp: new Date(), // Will be calculated below
            role: currentMessage.role,
            content: currentMessage.content.trim(),
            order: order++,
          });
        }

        // Start new message without timestamp
        const role =
          roleMatch[1].charAt(0).toUpperCase() +
          roleMatch[1].slice(1).toLowerCase();
        const content = roleMatch[2] || "";

        currentMessage = {
          role,
          content,
        };
      } else if (currentMessage) {
        // Continue previous message (multi-line)
        currentMessage.content += "\n" + trimmedLine;
      }
      // If no current message and no role match, skip the line (orphaned content)
    }
@ -126,23 +140,23 @@ export function parseTranscriptToMessages(
    // Save the last message
    if (currentMessage) {
      messages.push({
        sessionId: "", // Will be set by caller
        timestamp: new Date(), // Will be calculated below
        role: currentMessage.role,
        content: currentMessage.content.trim(),
        order: order++,
      });
    }

    if (messages.length === 0) {
      return {
        success: false,
        error: "No messages found in transcript",
      };
    }

    // Calculate timestamps - use parsed timestamps if available, otherwise distribute across session duration
    const hasTimestamps = messages.some((msg) => (msg as any).timestamp);

    if (hasTimestamps) {
      // Use parsed timestamps from the transcript
@ -154,35 +168,45 @@ export function parseTranscriptToMessages(
        } catch (error) {
          // Fallback to distributed timestamp if parsing fails
          const sessionDurationMs = endTime.getTime() - startTime.getTime();
          const messageInterval =
            messages.length > 1
              ? sessionDurationMs / (messages.length - 1)
              : 0;
          message.timestamp = new Date(
            startTime.getTime() + index * messageInterval
          );
        }
      } else {
        // Fallback to distributed timestamp
        const sessionDurationMs = endTime.getTime() - startTime.getTime();
        const messageInterval =
          messages.length > 1 ? sessionDurationMs / (messages.length - 1) : 0;
        message.timestamp = new Date(
          startTime.getTime() + index * messageInterval
        );
      }
    });
  } else {
    // Distribute messages across session duration
    const sessionDurationMs = endTime.getTime() - startTime.getTime();
    const messageInterval =
      messages.length > 1 ? sessionDurationMs / (messages.length - 1) : 0;

    messages.forEach((message, index) => {
      message.timestamp = new Date(
        startTime.getTime() + index * messageInterval
      );
    });
  }

  return {
    success: true,
    messages,
  };
} catch (error) {
  return {
    success: false,
    error: error instanceof Error ? error.message : String(error),
  };
}
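The fallback path above spreads messages evenly between session start and end: with `n` messages, the step is `duration / (n - 1)` so the first message lands on the start time and the last on the end time. The arithmetic is easiest to see in isolation (hypothetical standalone helper):

```typescript
// Distribute `count` timestamps evenly from startTime to endTime, inclusive.
// With a single message the interval is 0 and it lands on startTime.
function distributeTimestamps(
  startTime: Date,
  endTime: Date,
  count: number
): Date[] {
  const sessionDurationMs = endTime.getTime() - startTime.getTime();
  const messageInterval = count > 1 ? sessionDurationMs / (count - 1) : 0;
  return Array.from(
    { length: count },
    (_, index) => new Date(startTime.getTime() + index * messageInterval)
  );
}
```

For a 60-second session with 4 messages, this yields timestamps at 0s, 20s, 40s, and 60s.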
@ -198,17 +222,17 @@ export async function storeMessagesForSession(
): Promise<void> {
  // Delete existing messages for this session (in case of re-processing)
  await prisma.message.deleteMany({
    where: { sessionId },
  });

  // Create new messages
  const messagesWithSessionId = messages.map((msg) => ({
    ...msg,
    sessionId,
  }));

  await prisma.message.createMany({
    data: messagesWithSessionId,
  });
}
@ -216,13 +240,15 @@ export async function storeMessagesForSession(
 * Process transcript for a single session
 * @param sessionId The session ID to process
 */
export async function processSessionTranscript(
  sessionId: string
): Promise<void> {
  // Get the session and its import data
  const session = await prisma.session.findUnique({
    where: { id: sessionId },
    include: {
      import: true,
    },
  });

  if (!session) {
@ -255,35 +281,37 @@ export async function processSessionTranscript(sessionId: string): Promise<void>
  // Store the messages
  await storeMessagesForSession(sessionId, parseResult.messages!);

  console.log(
    `✅ Processed ${parseResult.messages!.length} messages for session ${sessionId}`
  );
}

/**
 * Process all sessions that have transcript content but no messages
 */
export async function processAllUnparsedTranscripts(): Promise<void> {
  console.log("🔍 Finding sessions with unparsed transcripts...");

  // Find sessions that have transcript content but no messages
  const sessionsToProcess = await prisma.session.findMany({
    where: {
      import: {
        rawTranscriptContent: {
          not: null,
        },
      },
      messages: {
        none: {},
      },
    },
    include: {
      import: true,
      _count: {
        select: {
          messages: true,
        },
      },
    },
  });

  console.log(`📋 Found ${sessionsToProcess.length} sessions to process`);
@ -323,7 +351,7 @@ export async function getTotalMessageCount(): Promise<number> {
export async function getMessagesForSession(sessionId: string) {
  return await prisma.message.findMany({
    where: { sessionId },
    orderBy: { order: "asc" },
  });
}
@ -336,17 +364,17 @@ export async function getParsingStats() {
    where: {
      import: {
        rawTranscriptContent: {
          not: null,
        },
      },
    },
  });

  const sessionsWithMessages = await prisma.session.count({
    where: {
      messages: {
        some: {},
      },
    },
  });

  const totalMessages = await getTotalMessageCount();
@ -355,6 +383,6 @@ export async function getParsingStats() {
     sessionsWithTranscripts,
     sessionsWithMessages,
     unparsedSessions: sessionsWithTranscripts - sessionsWithMessages,
-    totalMessages
+    totalMessages,
   };
 }


@ -1,6 +1,6 @@
-import { clsx, type ClassValue } from "clsx"
-import { twMerge } from "tailwind-merge"
+import { clsx, type ClassValue } from "clsx";
+import { twMerge } from "tailwind-merge";
 export function cn(...inputs: ClassValue[]) {
-  return twMerge(clsx(inputs))
+  return twMerge(clsx(inputs));
 }

lib/validation.ts (new file, 134 lines)

@ -0,0 +1,134 @@
import { z } from "zod";
// Password validation with strong requirements
const passwordSchema = z
.string()
.min(12, "Password must be at least 12 characters long")
.regex(/^(?=.*[a-z])/, "Password must contain at least one lowercase letter")
.regex(/^(?=.*[A-Z])/, "Password must contain at least one uppercase letter")
.regex(/^(?=.*\d)/, "Password must contain at least one number")
.regex(
/^(?=.*[@$!%*?&])/,
"Password must contain at least one special character (@$!%*?&)"
);
// Email validation
const emailSchema = z
.string()
.email("Invalid email format")
.max(255, "Email must be less than 255 characters")
.toLowerCase();
// Company name validation
const companyNameSchema = z
.string()
.min(1, "Company name is required")
.max(100, "Company name must be less than 100 characters")
.regex(/^[a-zA-Z0-9\s\-_.]+$/, "Company name contains invalid characters");
// User registration schema
export const registerSchema = z.object({
email: emailSchema,
password: passwordSchema,
company: companyNameSchema,
});
// User login schema
export const loginSchema = z.object({
email: emailSchema,
password: z.string().min(1, "Password is required"),
});
// Password reset request schema
export const forgotPasswordSchema = z.object({
email: emailSchema,
});
// Password reset schema
export const resetPasswordSchema = z.object({
token: z.string().min(1, "Reset token is required"),
password: passwordSchema,
});
// Session filter schema
export const sessionFilterSchema = z.object({
search: z.string().max(100).optional(),
sentiment: z.enum(["POSITIVE", "NEUTRAL", "NEGATIVE"]).optional(),
category: z
.enum([
"SCHEDULE_HOURS",
"LEAVE_VACATION",
"SICK_LEAVE_RECOVERY",
"SALARY_COMPENSATION",
"CONTRACT_HOURS",
"ONBOARDING",
"OFFBOARDING",
"WORKWEAR_STAFF_PASS",
"TEAM_CONTACTS",
"PERSONAL_QUESTIONS",
"ACCESS_LOGIN",
"SOCIAL_QUESTIONS",
"UNRECOGNIZED_OTHER",
])
.optional(),
startDate: z.string().datetime().optional(),
endDate: z.string().datetime().optional(),
page: z.number().int().min(1).default(1),
limit: z.number().int().min(1).max(100).default(20),
});
// Company settings schema
export const companySettingsSchema = z.object({
name: companyNameSchema,
csvUrl: z.string().url("Invalid CSV URL"),
csvUsername: z.string().max(100).optional(),
csvPassword: z.string().max(100).optional(),
sentimentAlert: z.number().min(0).max(1).optional(),
dashboardOpts: z.object({}).passthrough().optional(),
});
// User management schema
export const userUpdateSchema = z.object({
email: emailSchema.optional(),
role: z.enum(["ADMIN", "USER", "AUDITOR"]).optional(),
password: passwordSchema.optional(),
});
// Metrics query schema
export const metricsQuerySchema = z.object({
startDate: z.string().datetime().optional(),
endDate: z.string().datetime().optional(),
companyId: z.string().uuid().optional(),
});
// Helper function to validate and sanitize input
export function validateInput<T>(
schema: z.ZodSchema<T>,
data: unknown
): { success: true; data: T } | { success: false; errors: string[] } {
try {
const result = schema.parse(data);
return { success: true, data: result };
} catch (error) {
if (error instanceof z.ZodError) {
const errors = error.errors.map(
(err) => `${err.path.join(".")}: ${err.message}`
);
return { success: false, errors };
}
return { success: false, errors: ["Invalid input"] };
}
}
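The `validateInput` helper above flattens `ZodError` issues into `path: message` strings. As a dependency-free illustration of the same password policy — a sketch only: the real code path goes through the zod `passwordSchema`, and `passwordErrors` is a hypothetical name, not part of this commit:

```typescript
// Hypothetical zod-free mirror of the password policy above, for illustration.
// Each check corresponds to one .min()/.regex() rule on `passwordSchema`.
function passwordErrors(password: string): string[] {
  const errors: string[] = [];
  if (password.length < 12) {
    errors.push("Password must be at least 12 characters long");
  }
  if (!/[a-z]/.test(password)) {
    errors.push("Password must contain at least one lowercase letter");
  }
  if (!/[A-Z]/.test(password)) {
    errors.push("Password must contain at least one uppercase letter");
  }
  if (!/\d/.test(password)) {
    errors.push("Password must contain at least one number");
  }
  if (!/[@$!%*?&]/.test(password)) {
    errors.push("Password must contain at least one special character (@$!%*?&)");
  }
  return errors;
}
```

An empty array corresponds to `{ success: true }` from `validateInput`; a non-empty one to the flattened `errors` list.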
// Rate limiting helper types
export interface RateLimitConfig {
windowMs: number;
maxRequests: number;
skipSuccessfulRequests?: boolean;
}
export const rateLimitConfigs = {
auth: { windowMs: 15 * 60 * 1000, maxRequests: 5 }, // 5 requests per 15 minutes
registration: { windowMs: 60 * 60 * 1000, maxRequests: 3 }, // 3 registrations per hour
api: { windowMs: 15 * 60 * 1000, maxRequests: 100 }, // 100 API requests per 15 minutes
} as const;
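The `rateLimitConfigs` entries only declare windows and request budgets; the enforcing middleware lives elsewhere in the commit. A minimal in-memory fixed-window sketch of how such a config could be enforced — the `RateLimiter` class and the per-IP key scheme are assumptions, not code from this commit:

```typescript
// Fixed-window rate limiter sketch built around the RateLimitConfig shape.
interface RateLimitConfig {
  windowMs: number;
  maxRequests: number;
}

interface WindowState {
  windowStart: number; // ms timestamp when the current window opened
  count: number; // requests seen in the current window
}

class RateLimiter {
  private windows = new Map<string, WindowState>();

  constructor(private config: RateLimitConfig) {}

  // Returns true if the request identified by `key` (e.g. an IP) is allowed.
  allow(key: string, now: number = Date.now()): boolean {
    const state = this.windows.get(key);
    if (!state || now - state.windowStart >= this.config.windowMs) {
      // First request, or the window expired: start a fresh window.
      this.windows.set(key, { windowStart: now, count: 1 });
      return true;
    }
    state.count += 1;
    return state.count <= this.config.maxRequests;
  }
}

// Same numbers as rateLimitConfigs.auth: 5 requests per 15 minutes.
const authLimiter = new RateLimiter({ windowMs: 15 * 60 * 1000, maxRequests: 5 });
```

A fixed window is the simplest interpretation of `windowMs`/`maxRequests`; a sliding-window or token-bucket variant would fit the same config shape.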


@ -1,11 +1,15 @@
-import { PrismaClient, ProcessingStage, ProcessingStatus } from '@prisma/client';
-import { ProcessingStatusManager } from './lib/processingStatusManager';
+import {
+  PrismaClient,
+  ProcessingStage,
+  ProcessingStatus,
+} from "@prisma/client";
+import { ProcessingStatusManager } from "./lib/processingStatusManager";
 const prisma = new PrismaClient();
 async function migrateToRefactoredSystem() {
   try {
-    console.log('=== MIGRATING TO REFACTORED PROCESSING SYSTEM ===\n');
+    console.log("=== MIGRATING TO REFACTORED PROCESSING SYSTEM ===\n");
     // Get all existing sessions
     const sessions = await prisma.session.findMany({
@ -14,7 +18,7 @@ async function migrateToRefactoredSystem() {
       messages: true,
       sessionQuestions: true,
     },
-    orderBy: { createdAt: 'asc' }
+    orderBy: { createdAt: "asc" },
   });
   console.log(`Found ${sessions.length} sessions to migrate...\n`);
@ -22,7 +26,9 @@ async function migrateToRefactoredSystem() {
     let migratedCount = 0;
     for (const session of sessions) {
-      console.log(`Migrating session ${session.import?.externalSessionId || session.id}...`);
+      console.log(
+        `Migrating session ${session.import?.externalSessionId || session.id}...`
+      );
       // Initialize processing status for this session
       await ProcessingStatusManager.initializeSession(session.id);
@ -30,85 +36,135 @@ async function migrateToRefactoredSystem() {
       // Determine the current state of each stage based on existing data
       // 1. CSV_IMPORT - Always completed if session exists
-      await ProcessingStatusManager.completeStage(session.id, ProcessingStage.CSV_IMPORT, {
-        migratedFrom: 'existing_session',
-        importId: session.importId
-      });
+      await ProcessingStatusManager.completeStage(
+        session.id,
+        ProcessingStage.CSV_IMPORT,
+        {
+          migratedFrom: "existing_session",
+          importId: session.importId,
+        }
+      );
       // 2. TRANSCRIPT_FETCH - Check if transcript content exists
       if (session.import?.rawTranscriptContent) {
-        await ProcessingStatusManager.completeStage(session.id, ProcessingStage.TRANSCRIPT_FETCH, {
-          migratedFrom: 'existing_transcript',
-          contentLength: session.import.rawTranscriptContent.length
-        });
+        await ProcessingStatusManager.completeStage(
+          session.id,
+          ProcessingStage.TRANSCRIPT_FETCH,
+          {
+            migratedFrom: "existing_transcript",
+            contentLength: session.import.rawTranscriptContent.length,
+          }
+        );
       } else if (!session.import?.fullTranscriptUrl) {
         // No transcript URL - skip this stage
-        await ProcessingStatusManager.skipStage(session.id, ProcessingStage.TRANSCRIPT_FETCH, 'No transcript URL in original import');
+        await ProcessingStatusManager.skipStage(
+          session.id,
+          ProcessingStage.TRANSCRIPT_FETCH,
+          "No transcript URL in original import"
+        );
       } else {
         // Has URL but no content - mark as pending for retry
-        console.log(`  - Transcript fetch pending for ${session.import.externalSessionId}`);
+        console.log(
+          `  - Transcript fetch pending for ${session.import.externalSessionId}`
+        );
       }
       // 3. SESSION_CREATION - Check if messages exist
       if (session.messages.length > 0) {
-        await ProcessingStatusManager.completeStage(session.id, ProcessingStage.SESSION_CREATION, {
-          migratedFrom: 'existing_messages',
-          messageCount: session.messages.length
-        });
+        await ProcessingStatusManager.completeStage(
+          session.id,
+          ProcessingStage.SESSION_CREATION,
+          {
+            migratedFrom: "existing_messages",
+            messageCount: session.messages.length,
+          }
+        );
       } else if (session.import?.rawTranscriptContent) {
         // Has transcript but no messages - needs reprocessing
-        console.log(`  - Session creation pending for ${session.import.externalSessionId} (has transcript but no messages)`);
+        console.log(
+          `  - Session creation pending for ${session.import.externalSessionId} (has transcript but no messages)`
+        );
       } else {
         // No transcript content - skip or mark as pending based on transcript fetch status
         if (!session.import?.fullTranscriptUrl) {
-          await ProcessingStatusManager.skipStage(session.id, ProcessingStage.SESSION_CREATION, 'No transcript content available');
+          await ProcessingStatusManager.skipStage(
+            session.id,
+            ProcessingStage.SESSION_CREATION,
+            "No transcript content available"
+          );
         }
       }
       // 4. AI_ANALYSIS - Check if AI fields are populated
-      const hasAIAnalysis = session.summary || session.sentiment || session.category || session.language;
+      const hasAIAnalysis =
+        session.summary ||
+        session.sentiment ||
+        session.category ||
+        session.language;
       if (hasAIAnalysis) {
-        await ProcessingStatusManager.completeStage(session.id, ProcessingStage.AI_ANALYSIS, {
-          migratedFrom: 'existing_ai_analysis',
+        await ProcessingStatusManager.completeStage(
+          session.id,
+          ProcessingStage.AI_ANALYSIS,
+          {
+            migratedFrom: "existing_ai_analysis",
             hasSummary: !!session.summary,
             hasSentiment: !!session.sentiment,
             hasCategory: !!session.category,
-          hasLanguage: !!session.language
-        });
+            hasLanguage: !!session.language,
+          }
+        );
       } else {
         // No AI analysis - mark as pending if session creation is complete
         if (session.messages.length > 0) {
-          console.log(`  - AI analysis pending for ${session.import?.externalSessionId}`);
+          console.log(
+            `  - AI analysis pending for ${session.import?.externalSessionId}`
+          );
         }
       }
       // 5. QUESTION_EXTRACTION - Check if questions exist
       if (session.sessionQuestions.length > 0) {
-        await ProcessingStatusManager.completeStage(session.id, ProcessingStage.QUESTION_EXTRACTION, {
-          migratedFrom: 'existing_questions',
-          questionCount: session.sessionQuestions.length
-        });
+        await ProcessingStatusManager.completeStage(
+          session.id,
+          ProcessingStage.QUESTION_EXTRACTION,
+          {
+            migratedFrom: "existing_questions",
+            questionCount: session.sessionQuestions.length,
+          }
+        );
       } else {
         // No questions - mark as pending if AI analysis is complete
         if (hasAIAnalysis) {
-          console.log(`  - Question extraction pending for ${session.import?.externalSessionId}`);
+          console.log(
+            `  - Question extraction pending for ${session.import?.externalSessionId}`
+          );
         }
       }
       migratedCount++;
       if (migratedCount % 10 === 0) {
-        console.log(`  Migrated ${migratedCount}/${sessions.length} sessions...`);
+        console.log(
+          `  Migrated ${migratedCount}/${sessions.length} sessions...`
+        );
       }
     }
-    console.log(`\n✓ Successfully migrated ${migratedCount} sessions to the new processing system`);
+    console.log(
+      `\n✓ Successfully migrated ${migratedCount} sessions to the new processing system`
+    );
     // Show final status
-    console.log('\n=== MIGRATION COMPLETE - FINAL STATUS ===');
+    console.log("\n=== MIGRATION COMPLETE - FINAL STATUS ===");
     const pipelineStatus = await ProcessingStatusManager.getPipelineStatus();
-    const stages = ['CSV_IMPORT', 'TRANSCRIPT_FETCH', 'SESSION_CREATION', 'AI_ANALYSIS', 'QUESTION_EXTRACTION'];
+    const stages = [
+      "CSV_IMPORT",
+      "TRANSCRIPT_FETCH",
+      "SESSION_CREATION",
+      "AI_ANALYSIS",
+      "QUESTION_EXTRACTION",
+    ];
     for (const stage of stages) {
       const stageData = pipelineStatus.pipeline[stage] || {};
@ -116,11 +172,12 @@ async function migrateToRefactoredSystem() {
       const completed = stageData.COMPLETED || 0;
       const skipped = stageData.SKIPPED || 0;
-      console.log(`${stage}: ${completed} completed, ${pending} pending, ${skipped} skipped`);
+      console.log(
+        `${stage}: ${completed} completed, ${pending} pending, ${skipped} skipped`
+      );
     }
   } catch (error) {
-    console.error('Error migrating to refactored system:', error);
+    console.error("Error migrating to refactored system:", error);
   } finally {
     await prisma.$disconnect();
   }
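The final-status loop in the migration script rolls a `stage -> status -> count` map up into one summary line per stage. The same rollup as a standalone sketch — the sample counts are invented, and the `Pipeline` type only approximates whatever `getPipelineStatus()` actually returns:

```typescript
// Stage -> status -> count, the shape the summary loop reads.
type Pipeline = Record<string, Record<string, number>>;

// Produce one "<stage>: X completed, Y pending, Z skipped" line per stage,
// defaulting missing statuses to 0 exactly as the `|| 0` fallbacks do.
function summarize(pipeline: Pipeline, stages: string[]): string[] {
  return stages.map((stage) => {
    const stageData = pipeline[stage] || {};
    const completed = stageData.COMPLETED || 0;
    const pending = stageData.PENDING || 0;
    const skipped = stageData.SKIPPED || 0;
    return `${stage}: ${completed} completed, ${pending} pending, ${skipped} skipped`;
  });
}

// Invented sample numbers for illustration.
const sample: Pipeline = {
  CSV_IMPORT: { COMPLETED: 120 },
  TRANSCRIPT_FETCH: { COMPLETED: 100, PENDING: 15, SKIPPED: 5 },
};

const lines = summarize(sample, ["CSV_IMPORT", "TRANSCRIPT_FETCH"]);
```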


@ -4,10 +4,7 @@
 const nextConfig = {
   reactStrictMode: true,
   // Allow cross-origin requests from specific origins in development
-  allowedDevOrigins: [
-    "localhost",
-    "127.0.0.1"
-  ],
+  allowedDevOrigins: ["localhost", "127.0.0.1"],
 };
 export default nextConfig;


@ -59,7 +59,8 @@
"react-markdown": "^10.1.0", "react-markdown": "^10.1.0",
"recharts": "^3.0.2", "recharts": "^3.0.2",
"rehype-raw": "^7.0.0", "rehype-raw": "^7.0.0",
"tailwind-merge": "^3.3.1" "tailwind-merge": "^3.3.1",
"zod": "^3.25.67"
}, },
"devDependencies": { "devDependencies": {
"@eslint/eslintrc": "^3.3.1", "@eslint/eslintrc": "^3.3.1",
@ -68,7 +69,6 @@
"@tailwindcss/postcss": "^4.1.11", "@tailwindcss/postcss": "^4.1.11",
"@testing-library/dom": "^10.4.0", "@testing-library/dom": "^10.4.0",
"@testing-library/react": "^16.3.0", "@testing-library/react": "^16.3.0",
"@types/bcryptjs": "^3.0.0",
"@types/node": "^24.0.6", "@types/node": "^24.0.6",
"@types/node-cron": "^3.0.11", "@types/node-cron": "^3.0.11",
"@types/react": "^19.1.8", "@types/react": "^19.1.8",

pnpm-lock.yaml (generated, 8605 lines; diff suppressed because it is too large)

@@ -9,9 +9,242 @@ datasource db {
   directUrl = env("DATABASE_URL_DIRECT")
 }
-/**
- * ENUMS fewer magic strings
- */
+/// *
+/// * COMPANY (multi-tenant root)
+model Company {
id String @id @default(uuid())
name String
csvUrl String
csvUsername String?
csvPassword String?
sentimentAlert Float?
dashboardOpts Json?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
companyAiModels CompanyAIModel[]
sessions Session[]
imports SessionImport[]
users User[] @relation("CompanyUsers")
}
/// *
/// * USER (auth accounts)
model User {
id String @id @default(uuid())
email String @unique
password String
role UserRole @default(USER)
companyId String
resetToken String?
resetTokenExpiry DateTime?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
company Company @relation("CompanyUsers", fields: [companyId], references: [id], onDelete: Cascade)
}
/// *
/// * 1. Normalised session ---------------------------
model Session {
id String @id @default(uuid())
companyId String
importId String? @unique
/// *
/// * session-level data (processed from SessionImport)
startTime DateTime
endTime DateTime
ipAddress String?
country String?
fullTranscriptUrl String?
avgResponseTime Float?
initialMsg String?
language String?
messagesSent Int?
sentiment SentimentCategory?
escalated Boolean?
forwardedHr Boolean?
category SessionCategory?
summary String?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
aiProcessingRequests AIProcessingRequest[]
messages Message[]
company Company @relation(fields: [companyId], references: [id], onDelete: Cascade)
import SessionImport? @relation("ImportToSession", fields: [importId], references: [id])
processingStatus SessionProcessingStatus[]
sessionQuestions SessionQuestion[]
@@index([companyId, startTime])
}
/// *
/// * 2. Raw CSV row (pure data storage) ----------
model SessionImport {
id String @id @default(uuid())
companyId String
externalSessionId String @unique
startTimeRaw String
endTimeRaw String
ipAddress String?
countryCode String?
language String?
messagesSent Int?
sentimentRaw String?
escalatedRaw String?
forwardedHrRaw String?
fullTranscriptUrl String?
avgResponseTimeSeconds Float?
tokens Int?
tokensEur Float?
category String?
initialMessage String?
rawTranscriptContent String?
createdAt DateTime @default(now())
session Session? @relation("ImportToSession")
company Company @relation(fields: [companyId], references: [id], onDelete: Cascade)
@@unique([companyId, externalSessionId])
}
/// *
/// * MESSAGE (individual lines)
model Message {
id String @id @default(uuid())
sessionId String
timestamp DateTime?
role String
content String
order Int
createdAt DateTime @default(now())
session Session @relation(fields: [sessionId], references: [id], onDelete: Cascade)
@@unique([sessionId, order])
@@index([sessionId, order])
}
/// *
/// * UNIFIED PROCESSING STATUS TRACKING
model SessionProcessingStatus {
id String @id @default(uuid())
sessionId String
stage ProcessingStage
status ProcessingStatus @default(PENDING)
startedAt DateTime?
completedAt DateTime?
errorMessage String?
retryCount Int @default(0)
metadata Json?
session Session @relation(fields: [sessionId], references: [id], onDelete: Cascade)
@@unique([sessionId, stage])
@@index([stage, status])
@@index([sessionId])
}
/// *
/// * QUESTION MANAGEMENT (separate from Session for better analytics)
model Question {
id String @id @default(uuid())
content String @unique
createdAt DateTime @default(now())
sessionQuestions SessionQuestion[]
}
model SessionQuestion {
id String @id @default(uuid())
sessionId String
questionId String
order Int
createdAt DateTime @default(now())
question Question @relation(fields: [questionId], references: [id])
session Session @relation(fields: [sessionId], references: [id], onDelete: Cascade)
@@unique([sessionId, questionId])
@@unique([sessionId, order])
@@index([sessionId])
}
/// *
/// * AI PROCESSING COST TRACKING
model AIProcessingRequest {
id String @id @default(uuid())
sessionId String
openaiRequestId String?
model String
serviceTier String?
systemFingerprint String?
promptTokens Int
completionTokens Int
totalTokens Int
cachedTokens Int?
audioTokensPrompt Int?
reasoningTokens Int?
audioTokensCompletion Int?
acceptedPredictionTokens Int?
rejectedPredictionTokens Int?
promptTokenCost Float
completionTokenCost Float
totalCostEur Float
processingType String
success Boolean
errorMessage String?
requestedAt DateTime @default(now())
completedAt DateTime?
session Session @relation(fields: [sessionId], references: [id], onDelete: Cascade)
@@index([sessionId])
@@index([requestedAt])
@@index([model])
}
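`AIProcessingRequest` stores token counts and per-token prices as separate columns, with `totalCostEur` alongside them; presumably the total is derived from the parts. A sketch of that derivation — the formula and the per-token prices below are assumptions based on the field names, not code from this commit:

```typescript
// Assumed cost model: total = prompt tokens * prompt price
//                           + completion tokens * completion price.
interface TokenUsage {
  promptTokens: number;
  completionTokens: number;
}

function totalCostEur(
  usage: TokenUsage,
  promptTokenCost: number, // EUR per prompt token
  completionTokenCost: number // EUR per completion token
): number {
  return (
    usage.promptTokens * promptTokenCost +
    usage.completionTokens * completionTokenCost
  );
}

// e.g. the 11 prompt / 9 completion tokens mentioned in the schema comments,
// at hypothetical per-token prices:
const cost = totalCostEur(
  { promptTokens: 11, completionTokens: 9 },
  0.000002,
  0.000008
);
```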
/// *
/// * AI Model definitions (without pricing)
model AIModel {
id String @id @default(uuid())
name String @unique
provider String
maxTokens Int?
isActive Boolean @default(true)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
pricing AIModelPricing[]
companyModels CompanyAIModel[]
@@index([provider, isActive])
}
/// *
/// * Time-based pricing for AI models
model AIModelPricing {
id String @id @default(uuid())
aiModelId String
promptTokenCost Float
completionTokenCost Float
effectiveFrom DateTime
effectiveUntil DateTime?
createdAt DateTime @default(now())
aiModel AIModel @relation(fields: [aiModelId], references: [id], onDelete: Cascade)
@@index([aiModelId, effectiveFrom])
@@index([effectiveFrom, effectiveUntil])
}
/// *
/// * Company-specific AI model assignments
model CompanyAIModel {
id String @id @default(uuid())
companyId String
aiModelId String
isDefault Boolean @default(false)
createdAt DateTime @default(now())
aiModel AIModel @relation(fields: [aiModelId], references: [id], onDelete: Cascade)
company Company @relation(fields: [companyId], references: [id], onDelete: Cascade)
@@unique([companyId, aiModelId])
@@index([companyId, isDefault])
}
/// *
/// * ENUMS fewer magic strings
 enum UserRole {
   ADMIN
   USER
@@ -41,11 +274,11 @@ enum SessionCategory {
 }
 enum ProcessingStage {
-  CSV_IMPORT // SessionImport created
-  TRANSCRIPT_FETCH // Transcript content fetched
-  SESSION_CREATION // Session + Messages created
-  AI_ANALYSIS // AI processing completed
-  QUESTION_EXTRACTION // Questions extracted
+  CSV_IMPORT
+  TRANSCRIPT_FETCH
+  SESSION_CREATION
+  AI_ANALYSIS
+  QUESTION_EXTRACTION
 }
 enum ProcessingStatus {
@@ -55,322 +288,3 @@ enum ProcessingStatus {
   FAILED
   SKIPPED
 }
/**
* COMPANY (multi-tenant root)
*/
model Company {
id String @id @default(uuid())
name String
csvUrl String
csvUsername String?
csvPassword String?
sentimentAlert Float?
dashboardOpts Json? // JSON column instead of opaque string
users User[] @relation("CompanyUsers")
sessions Session[]
imports SessionImport[]
companyAiModels CompanyAIModel[]
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
}
/**
* USER (auth accounts)
*/
model User {
id String @id @default(uuid())
email String @unique
password String
role UserRole @default(USER)
company Company @relation("CompanyUsers", fields: [companyId], references: [id], onDelete: Cascade)
companyId String
resetToken String?
resetTokenExpiry DateTime?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
}
/**
* SESSION ↔ SESSIONIMPORT (1-to-1)
*/
/**
* 1. Normalised session ---------------------------
*/
model Session {
id String @id @default(uuid())
company Company @relation(fields: [companyId], references: [id], onDelete: Cascade)
companyId String
/**
* 1-to-1 link back to the import row
*/
import SessionImport? @relation("ImportToSession", fields: [importId], references: [id])
importId String? @unique
/**
* session-level data (processed from SessionImport)
*/
startTime DateTime
endTime DateTime
// Direct copies from SessionImport (minimal processing)
ipAddress String?
country String? // from countryCode
fullTranscriptUrl String?
avgResponseTime Float? // from avgResponseTimeSeconds
initialMsg String? // from initialMessage
// AI-processed fields (calculated from Messages or AI analysis)
language String? // AI-detected from Messages
messagesSent Int? // Calculated from Message count
sentiment SentimentCategory? // AI-analyzed (changed from Float to enum)
escalated Boolean? // AI-detected
forwardedHr Boolean? // AI-detected
category SessionCategory? // AI-categorized (changed to enum)
// AI-generated fields
summary String? // AI-generated summary
/**
* Relationships
*/
messages Message[] // Individual conversation messages
sessionQuestions SessionQuestion[] // Questions asked in this session
aiProcessingRequests AIProcessingRequest[] // AI processing cost tracking
processingStatus SessionProcessingStatus[] // Processing pipeline status
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@index([companyId, startTime])
}
/**
* 2. Raw CSV row (pure data storage) ----------
*/
model SessionImport {
id String @id @default(uuid())
company Company @relation(fields: [companyId], references: [id], onDelete: Cascade)
companyId String
/**
* 1-to-1 back-relation; NO fields/references here
*/
session Session? @relation("ImportToSession")
// ─── 16 CSV columns 1-to-1 ────────────────────────
externalSessionId String @unique // value from CSV column 1
startTimeRaw String
endTimeRaw String
ipAddress String?
countryCode String?
language String?
messagesSent Int?
sentimentRaw String?
escalatedRaw String?
forwardedHrRaw String?
fullTranscriptUrl String?
avgResponseTimeSeconds Float?
tokens Int?
tokensEur Float?
category String?
initialMessage String?
// ─── Raw transcript content ─────────────────────────
rawTranscriptContent String? // Fetched content from fullTranscriptUrl
// ─── bookkeeping ─────────────────────────────────
createdAt DateTime @default(now())
@@unique([companyId, externalSessionId]) // idempotent re-imports
}
/**
* MESSAGE (individual lines)
*/
model Message {
id String @id @default(uuid())
session Session @relation(fields: [sessionId], references: [id], onDelete: Cascade)
sessionId String
timestamp DateTime?
role String // "user" | "assistant" | "system" free-form keeps migration easy
content String
order Int
createdAt DateTime @default(now())
@@unique([sessionId, order]) // guards against duplicate order values
@@index([sessionId, order])
}
/**
* UNIFIED PROCESSING STATUS TRACKING
*/
model SessionProcessingStatus {
id String @id @default(uuid())
sessionId String
stage ProcessingStage
status ProcessingStatus @default(PENDING)
startedAt DateTime?
completedAt DateTime?
errorMessage String?
retryCount Int @default(0)
// Stage-specific metadata (e.g., AI costs, token usage, fetch details)
metadata Json?
session Session @relation(fields: [sessionId], references: [id], onDelete: Cascade)
@@unique([sessionId, stage])
@@index([stage, status])
@@index([sessionId])
}
/**
* QUESTION MANAGEMENT (separate from Session for better analytics)
*/
model Question {
id String @id @default(uuid())
content String @unique // The actual question text
createdAt DateTime @default(now())
// Relationships
sessionQuestions SessionQuestion[]
}
model SessionQuestion {
id String @id @default(uuid())
sessionId String
questionId String
order Int // Order within the session
createdAt DateTime @default(now())
// Relationships
session Session @relation(fields: [sessionId], references: [id], onDelete: Cascade)
question Question @relation(fields: [questionId], references: [id])
@@unique([sessionId, questionId]) // Prevent duplicate questions per session
@@unique([sessionId, order]) // Ensure unique ordering
@@index([sessionId])
}
/**
* AI PROCESSING COST TRACKING
*/
model AIProcessingRequest {
id String @id @default(uuid())
sessionId String
// OpenAI Request Details
openaiRequestId String? // "chatcmpl-Bn8IH9UM8t7luZVWnwZG7CVJ0kjPo"
model String // "gpt-4o-2024-08-06"
serviceTier String? // "default"
systemFingerprint String? // "fp_07871e2ad8"
// Token Usage (from usage object)
promptTokens Int // 11
completionTokens Int // 9
totalTokens Int // 20
// Detailed Token Breakdown
cachedTokens Int? // prompt_tokens_details.cached_tokens
audioTokensPrompt Int? // prompt_tokens_details.audio_tokens
reasoningTokens Int? // completion_tokens_details.reasoning_tokens
audioTokensCompletion Int? // completion_tokens_details.audio_tokens
acceptedPredictionTokens Int? // completion_tokens_details.accepted_prediction_tokens
rejectedPredictionTokens Int? // completion_tokens_details.rejected_prediction_tokens
// Cost Calculation
promptTokenCost Float // Cost per prompt token (varies by model)
completionTokenCost Float // Cost per completion token (varies by model)
totalCostEur Float // Calculated total cost in EUR
// Processing Context
processingType String // "session_analysis", "reprocessing", etc.
success Boolean // Whether the request succeeded
errorMessage String? // If failed, what went wrong
// Timestamps
requestedAt DateTime @default(now())
completedAt DateTime?
// Relationships
session Session @relation(fields: [sessionId], references: [id], onDelete: Cascade)
@@index([sessionId])
@@index([requestedAt])
@@index([model])
}
/**
* AI MODEL MANAGEMENT SYSTEM
*/
/**
* AI Model definitions (without pricing)
*/
model AIModel {
id String @id @default(uuid())
name String @unique // "gpt-4o", "gpt-4-turbo", etc.
provider String // "openai", "anthropic", etc.
maxTokens Int? // Maximum tokens for this model
isActive Boolean @default(true)
// Relationships
pricing AIModelPricing[]
companyModels CompanyAIModel[]
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@index([provider, isActive])
}
/**
* Time-based pricing for AI models
*/
model AIModelPricing {
id String @id @default(uuid())
aiModelId String
promptTokenCost Float // Cost per prompt token in USD
completionTokenCost Float // Cost per completion token in USD
effectiveFrom DateTime // When this pricing becomes effective
effectiveUntil DateTime? // When this pricing expires (null = current)
// Relationships
aiModel AIModel @relation(fields: [aiModelId], references: [id], onDelete: Cascade)
createdAt DateTime @default(now())
@@index([aiModelId, effectiveFrom])
@@index([effectiveFrom, effectiveUntil])
}
/**
* Company-specific AI model assignments
*/
model CompanyAIModel {
id String @id @default(uuid())
companyId String
aiModelId String
isDefault Boolean @default(false) // Is this the default model for the company?
// Relationships
company Company @relation(fields: [companyId], references: [id], onDelete: Cascade)
aiModel AIModel @relation(fields: [aiModelId], references: [id], onDelete: Cascade)
createdAt DateTime @default(now())
@@unique([companyId, aiModelId]) // Prevent duplicate assignments
@@index([companyId, isDefault])
}


@@ -94,7 +94,7 @@ async function main() {
   ];
   for (const pricing of pricingData) {
-    const model = createdModels.find(m => m.name === pricing.modelName);
+    const model = createdModels.find((m) => m.name === pricing.modelName);
     if (model) {
       await prisma.aIModelPricing.create({
         data: {
@@ -110,7 +110,7 @@
   }
   // Assign default AI model to company (gpt-4o)
-  const defaultModel = createdModels.find(m => m.name === "gpt-4o");
+  const defaultModel = createdModels.find((m) => m.name === "gpt-4o");
   if (defaultModel) {
     await prisma.companyAIModel.create({
       data: {
@@ -127,10 +127,11 @@
   console.log(`Company: ${company.name}`);
   console.log(`Admin user: ${adminUser.email}`);
   console.log(`Password: 8QbL26tB7fWS`);
-  console.log(`AI Models: ${createdModels.length} models created with current pricing`);
+  console.log(
+    `AI Models: ${createdModels.length} models created with current pricing`
+  );
   console.log(`Default model: ${defaultModel?.name}`);
   console.log("\n🚀 Ready to start importing CSV data!");
 } catch (error) {
   console.error("❌ Error seeding database:", error);
   process.exit(1);


@ -22,7 +22,9 @@ async function fetchTranscriptContent(
   });
   if (!response.ok) {
-    console.warn(`Failed to fetch transcript from ${url}: ${response.statusText}`);
+    console.warn(
+      `Failed to fetch transcript from ${url}: ${response.statusText}`
+    );
     return null;
   }
@ -42,7 +44,7 @@ function parseTranscriptToMessages(transcriptContent: string): Array<{
   content: string;
   order: number;
 }> {
-  const lines = transcriptContent.split('\n').filter(line => line.trim());
+  const lines = transcriptContent.split("\n").filter((line) => line.trim());
   const messages: Array<{
     timestamp: Date | null;
     role: string;
@ -79,12 +81,12 @@ function parseTranscriptToMessages(transcriptContent: string): Array<{
} else { } else {
// If line doesn't match expected format, treat as content continuation // If line doesn't match expected format, treat as content continuation
if (messages.length > 0) { if (messages.length > 0) {
messages[messages.length - 1].content += '\n' + line; messages[messages.length - 1].content += "\n" + line;
} else { } else {
// First line doesn't match format, create a generic message // First line doesn't match format, create a generic message
messages.push({ messages.push({
timestamp: null, timestamp: null,
role: 'unknown', role: "unknown",
content: line, content: line,
order: order++, order: order++,
}); });
@ -120,7 +122,9 @@ async function fetchTranscriptsForSessions() {
return; return;
} }
console.log(`Found ${sessionsNeedingTranscripts.length} sessions that need transcript fetching.`); console.log(
`Found ${sessionsNeedingTranscripts.length} sessions that need transcript fetching.`
);
let successCount = 0; let successCount = 0;
let errorCount = 0; let errorCount = 0;
@ -153,7 +157,7 @@ async function fetchTranscriptsForSessions() {
// Create messages in database // Create messages in database
await prisma.message.createMany({ await prisma.message.createMany({
data: messages.map(msg => ({ data: messages.map((msg) => ({
sessionId: session.id, sessionId: session.id,
timestamp: msg.timestamp, timestamp: msg.timestamp,
role: msg.role, role: msg.role,
@ -162,10 +166,15 @@ async function fetchTranscriptsForSessions() {
})), })),
}); });
console.log(`Successfully fetched transcript for session ${session.id} (${messages.length} messages)`); console.log(
`Successfully fetched transcript for session ${session.id} (${messages.length} messages)`
);
successCount++; successCount++;
} catch (error) { } catch (error) {
console.error(`Error fetching transcript for session ${session.id}:`, error); console.error(
`Error fetching transcript for session ${session.id}:`,
error
);
errorCount++; errorCount++;
} }
} }

View File

@@ -37,7 +37,9 @@ async function fetchTranscriptContent(
 });
 if (!response.ok) {
-console.warn(`Failed to fetch transcript from ${url}: ${response.statusText}`);
+console.warn(
+  `Failed to fetch transcript from ${url}: ${response.statusText}`
+);
 return null;
 }
 return await response.text();
@@ -140,10 +142,19 @@ async function processTranscriptWithOpenAI(
 /**
  * Validates the OpenAI response against our expected schema
  */
-function validateOpenAIResponse(data: any): asserts data is OpenAIProcessedData {
+function validateOpenAIResponse(
+  data: any
+): asserts data is OpenAIProcessedData {
 const requiredFields = [
-"language", "messages_sent", "sentiment", "escalated",
-"forwarded_hr", "category", "questions", "summary", "session_id"
+"language",
+"messages_sent",
+"sentiment",
+"escalated",
+"forwarded_hr",
+"category",
+"questions",
+"summary",
+"session_id",
 ];
 for (const field of requiredFields) {
@@ -153,7 +164,9 @@ function validateOpenAIResponse(data: any): asserts data is OpenAIProcessedData
 }
 if (typeof data.language !== "string" || !/^[a-z]{2}$/.test(data.language)) {
-throw new Error("Invalid language format. Expected ISO 639-1 code (e.g., 'en')");
+throw new Error(
+  "Invalid language format. Expected ISO 639-1 code (e.g., 'en')"
+);
 }
 if (typeof data.messages_sent !== "number" || data.messages_sent < 0) {
@@ -161,7 +174,9 @@ function validateOpenAIResponse(data: any): asserts data is OpenAIProcessedData
 }
 if (!["positive", "neutral", "negative"].includes(data.sentiment)) {
-throw new Error("Invalid sentiment. Expected 'positive', 'neutral', or 'negative'");
+throw new Error(
+  "Invalid sentiment. Expected 'positive', 'neutral', or 'negative'"
+);
 }
 if (typeof data.escalated !== "boolean") {
@@ -173,22 +188,39 @@ function validateOpenAIResponse(data: any): asserts data is OpenAIProcessedData
 }
 const validCategories = [
-"Schedule & Hours", "Leave & Vacation", "Sick Leave & Recovery",
-"Salary & Compensation", "Contract & Hours", "Onboarding", "Offboarding",
-"Workwear & Staff Pass", "Team & Contacts", "Personal Questions",
-"Access & Login", "Social questions", "Unrecognized / Other"
+"Schedule & Hours",
+"Leave & Vacation",
+"Sick Leave & Recovery",
+"Salary & Compensation",
+"Contract & Hours",
+"Onboarding",
+"Offboarding",
+"Workwear & Staff Pass",
+"Team & Contacts",
+"Personal Questions",
+"Access & Login",
+"Social questions",
+"Unrecognized / Other",
 ];
 if (!validCategories.includes(data.category)) {
-throw new Error(`Invalid category. Expected one of: ${validCategories.join(", ")}`);
+throw new Error(
+  `Invalid category. Expected one of: ${validCategories.join(", ")}`
+);
 }
 if (!Array.isArray(data.questions)) {
 throw new Error("Invalid questions. Expected array of strings");
 }
-if (typeof data.summary !== "string" || data.summary.length < 10 || data.summary.length > 300) {
-throw new Error("Invalid summary. Expected string between 10-300 characters");
+if (
+  typeof data.summary !== "string" ||
+  data.summary.length < 10 ||
+  data.summary.length > 300
+) {
+throw new Error(
+  "Invalid summary. Expected string between 10-300 characters"
+);
 }
 if (typeof data.session_id !== "string") {
@@ -218,17 +250,23 @@ async function processUnprocessedSessions() {
 return;
 }
-console.log(`Found ${importsToProcess.length} SessionImport records to process.`);
+console.log(
+  `Found ${importsToProcess.length} SessionImport records to process.`
+);
 let successCount = 0;
 let errorCount = 0;
 for (const importRecord of importsToProcess) {
 if (!importRecord.fullTranscriptUrl) {
-console.warn(`SessionImport ${importRecord.id} has no transcript URL, skipping.`);
+console.warn(
+  `SessionImport ${importRecord.id} has no transcript URL, skipping.`
+);
 continue;
 }
-console.log(`Processing transcript for SessionImport ${importRecord.id}...`);
+console.log(
+  `Processing transcript for SessionImport ${importRecord.id}...`
+);
 try {
 // Mark as processing (status field doesn't exist in new schema)
@@ -265,7 +303,10 @@ async function processUnprocessedSessions() {
 country: importRecord.countryCode,
 language: processedData.language,
 messagesSent: processedData.messages_sent,
-sentiment: processedData.sentiment.toUpperCase() as "POSITIVE" | "NEUTRAL" | "NEGATIVE",
+sentiment: processedData.sentiment.toUpperCase() as
+  | "POSITIVE"
+  | "NEUTRAL"
+  | "NEGATIVE",
 escalated: processedData.escalated,
 forwardedHr: processedData.forwarded_hr,
 fullTranscriptUrl: importRecord.fullTranscriptUrl,
@@ -284,7 +325,10 @@ async function processUnprocessedSessions() {
 country: importRecord.countryCode,
 language: processedData.language,
 messagesSent: processedData.messages_sent,
-sentiment: processedData.sentiment.toUpperCase() as "POSITIVE" | "NEUTRAL" | "NEGATIVE",
+sentiment: processedData.sentiment.toUpperCase() as
+  | "POSITIVE"
+  | "NEUTRAL"
+  | "NEGATIVE",
 escalated: processedData.escalated,
 forwardedHr: processedData.forwarded_hr,
 fullTranscriptUrl: importRecord.fullTranscriptUrl,
@@ -296,15 +340,24 @@ async function processUnprocessedSessions() {
 });
 // Mark SessionImport as processed (processedAt field doesn't exist in new schema)
-console.log(`Successfully processed SessionImport ${importRecord.id} -> Session ${session.id}`);
+console.log(
+  `Successfully processed SessionImport ${importRecord.id} -> Session ${session.id}`
+);
 successCount++;
 } catch (error) {
-console.error(`Error processing SessionImport ${importRecord.id}:`, error);
+console.error(
+  `Error processing SessionImport ${importRecord.id}:`,
+  error
+);
 // Log error (status and errorMsg fields don't exist in new schema)
-console.error(`Failed to process SessionImport ${importRecord.id}: ${error instanceof Error ? error.message : String(error)}`);
+console.error(
+  `Failed to process SessionImport ${importRecord.id}: ${error instanceof Error ? error.message : String(error)}`
+);
 errorCount++;
 }

View File

@@ -19,7 +19,7 @@ app.prepare().then(() => {
 // Validate and log environment configuration
 const envValidation = validateEnv();
 if (!envValidation.valid) {
-console.error('[Environment] Validation errors:', envValidation.errors);
+console.error("[Environment] Validation errors:", envValidation.errors);
 }
 logEnvConfig();

View File

@@ -1,16 +1,15 @@
-import { processUnprocessedSessions } from './lib/processingScheduler';
+import { processUnprocessedSessions } from "./lib/processingScheduler";
 async function testAIProcessing() {
-console.log('=== TESTING AI PROCESSING ===\n');
+console.log("=== TESTING AI PROCESSING ===\n");
 try {
 // Process with batch size of 10 to test multiple batches (since we have 109 sessions)
 await processUnprocessedSessions(10, 3); // batch size 10, max concurrency 3
-console.log('\n=== AI PROCESSING COMPLETED ===');
+console.log("\n=== AI PROCESSING COMPLETED ===");
 } catch (error) {
-console.error('Error during AI processing:', error);
+console.error("Error during AI processing:", error);
 }
 }

View File

@@ -1,16 +1,15 @@
-import { processQueuedImports } from './lib/importProcessor';
+import { processQueuedImports } from "./lib/importProcessor";
 async function testImportProcessing() {
-console.log('=== TESTING IMPORT PROCESSING ===\n');
+console.log("=== TESTING IMPORT PROCESSING ===\n");
 try {
 // Process with batch size of 50 to test multiple batches
 await processQueuedImports(50);
-console.log('\n=== IMPORT PROCESSING COMPLETED ===');
+console.log("\n=== IMPORT PROCESSING COMPLETED ===");
 } catch (error) {
-console.error('Error during import processing:', error);
+console.error("Error during import processing:", error);
 }
 }

View File

@@ -1,49 +1,52 @@
 // Test script for the refactored data processing pipeline
-import { PrismaClient } from '@prisma/client';
+import { PrismaClient } from "@prisma/client";
-import { processQueuedImports } from './lib/importProcessor.ts';
+import { processQueuedImports } from "./lib/importProcessor.ts";
-import { processAllUnparsedTranscripts } from './lib/transcriptParser.ts';
+import { processAllUnparsedTranscripts } from "./lib/transcriptParser.ts";
-import { processUnprocessedSessions, getAIProcessingCosts } from './lib/processingScheduler.ts';
+import {
+  processUnprocessedSessions,
+  getAIProcessingCosts,
+} from "./lib/processingScheduler.ts";
 const prisma = new PrismaClient();
 async function testRefactoredPipeline() {
-console.log('🧪 Testing Refactored Data Processing Pipeline\n');
+console.log("🧪 Testing Refactored Data Processing Pipeline\n");
 // Step 1: Check current state
-console.log('📊 Current Database State:');
+console.log("📊 Current Database State:");
 const stats = await getDatabaseStats();
 console.log(stats);
-console.log('');
+console.log("");
 // Step 2: Test import processing (minimal fields only)
-console.log('🔄 Testing Import Processing (Phase 1)...');
+console.log("🔄 Testing Import Processing (Phase 1)...");
 await processQueuedImports(5); // Process 5 imports
-console.log('');
+console.log("");
 // Step 3: Test transcript parsing
-console.log('📝 Testing Transcript Parsing (Phase 2)...');
+console.log("📝 Testing Transcript Parsing (Phase 2)...");
 await processAllUnparsedTranscripts();
-console.log('');
+console.log("");
 // Step 4: Test AI processing with cost tracking
-console.log('🤖 Testing AI Processing with Cost Tracking (Phase 3)...');
+console.log("🤖 Testing AI Processing with Cost Tracking (Phase 3)...");
 await processUnprocessedSessions(3, 2); // Process 3 sessions with concurrency 2
-console.log('');
+console.log("");
 // Step 5: Show final results
-console.log('📈 Final Results:');
+console.log("📈 Final Results:");
 const finalStats = await getDatabaseStats();
 console.log(finalStats);
-console.log('');
+console.log("");
 // Step 6: Show AI processing costs
-console.log('💰 AI Processing Costs:');
+console.log("💰 AI Processing Costs:");
 const costs = await getAIProcessingCosts();
 console.log(costs);
-console.log('');
+console.log("");
 // Step 7: Show sample processed session
-console.log('🔍 Sample Processed Session:');
+console.log("🔍 Sample Processed Session:");
 const sampleSession = await getSampleProcessedSession();
 if (sampleSession) {
 console.log(`Session ID: ${sampleSession.id}`);
@@ -54,19 +57,23 @@ async function testRefactoredPipeline() {
 console.log(`Escalated: ${sampleSession.escalated}`);
 console.log(`Forwarded HR: ${sampleSession.forwardedHr}`);
 console.log(`Summary: ${sampleSession.summary}`);
-console.log(`Questions: ${sampleSession.sessionQuestions.length} questions`);
-console.log(`AI Requests: ${sampleSession.aiProcessingRequests.length} requests`);
+console.log(
+  `Questions: ${sampleSession.sessionQuestions.length} questions`
+);
+console.log(
+  `AI Requests: ${sampleSession.aiProcessingRequests.length} requests`
+);
 if (sampleSession.sessionQuestions.length > 0) {
-console.log('Sample Questions:');
+console.log("Sample Questions:");
 sampleSession.sessionQuestions.slice(0, 3).forEach((sq, i) => {
 console.log(`  ${i + 1}. ${sq.question.content}`);
 });
 }
 }
-console.log('');
+console.log("");
-console.log('✅ Pipeline test completed!');
+console.log("✅ Pipeline test completed!");
 }
 async function getDatabaseStats() {
@@ -78,7 +85,7 @@ async function getDatabaseStats() {
 totalMessages,
 totalQuestions,
 totalSessionQuestions,
-totalAIRequests
+totalAIRequests,
 ] = await Promise.all([
 prisma.session.count(),
 prisma.session.count({ where: { importId: { not: null } } }),
@@ -87,7 +94,7 @@ async function getDatabaseStats() {
 prisma.message.count(),
 prisma.question.count(),
 prisma.sessionQuestion.count(),
-prisma.aIProcessingRequest.count()
+prisma.aIProcessingRequest.count(),
 ]);
 return {
@@ -99,7 +106,7 @@ async function getDatabaseStats() {
 totalMessages,
 totalQuestions,
 totalSessionQuestions,
-totalAIRequests
+totalAIRequests,
 };
 }
@@ -107,19 +114,19 @@ async function getSampleProcessedSession() {
 return await prisma.session.findFirst({
 where: {
 processed: true,
-messages: { some: {} }
+messages: { some: {} },
 },
 include: {
 sessionQuestions: {
 include: {
-question: true
+question: true,
 },
-orderBy: { order: 'asc' }
+orderBy: { order: "asc" },
 },
 aiProcessingRequests: {
-orderBy: { requestedAt: 'desc' }
+orderBy: { requestedAt: "desc" },
-}
+},
-}
+},
 });
 }

View File

@@ -1,5 +1,5 @@
 // Vitest test setup
-import { vi } from 'vitest';
+import { vi } from "vitest";
 // Mock console methods to reduce noise in tests
 global.console = {
@@ -10,8 +10,8 @@ global.console = {
 };
 // Set test environment variables
-process.env.NEXTAUTH_SECRET = 'test-secret';
+process.env.NEXTAUTH_SECRET = "test-secret";
-process.env.NEXTAUTH_URL = 'http://localhost:3000';
+process.env.NEXTAUTH_URL = "http://localhost:3000";
 // Use test database for all database operations during tests
 if (process.env.DATABASE_URL_TEST) {
@@ -19,6 +19,6 @@ if (process.env.DATABASE_URL_TEST) {
 }
 // Mock node-fetch for transcript fetcher tests
-vi.mock('node-fetch', () => ({
+vi.mock("node-fetch", () => ({
 default: vi.fn(),
 }));

View File

@@ -1,7 +1,7 @@
-import { describe, it, expect, beforeAll, afterAll } from 'vitest';
+import { describe, it, expect, beforeAll, afterAll } from "vitest";
-import { PrismaClient } from '@prisma/client';
+import { PrismaClient } from "@prisma/client";
-describe('Database Configuration', () => {
+describe("Database Configuration", () => {
 let prisma: PrismaClient;
 beforeAll(() => {
@@ -12,22 +12,22 @@ describe('Database Configuration', () => {
 await prisma.$disconnect();
 });
-it('should connect to the test database', async () => {
+it("should connect to the test database", async () => {
 // Verify we can connect to the database
 const result = await prisma.$queryRaw`SELECT 1 as test`;
 expect(result).toBeDefined();
 });
-it('should use PostgreSQL as the database provider', async () => {
+it("should use PostgreSQL as the database provider", async () => {
 // Query the database to verify it's PostgreSQL
-const result = await prisma.$queryRaw`SELECT version()` as any[];
+const result = (await prisma.$queryRaw`SELECT version()`) as any[];
-expect(result[0].version).toContain('PostgreSQL');
+expect(result[0].version).toContain("PostgreSQL");
 });
-it('should be using the test database URL', () => {
+it("should be using the test database URL", () => {
 // Verify that DATABASE_URL is set to the test database
 expect(process.env.DATABASE_URL).toBeDefined();
-expect(process.env.DATABASE_URL).toContain('postgresql://');
+expect(process.env.DATABASE_URL).toContain("postgresql://");
 // If DATABASE_URL_TEST is set, DATABASE_URL should match it (from our test setup)
 if (process.env.DATABASE_URL_TEST) {
@@ -35,39 +35,39 @@ describe('Database Configuration', () => {
 }
 });
-it('should have all required tables', async () => {
+it("should have all required tables", async () => {
 // Verify all our tables exist
-const tables = await prisma.$queryRaw`
+const tables = (await prisma.$queryRaw`
 SELECT table_name
 FROM information_schema.tables
 WHERE table_schema = 'public'
 AND table_type = 'BASE TABLE'
 ORDER BY table_name
-` as any[];
+`) as any[];
-const tableNames = tables.map(t => t.table_name);
+const tableNames = tables.map((t) => t.table_name);
-expect(tableNames).toContain('Company');
+expect(tableNames).toContain("Company");
-expect(tableNames).toContain('User');
+expect(tableNames).toContain("User");
-expect(tableNames).toContain('Session');
+expect(tableNames).toContain("Session");
-expect(tableNames).toContain('SessionImport');
+expect(tableNames).toContain("SessionImport");
-expect(tableNames).toContain('Message');
+expect(tableNames).toContain("Message");
-expect(tableNames).toContain('Question');
+expect(tableNames).toContain("Question");
-expect(tableNames).toContain('SessionQuestion');
+expect(tableNames).toContain("SessionQuestion");
-expect(tableNames).toContain('AIProcessingRequest');
+expect(tableNames).toContain("AIProcessingRequest");
 });
-it('should be able to create and query data', async () => {
+it("should be able to create and query data", async () => {
 // Test basic CRUD operations
 const company = await prisma.company.create({
 data: {
-name: 'Test Company',
+name: "Test Company",
-csvUrl: 'https://example.com/test.csv',
+csvUrl: "https://example.com/test.csv",
 },
 });
 expect(company.id).toBeDefined();
-expect(company.name).toBe('Test Company');
+expect(company.name).toBe("Test Company");
 // Clean up
 await prisma.company.delete({

View File

@@ -1,7 +1,7 @@
 // Unit tests for environment management
-import { describe, it, expect, beforeEach, afterEach, vi } from 'vitest';
+import { describe, it, expect, beforeEach, afterEach, vi } from "vitest";
-describe('Environment Management', () => {
+describe("Environment Management", () => {
 let originalEnv: NodeJS.ProcessEnv;
 beforeEach(() => {
@@ -15,8 +15,8 @@ describe('Environment Management', () => {
 vi.resetModules();
 });
-describe('env object', () => {
+describe("env object", () => {
-it('should have default values when environment variables are not set', async () => {
+it("should have default values when environment variables are not set", async () => {
 // Clear relevant env vars
 delete process.env.NEXTAUTH_URL;
 delete process.env.SCHEDULER_ENABLED;
@@ -24,119 +24,121 @@ describe('Environment Management', () => {
 // Re-import to get fresh env object
 vi.resetModules();
-const { env: freshEnv } = await import('../../lib/env');
+const { env: freshEnv } = await import("../../lib/env");
-expect(freshEnv.NEXTAUTH_URL).toBe('http://localhost:3000');
+expect(freshEnv.NEXTAUTH_URL).toBe("http://localhost:3000");
 // Note: SCHEDULER_ENABLED will be true because .env.local sets it to "true"
 expect(freshEnv.SCHEDULER_ENABLED).toBe(true);
 expect(freshEnv.PORT).toBe(3000);
 });
-it('should use environment variables when set', async () => {
+it("should use environment variables when set", async () => {
-process.env.NEXTAUTH_URL = 'https://example.com';
+process.env.NEXTAUTH_URL = "https://example.com";
-process.env.SCHEDULER_ENABLED = 'true';
+process.env.SCHEDULER_ENABLED = "true";
-process.env.PORT = '8080';
+process.env.PORT = "8080";
 vi.resetModules();
-const { env: freshEnv } = await import('../../lib/env');
+const { env: freshEnv } = await import("../../lib/env");
-expect(freshEnv.NEXTAUTH_URL).toBe('https://example.com');
+expect(freshEnv.NEXTAUTH_URL).toBe("https://example.com");
 expect(freshEnv.SCHEDULER_ENABLED).toBe(true);
 expect(freshEnv.PORT).toBe(8080);
 });
-it('should parse numeric environment variables correctly', async () => {
+it("should parse numeric environment variables correctly", async () => {
-process.env.IMPORT_PROCESSING_BATCH_SIZE = '100';
+process.env.IMPORT_PROCESSING_BATCH_SIZE = "100";
-process.env.SESSION_PROCESSING_CONCURRENCY = '10';
+process.env.SESSION_PROCESSING_CONCURRENCY = "10";
 vi.resetModules();
-const { env: freshEnv } = await import('../../lib/env');
+const { env: freshEnv } = await import("../../lib/env");
 expect(freshEnv.IMPORT_PROCESSING_BATCH_SIZE).toBe(100);
 expect(freshEnv.SESSION_PROCESSING_CONCURRENCY).toBe(10);
 });
-it('should handle invalid numeric values gracefully', async () => {
+it("should handle invalid numeric values gracefully", async () => {
-process.env.IMPORT_PROCESSING_BATCH_SIZE = 'invalid';
+process.env.IMPORT_PROCESSING_BATCH_SIZE = "invalid";
-process.env.SESSION_PROCESSING_CONCURRENCY = '';
+process.env.SESSION_PROCESSING_CONCURRENCY = "";
 vi.resetModules();
-const { env: freshEnv } = await import('../../lib/env');
+const { env: freshEnv } = await import("../../lib/env");
 expect(freshEnv.IMPORT_PROCESSING_BATCH_SIZE).toBe(50); // Falls back to default value
 expect(freshEnv.SESSION_PROCESSING_CONCURRENCY).toBe(5); // Falls back to default value
 });
-it('should parse quoted environment variables correctly', async () => {
+it("should parse quoted environment variables correctly", async () => {
 process.env.NEXTAUTH_URL = '"https://quoted.example.com"';
 process.env.NEXTAUTH_SECRET = "'single-quoted-secret'";
 vi.resetModules();
-const { env: freshEnv } = await import('../../lib/env');
+const { env: freshEnv } = await import("../../lib/env");
-expect(freshEnv.NEXTAUTH_URL).toBe('https://quoted.example.com');
+expect(freshEnv.NEXTAUTH_URL).toBe("https://quoted.example.com");
-expect(freshEnv.NEXTAUTH_SECRET).toBe('single-quoted-secret');
+expect(freshEnv.NEXTAUTH_SECRET).toBe("single-quoted-secret");
 });
-it('should strip inline comments from environment variables', async () => {
+it("should strip inline comments from environment variables", async () => {
-process.env.CSV_IMPORT_INTERVAL = '*/10 * * * * # Custom comment';
+process.env.CSV_IMPORT_INTERVAL = "*/10 * * * * # Custom comment";
-process.env.IMPORT_PROCESSING_INTERVAL = '*/3 * * * * # Another comment';
+process.env.IMPORT_PROCESSING_INTERVAL =
+  "*/3 * * * * # Another comment";
 vi.resetModules();
-const { env: freshEnv } = await import('../../lib/env');
+const { env: freshEnv } = await import("../../lib/env");
-expect(freshEnv.CSV_IMPORT_INTERVAL).toBe('*/10 * * * *');
+expect(freshEnv.CSV_IMPORT_INTERVAL).toBe("*/10 * * * *");
-expect(freshEnv.IMPORT_PROCESSING_INTERVAL).toBe('*/3 * * * *');
+expect(freshEnv.IMPORT_PROCESSING_INTERVAL).toBe("*/3 * * * *");
 });
-it('should handle whitespace around environment variables', async () => {
+it("should handle whitespace around environment variables", async () => {
-process.env.NEXTAUTH_URL = ' https://spaced.example.com ';
+process.env.NEXTAUTH_URL = " https://spaced.example.com ";
-process.env.PORT = ' 8080 ';
+process.env.PORT = " 8080 ";
 vi.resetModules();
-const { env: freshEnv } = await import('../../lib/env');
+const { env: freshEnv } = await import("../../lib/env");
-expect(freshEnv.NEXTAUTH_URL).toBe('https://spaced.example.com');
+expect(freshEnv.NEXTAUTH_URL).toBe("https://spaced.example.com");
 expect(freshEnv.PORT).toBe(8080);
 });
-it('should handle complex combinations of quotes, comments, and whitespace', async () => {
+it("should handle complex combinations of quotes, comments, and whitespace", async () => {
-process.env.NEXTAUTH_URL = ' "https://complex.example.com" # Production URL';
+process.env.NEXTAUTH_URL =
+  ' "https://complex.example.com" # Production URL';
 process.env.IMPORT_PROCESSING_BATCH_SIZE = " '100' # Batch size";
 vi.resetModules();
-const { env: freshEnv } = await import('../../lib/env');
+const { env: freshEnv } = await import("../../lib/env");
-expect(freshEnv.NEXTAUTH_URL).toBe('https://complex.example.com');
+expect(freshEnv.NEXTAUTH_URL).toBe("https://complex.example.com");
 expect(freshEnv.IMPORT_PROCESSING_BATCH_SIZE).toBe(100);
 });
 });
-describe('validateEnv', () => {
+describe("validateEnv", () => {
-it('should return valid when all required variables are set', async () => {
+it("should return valid when all required variables are set", async () => {
-vi.stubEnv('NEXTAUTH_SECRET', 'test-secret');
+vi.stubEnv("NEXTAUTH_SECRET", "test-secret");
-vi.stubEnv('OPENAI_API_KEY', 'test-key');
+vi.stubEnv("OPENAI_API_KEY", "test-key");
-vi.stubEnv('NODE_ENV', 'production');
+vi.stubEnv("NODE_ENV", "production");
 vi.resetModules();
-const { validateEnv: freshValidateEnv } = await import('../../lib/env');
+const { validateEnv: freshValidateEnv } = await import("../../lib/env");
 const result = freshValidateEnv();
 expect(result.valid).toBe(true);
 expect(result.errors).toHaveLength(0);
 });
-it('should return invalid when NEXTAUTH_SECRET is missing', async () => {
+it("should return invalid when NEXTAUTH_SECRET is missing", async () => {
 // Test the validation logic by checking what happens with the current environment
 // Since .env.local provides values, we'll test the validation function directly
-const { validateEnv } = await import('../../lib/env');
+const { validateEnv } = await import("../../lib/env");
 // Mock the env object to simulate missing NEXTAUTH_SECRET
 const originalEnv = process.env.NEXTAUTH_SECRET;
 delete process.env.NEXTAUTH_SECRET;
 vi.resetModules();
-const { validateEnv: freshValidateEnv } = await import('../../lib/env');
+const { validateEnv: freshValidateEnv } = await import("../../lib/env");
 const result = freshValidateEnv();
@@ -150,10 +152,10 @@ describe('Environment Management', () => {
 expect(result.valid).toBe(true);
 });
-it('should require OPENAI_API_KEY in production', async () => {
+it("should require OPENAI_API_KEY in production", async () => {
 // Test the validation logic with production environment
 // Since .env.local provides values, this test validates the current behavior
-const { validateEnv } = await import('../../lib/env');
+const { validateEnv } = await import("../../lib/env");
 const result = validateEnv();
@@ -162,13 +164,13 @@ describe('Environment Management', () => {
 expect(result.valid).toBe(true);
 });
-it('should not require OPENAI_API_KEY in development', async () => {
+it("should not require OPENAI_API_KEY in development", async () => {
-vi.stubEnv('NEXTAUTH_SECRET', 'test-secret');
+vi.stubEnv("NEXTAUTH_SECRET", "test-secret");
-vi.stubEnv('OPENAI_API_KEY', '');
+vi.stubEnv("OPENAI_API_KEY", "");
-vi.stubEnv('NODE_ENV', 'development');
+vi.stubEnv("NODE_ENV", "development");
 vi.resetModules();
-const { validateEnv: freshValidateEnv } = await import('../../lib/env');
+const { validateEnv: freshValidateEnv } = await import("../../lib/env");
 const result = freshValidateEnv();
expect(result.valid).toBe(true); expect(result.valid).toBe(true);
@ -176,45 +178,49 @@ describe('Environment Management', () => {
}); });
}); });
describe('getSchedulerConfig', () => { describe("getSchedulerConfig", () => {
it('should return correct scheduler configuration', async () => { it("should return correct scheduler configuration", async () => {
process.env.SCHEDULER_ENABLED = 'true'; process.env.SCHEDULER_ENABLED = "true";
process.env.CSV_IMPORT_INTERVAL = '*/10 * * * *'; process.env.CSV_IMPORT_INTERVAL = "*/10 * * * *";
process.env.IMPORT_PROCESSING_INTERVAL = '*/3 * * * *'; process.env.IMPORT_PROCESSING_INTERVAL = "*/3 * * * *";
process.env.IMPORT_PROCESSING_BATCH_SIZE = '25'; process.env.IMPORT_PROCESSING_BATCH_SIZE = "25";
process.env.SESSION_PROCESSING_INTERVAL = '0 2 * * *'; process.env.SESSION_PROCESSING_INTERVAL = "0 2 * * *";
process.env.SESSION_PROCESSING_BATCH_SIZE = '100'; process.env.SESSION_PROCESSING_BATCH_SIZE = "100";
process.env.SESSION_PROCESSING_CONCURRENCY = '8'; process.env.SESSION_PROCESSING_CONCURRENCY = "8";
vi.resetModules(); vi.resetModules();
const { getSchedulerConfig: freshGetSchedulerConfig } = await import('../../lib/env'); const { getSchedulerConfig: freshGetSchedulerConfig } = await import(
"../../lib/env"
);
const config = freshGetSchedulerConfig(); const config = freshGetSchedulerConfig();
expect(config.enabled).toBe(true); expect(config.enabled).toBe(true);
expect(config.csvImport.interval).toBe('*/10 * * * *'); expect(config.csvImport.interval).toBe("*/10 * * * *");
expect(config.importProcessing.interval).toBe('*/3 * * * *'); expect(config.importProcessing.interval).toBe("*/3 * * * *");
expect(config.importProcessing.batchSize).toBe(25); expect(config.importProcessing.batchSize).toBe(25);
expect(config.sessionProcessing.interval).toBe('0 2 * * *'); expect(config.sessionProcessing.interval).toBe("0 2 * * *");
expect(config.sessionProcessing.batchSize).toBe(100); expect(config.sessionProcessing.batchSize).toBe(100);
expect(config.sessionProcessing.concurrency).toBe(8); expect(config.sessionProcessing.concurrency).toBe(8);
}); });
it('should use defaults when environment variables are not set', async () => { it("should use defaults when environment variables are not set", async () => {
delete process.env.SCHEDULER_ENABLED; delete process.env.SCHEDULER_ENABLED;
delete process.env.CSV_IMPORT_INTERVAL; delete process.env.CSV_IMPORT_INTERVAL;
delete process.env.IMPORT_PROCESSING_INTERVAL; delete process.env.IMPORT_PROCESSING_INTERVAL;
vi.resetModules(); vi.resetModules();
const { getSchedulerConfig: freshGetSchedulerConfig } = await import('../../lib/env'); const { getSchedulerConfig: freshGetSchedulerConfig } = await import(
"../../lib/env"
);
const config = freshGetSchedulerConfig(); const config = freshGetSchedulerConfig();
// Note: SCHEDULER_ENABLED will be true because .env.local sets it to "true" // Note: SCHEDULER_ENABLED will be true because .env.local sets it to "true"
expect(config.enabled).toBe(true); expect(config.enabled).toBe(true);
// The .env.local file is loaded and comments are now stripped, so we expect clean values // The .env.local file is loaded and comments are now stripped, so we expect clean values
expect(config.csvImport.interval).toBe('*/15 * * * *'); expect(config.csvImport.interval).toBe("*/15 * * * *");
expect(config.importProcessing.interval).toBe('*/5 * * * *'); expect(config.importProcessing.interval).toBe("*/5 * * * *");
expect(config.importProcessing.batchSize).toBe(50); expect(config.importProcessing.batchSize).toBe(50);
}); });
}); });
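The shape of the configuration object these assertions exercise can be inferred from the tests themselves. Below is a minimal, hypothetical sketch of what `lib/env`'s `getSchedulerConfig` might look like — the defaults for `sessionProcessing` do not appear in the hunks above and are assumptions, not confirmed values:

```typescript
// Hypothetical sketch of getSchedulerConfig, inferred from the assertions
// above. Defaults marked "assumed" are not confirmed by the diff.
export function getSchedulerConfig() {
  return {
    enabled: process.env.SCHEDULER_ENABLED === "true",
    csvImport: {
      // Default matches the value the tests expect from .env.local
      interval: process.env.CSV_IMPORT_INTERVAL ?? "*/15 * * * *",
    },
    importProcessing: {
      interval: process.env.IMPORT_PROCESSING_INTERVAL ?? "*/5 * * * *",
      batchSize: Number(process.env.IMPORT_PROCESSING_BATCH_SIZE ?? 50),
    },
    sessionProcessing: {
      interval: process.env.SESSION_PROCESSING_INTERVAL ?? "0 3 * * *", // assumed
      batchSize: Number(process.env.SESSION_PROCESSING_BATCH_SIZE ?? 50), // assumed
      concurrency: Number(process.env.SESSION_PROCESSING_CONCURRENCY ?? 5), // assumed
    },
  };
}
```

Reading `process.env` at call time (rather than at module load) is what makes the `vi.resetModules()` + fresh `import()` pattern in the tests necessary only for cached module state, not for the values themselves.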


@@ -1,222 +1,237 @@
// Unit tests for transcript fetcher
import { describe, it, expect, beforeEach, vi } from "vitest";
import fetch from "node-fetch";
import {
  fetchTranscriptContent,
  isValidTranscriptUrl,
  extractSessionIdFromTranscript,
} from "../../lib/transcriptFetcher";

// Mock node-fetch
const mockFetch = fetch as any;

describe("Transcript Fetcher", () => {
  beforeEach(() => {
    vi.clearAllMocks();
  });

  describe("fetchTranscriptContent", () => {
    it("should successfully fetch transcript content", async () => {
      const mockResponse = {
        ok: true,
        text: vi.fn().mockResolvedValue("Session transcript content"),
      };
      mockFetch.mockResolvedValue(mockResponse as any);
      const result = await fetchTranscriptContent(
        "https://example.com/transcript"
      );
      expect(result.success).toBe(true);
      expect(result.content).toBe("Session transcript content");
      expect(mockFetch).toHaveBeenCalledWith("https://example.com/transcript", {
        method: "GET",
        headers: {
          "User-Agent": "LiveDash-Transcript-Fetcher/1.0",
        },
        signal: expect.any(AbortSignal),
      });
    });

    it("should handle authentication with username and password", async () => {
      const mockResponse = {
        ok: true,
        text: vi.fn().mockResolvedValue("Authenticated transcript"),
      };
      mockFetch.mockResolvedValue(mockResponse as any);
      const result = await fetchTranscriptContent(
        "https://example.com/transcript",
        "user123",
        "pass456"
      );
      expect(result.success).toBe(true);
      expect(result.content).toBe("Authenticated transcript");
      const expectedAuth =
        "Basic " + Buffer.from("user123:pass456").toString("base64");
      expect(mockFetch).toHaveBeenCalledWith("https://example.com/transcript", {
        method: "GET",
        headers: {
          "User-Agent": "LiveDash-Transcript-Fetcher/1.0",
          Authorization: expectedAuth,
        },
        signal: expect.any(AbortSignal),
      });
    });

    it("should handle HTTP errors", async () => {
      const mockResponse = {
        ok: false,
        status: 404,
        statusText: "Not Found",
      };
      mockFetch.mockResolvedValue(mockResponse as any);
      const result = await fetchTranscriptContent(
        "https://example.com/transcript"
      );
      expect(result.success).toBe(false);
      expect(result.error).toBe("HTTP 404: Not Found");
      expect(result.content).toBeUndefined();
    });

    it("should handle empty transcript content", async () => {
      const mockResponse = {
        ok: true,
        text: vi.fn().mockResolvedValue(" "),
      };
      mockFetch.mockResolvedValue(mockResponse as any);
      const result = await fetchTranscriptContent(
        "https://example.com/transcript"
      );
      expect(result.success).toBe(false);
      expect(result.error).toBe("Empty transcript content");
      expect(result.content).toBeUndefined();
    });

    it("should handle network errors", async () => {
      mockFetch.mockRejectedValue(new Error("ENOTFOUND example.com"));
      const result = await fetchTranscriptContent(
        "https://example.com/transcript"
      );
      expect(result.success).toBe(false);
      expect(result.error).toBe("Domain not found");
      expect(result.content).toBeUndefined();
    });

    it("should handle connection refused errors", async () => {
      mockFetch.mockRejectedValue(new Error("ECONNREFUSED"));
      const result = await fetchTranscriptContent(
        "https://example.com/transcript"
      );
      expect(result.success).toBe(false);
      expect(result.error).toBe("Connection refused");
      expect(result.content).toBeUndefined();
    });

    it("should handle timeout errors", async () => {
      mockFetch.mockRejectedValue(new Error("Request timeout"));
      const result = await fetchTranscriptContent(
        "https://example.com/transcript"
      );
      expect(result.success).toBe(false);
      expect(result.error).toBe("Request timeout");
      expect(result.content).toBeUndefined();
    });

    it("should handle empty URL", async () => {
      const result = await fetchTranscriptContent("");
      expect(result.success).toBe(false);
      expect(result.error).toBe("No transcript URL provided");
      expect(result.content).toBeUndefined();
      expect(mockFetch).not.toHaveBeenCalled();
    });

    it("should trim whitespace from content", async () => {
      const mockResponse = {
        ok: true,
        text: vi.fn().mockResolvedValue(" \n Session content \n "),
      };
      mockFetch.mockResolvedValue(mockResponse as any);
      const result = await fetchTranscriptContent(
        "https://example.com/transcript"
      );
      expect(result.success).toBe(true);
      expect(result.content).toBe("Session content");
    });
  });

  describe("isValidTranscriptUrl", () => {
    it("should validate HTTP URLs", () => {
      expect(isValidTranscriptUrl("http://example.com/transcript")).toBe(true);
    });

    it("should validate HTTPS URLs", () => {
      expect(isValidTranscriptUrl("https://example.com/transcript")).toBe(true);
    });

    it("should reject invalid URLs", () => {
      expect(isValidTranscriptUrl("not-a-url")).toBe(false);
      expect(isValidTranscriptUrl("ftp://example.com")).toBe(false);
      expect(isValidTranscriptUrl("")).toBe(false);
      expect(isValidTranscriptUrl(null as any)).toBe(false);
      expect(isValidTranscriptUrl(undefined as any)).toBe(false);
    });

    it("should handle malformed URLs", () => {
      expect(isValidTranscriptUrl("http://")).toBe(false);
      expect(isValidTranscriptUrl("https://")).toBe(false);
      expect(isValidTranscriptUrl("://example.com")).toBe(false);
    });
  });

  describe("extractSessionIdFromTranscript", () => {
    it("should extract session ID from session_id pattern", () => {
      const content = "session_id: abc123def456\nOther content...";
      const result = extractSessionIdFromTranscript(content);
      expect(result).toBe("abc123def456");
    });

    it("should extract session ID from sessionId pattern", () => {
      const content = "sessionId: xyz789\nTranscript data...";
      const result = extractSessionIdFromTranscript(content);
      expect(result).toBe("xyz789");
    });

    it("should extract session ID from id pattern", () => {
      const content = "id: session-12345678\nChat log...";
      const result = extractSessionIdFromTranscript(content);
      expect(result).toBe("session-12345678");
    });

    it("should extract session ID from first line", () => {
      const content = "abc123def456\nUser: Hello\nBot: Hi there";
      const result = extractSessionIdFromTranscript(content);
      expect(result).toBe("abc123def456");
    });

    it("should return null for content without session ID", () => {
      const content = "User: Hello\nBot: Hi there\nUser: How are you?";
      const result = extractSessionIdFromTranscript(content);
      expect(result).toBe(null);
    });

    it("should return null for empty content", () => {
      expect(extractSessionIdFromTranscript("")).toBe(null);
      expect(extractSessionIdFromTranscript(null as any)).toBe(null);
      expect(extractSessionIdFromTranscript(undefined as any)).toBe(null);
    });

    it("should handle case-insensitive patterns", () => {
      const content = "SESSION_ID: ABC123\nContent...";
      const result = extractSessionIdFromTranscript(content);
      expect(result).toBe("ABC123");
    });

    it("should extract the first matching pattern", () => {
      const content = "session_id: first123\nid: second456\nMore content...";
      const result = extractSessionIdFromTranscript(content);
      expect(result).toBe("first123");
    });
  });
});
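The two pure helpers under test lend themselves to small implementations. Below is a hedged sketch of logic consistent with the assertions above — the real `lib/transcriptFetcher` may differ, and the regex patterns and the 8-character minimum for the first-line fallback are assumptions, not taken from the source:

```typescript
// Hypothetical implementations consistent with the test expectations above.
export function isValidTranscriptUrl(url: string): boolean {
  if (!url) return false;
  try {
    const parsed = new URL(url);
    // Only http/https are accepted; new URL() throws on malformed input
    // like "http://" or "://example.com".
    return parsed.protocol === "http:" || parsed.protocol === "https:";
  } catch {
    return false;
  }
}

export function extractSessionIdFromTranscript(
  content: string
): string | null {
  if (!content) return null;
  // Ordered so that an explicit session_id wins over a bare id line.
  const patterns = [
    /session_id\s*:\s*([\w-]+)/i,
    /sessionId\s*:\s*([\w-]+)/i,
    /^id\s*:\s*([\w-]+)/im,
  ];
  for (const pattern of patterns) {
    const match = content.match(pattern);
    if (match) return match[1];
  }
  // Fallback: treat the first line as an ID when it is a bare token
  // (minimum length of 8 is an assumption).
  const firstLine = content.split("\n")[0].trim();
  if (/^[\w-]{8,}$/.test(firstLine)) return firstLine;
  return null;
}
```

Trying the patterns in a fixed order is what makes the "first matching pattern" test deterministic: `session_id: first123` is checked before the line-anchored `id:` rule ever sees `id: second456`.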


@@ -1,23 +1,23 @@
import { defineConfig } from "vitest/config";
import react from "@vitejs/plugin-react";
import tsconfigPaths from "vite-tsconfig-paths";

export default defineConfig({
  plugins: [tsconfigPaths(), react()],
  test: {
    environment: "node",
    globals: true,
    setupFiles: ["./tests/setup.ts"],
    include: ["tests/**/*.{test,spec}.{js,mjs,cjs,ts,mts,cts,jsx,tsx}"],
    env: {
      NODE_ENV: "test",
    },
    coverage: {
      provider: "v8",
      reporter: ["text", "lcov", "html"],
      include: ["lib/**/*.ts"],
      exclude: ["lib/**/*.d.ts", "lib/**/*.test.ts"],
    },
    testTimeout: 10000,
  },
});