Mirror of https://github.com/kjanat/livedash-node.git, synced 2026-01-16 08:52:10 +01:00
Refactor code for improved readability and consistency
- Updated formatting in SessionDetails component for better readability.
- Enhanced documentation in scheduler-fixes.md to clarify issues and solutions.
- Improved error handling and logging in csvFetcher.js and processingScheduler.js.
- Standardized code formatting across various scripts and components for consistency.
- Added validation checks for CSV URLs and transcript content to prevent processing errors.
- Enhanced logging messages for better tracking of processing status and errors.
TODO.md: 130 lines changed
@@ -2,95 +2,107 @@

## Dashboard Integration

- [ ] **Resolve `GeographicMap.tsx` and `ResponseTimeDistribution.tsx` data simulation**

  - Investigate integrating real data sources with server-side analytics
  - Replace simulated data mentioned in `docs/dashboard-components.md`

## Component Specific

- [ ] **Implement robust emailing of temporary passwords**

  - File: `pages/api/dashboard/users.ts`
  - Set up proper email service integration

- [x] **Session page improvements** ✅

  - File: `app/dashboard/sessions/page.tsx`
  - Implemented pagination, advanced filtering, and sorting

## File Cleanup

- [x] **Remove backup files** ✅

  - Reviewed and removed `.bak` and `.new` files after integration
  - Cleaned up `GeographicMap.tsx.bak`, `SessionDetails.tsx.bak`, `SessionDetails.tsx.new`

## Database Schema Improvements

- [ ] **Update EndTime field**

  - Make `endTime` field nullable in Prisma schema to match TypeScript interfaces

- [ ] **Add database indices**

  - Add appropriate indices to improve query performance
  - Focus on dashboard metrics and session listing queries

- [ ] **Implement production email service**

  - Replace console logging in `lib/sendEmail.ts`
  - Consider providers: Nodemailer, SendGrid, AWS SES

## General Enhancements & Features

- [ ] **Real-time updates**

  - Implement for dashboard and session list
  - Consider WebSockets or Server-Sent Events

- [ ] **Data export functionality**

  - Allow users (especially admins) to export session data
  - Support CSV format initially

- [ ] **Customizable dashboard**

  - Allow users to customize dashboard view
  - Let users choose which metrics/charts are most important

## Testing & Quality Assurance

- [ ] **Comprehensive testing suite**

  - [ ] Unit tests for utility functions and API logic
  - [ ] Integration tests for API endpoints with database
  - [ ] End-to-end tests for user flows (Playwright or Cypress)

- [ ] **Error monitoring and logging**

  - Integrate robust error monitoring service (Sentry)
  - Enhance server-side logging

- [ ] **Accessibility improvements**

  - Review application against WCAG guidelines
  - Improve keyboard navigation and screen reader compatibility
  - Check color contrast ratios

## Security Enhancements

- [x] **Password reset functionality** ✅

  - Implemented secure password reset mechanism
  - Files: `app/forgot-password/page.tsx`, `app/reset-password/page.tsx`, `pages/api/forgot-password.ts`, `pages/api/reset-password.ts`

- [ ] **Two-Factor Authentication (2FA)**

  - Consider adding 2FA, especially for admin accounts

- [ ] **Input validation and sanitization**

  - Review all user inputs (API request bodies, query parameters)
  - Ensure proper validation and sanitization

## Code Quality & Development

- [ ] **Code review process**

  - Enforce code reviews for all changes

- [ ] **Environment configuration**

  - Ensure secure management of environment-specific configurations

- [ ] **Dependency management**

  - Periodically review dependencies for vulnerabilities
  - Keep dependencies updated

- [ ] **Documentation updates**

  - [ ] Ensure `docs/dashboard-components.md` reflects actual implementations
  - [ ] Verify "Dashboard Enhancements" are consistently applied
  - [ ] Update documentation for improved layout and visual hierarchies
@@ -30,18 +30,18 @@ export default function MessageViewer({ messages }: MessageViewerProps) {
        <div
          key={message.id}
          className={`flex ${
            message.role.toLowerCase() === "user"
              ? "justify-end"
              : "justify-start"
          }`}
        >
          <div
            className={`max-w-xs lg:max-w-md px-4 py-2 rounded-lg ${
              message.role.toLowerCase() === "user"
                ? "bg-blue-500 text-white"
                : message.role.toLowerCase() === "assistant"
                  ? "bg-gray-200 text-gray-800"
                  : "bg-yellow-100 text-yellow-800"
            }`}
          >
            <div className="flex items-center justify-between mb-1">
@@ -66,7 +66,8 @@ export default function MessageViewer({ messages }: MessageViewerProps) {
          First message: {new Date(messages[0].timestamp).toLocaleString()}
        </span>
        <span>
          Last message:{" "}
          {new Date(messages[messages.length - 1].timestamp).toLocaleString()}
        </span>
      </div>
    </div>
@@ -161,7 +161,9 @@ export default function SessionDetails({ session }: SessionDetailsProps) {
        {session.ipAddress && (
          <div className="flex justify-between border-b pb-2">
            <span className="text-gray-600">IP Address:</span>
            <span className="font-medium font-mono text-sm">
              {session.ipAddress}
            </span>
          </div>
        )}
@@ -3,25 +3,31 @@

## Issues Identified and Resolved

### 1. Invalid Company Configuration

**Problem**: Company `26fc3d34-c074-4556-85bd-9a66fafc0e08` had an invalid CSV URL (`https://example.com/data.csv`) with no authentication credentials.

**Solution**:

- Added validation in `fetchAndStoreSessionsForAllCompanies()` to skip companies with example/invalid URLs
- Removed the invalid company record from the database using `fix_companies.js`

### 2. Transcript Fetching Errors

**Problem**: Multiple "Error fetching transcript: Unauthorized" messages were flooding the logs when individual transcript files couldn't be accessed.

**Solution**:

- Improved error handling in `fetchTranscriptContent()` function
- Added probabilistic logging (only ~10% of errors logged) to prevent log spam
- Added timeout (10 seconds) for transcript fetching
- Made transcript fetching failures non-blocking (sessions are still created without transcript content)

### 3. CSV Fetching Errors

**Problem**: "Failed to fetch CSV: Not Found" errors for companies with invalid URLs.

**Solution**:

- Added URL validation to skip companies with `example.com` URLs
- Improved error logging to be more descriptive
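The transcript-fetch hardening described above can be sketched as follows. This is a minimal illustration, not the exact source: it assumes Node 18+ (global `fetch`, `AbortController`, `Buffer`) and combines the 10-second timeout, Basic-auth credentials, sampled (~10%) warning logs, and the non-blocking `null` return on failure.

```javascript
// Sketch of hardened transcript fetching: abort after 10 s, log only
// ~10% of failures, and never throw (callers get null instead).
async function fetchTranscriptContentSketch(url, username, password) {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), 10_000); // 10 s timeout
  try {
    const auth = Buffer.from(`${username}:${password}`).toString("base64");
    const response = await fetch(url, {
      headers: { Authorization: `Basic ${auth}` },
      signal: controller.signal,
    });
    if (!response.ok) {
      if (Math.random() < 0.1) {
        // Sampled logging keeps a batch of 401s from flooding the logs
        console.warn(`[CSV] Transcript fetch failed for ${url}: ${response.status}`);
      }
      return null; // non-blocking: the session is still created without a transcript
    }
    return await response.text();
  } catch (error) {
    if (Math.random() < 0.1) {
      console.warn(`[CSV] Transcript fetch error for ${url}:`, error.message);
    }
    return null;
  } finally {
    clearTimeout(timer);
  }
}
```

A refused connection, a timeout, and a 401 all resolve to `null` rather than rejecting, which matches the "failures are non-blocking" behavior described above.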
@@ -35,6 +41,7 @@

## Remaining Companies

After cleanup, only valid companies remain:

- **Demo Company** (`790b9233-d369-451f-b92c-f4dceb42b649`)
  - CSV URL: `https://proto.notso.ai/jumbo/chats`
  - Has valid authentication credentials
@@ -43,6 +50,7 @@ After cleanup, only valid companies remain:

## Files Modified

1. **lib/csvFetcher.js**

   - Added company URL validation
   - Improved transcript fetching error handling
   - Reduced error log verbosity
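The company URL validation added to `lib/csvFetcher.js` can be sketched like this. `isValidCsvUrl` is an illustrative helper name, not necessarily a function in the actual source; the check mirrors the behavior described above (skip missing, unparseable, or `example.com` placeholder URLs before any network call).

```javascript
// Sketch of the CSV URL validation: reject missing URLs, malformed URLs,
// and placeholder example.com URLs before attempting a fetch.
function isValidCsvUrl(csvUrl) {
  if (!csvUrl) return false; // no CSV URL configured for this company
  let parsed;
  try {
    parsed = new URL(csvUrl);
  } catch {
    return false; // not a parseable URL at all
  }
  // Skip placeholder URLs such as https://example.com/data.csv
  if (parsed.hostname === "example.com" || parsed.hostname.endsWith(".example.com")) {
    return false;
  }
  return true;
}

console.log(isValidCsvUrl("https://example.com/data.csv")); // false
console.log(isValidCsvUrl("https://proto.notso.ai/jumbo/chats")); // true
console.log(isValidCsvUrl(null)); // false
```

Centralizing the guard in one predicate keeps the scheduler loop readable: each company is either skipped with a single log line or processed, as in the `continue` branches shown in the diff below.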
@@ -410,15 +410,19 @@ async function fetchTranscriptContent(url, username, password) {

    if (!response.ok) {
      // Only log error once per batch, not for every transcript
      if (Math.random() < 0.1) {
        // Log ~10% of errors to avoid spam
        console.warn(
          `[CSV] Transcript fetch failed for ${url}: ${response.status} ${response.statusText}`
        );
      }
      return null;
    }
    return await response.text();
  } catch (error) {
    // Only log error once per batch, not for every transcript
    if (Math.random() < 0.1) {
      // Log ~10% of errors to avoid spam
      console.warn(`[CSV] Transcript fetch error for ${url}:`, error.message);
    }
    return null;
@@ -505,13 +509,20 @@ export async function fetchAndStoreSessionsForAllCompanies() {

    for (const company of companies) {
      if (!company.csvUrl) {
        console.log(
          `[Scheduler] Skipping company ${company.id} - no CSV URL configured`
        );
        continue;
      }

      // Skip companies with invalid/example URLs
      if (
        company.csvUrl.includes("example.com") ||
        company.csvUrl === "https://example.com/data.csv"
      ) {
        console.log(
          `[Scheduler] Skipping company ${company.id} - invalid/example CSV URL: ${company.csvUrl}`
        );
        continue;
      }
@@ -581,11 +592,17 @@ export async function fetchAndStoreSessionsForAllCompanies() {
          country: session.country || null,
          language: session.language || null,
          messagesSent:
            typeof session.messagesSent === "number"
              ? session.messagesSent
              : 0,
          sentiment:
            typeof session.sentiment === "number"
              ? session.sentiment
              : null,
          escalated:
            typeof session.escalated === "boolean"
              ? session.escalated
              : null,
          forwardedHr:
            typeof session.forwardedHr === "boolean"
              ? session.forwardedHr
@@ -596,9 +613,12 @@ export async function fetchAndStoreSessionsForAllCompanies() {
            typeof session.avgResponseTime === "number"
              ? session.avgResponseTime
              : null,
          tokens:
            typeof session.tokens === "number" ? session.tokens : null,
          tokensEur:
            typeof session.tokensEur === "number"
              ? session.tokensEur
              : null,
          category: session.category || null,
          initialMsg: session.initialMsg || null,
        },
@@ -607,9 +627,14 @@ export async function fetchAndStoreSessionsForAllCompanies() {
        addedCount++;
      }

      console.log(
        `[Scheduler] Added ${addedCount} new sessions for company ${company.id}`
      );
    } catch (error) {
      console.error(
        `[Scheduler] Error processing company ${company.id}:`,
        error
      );
    }
  }
} catch (error) {
@@ -126,7 +126,9 @@ function validateOpenAIResponse(data) {

  // Validate field types
  if (typeof data.language !== "string" || !/^[a-z]{2}$/.test(data.language)) {
    throw new Error(
      "Invalid language format. Expected ISO 639-1 code (e.g., 'en')"
    );
  }

  if (typeof data.messages_sent !== "number" || data.messages_sent < 0) {
@@ -134,7 +136,9 @@ function validateOpenAIResponse(data) {
  }

  if (!["positive", "neutral", "negative"].includes(data.sentiment)) {
    throw new Error(
      "Invalid sentiment. Expected 'positive', 'neutral', or 'negative'"
    );
  }

  if (typeof data.escalated !== "boolean") {
@@ -162,15 +166,23 @@ function validateOpenAIResponse(data) {
  ];

  if (!validCategories.includes(data.category)) {
    throw new Error(
      `Invalid category. Expected one of: ${validCategories.join(", ")}`
    );
  }

  if (!Array.isArray(data.questions)) {
    throw new Error("Invalid questions. Expected array of strings");
  }

  if (
    typeof data.summary !== "string" ||
    data.summary.length < 10 ||
    data.summary.length > 300
  ) {
    throw new Error(
      "Invalid summary. Expected string between 10-300 characters"
    );
  }

  if (typeof data.session_id !== "string") {
@@ -182,7 +194,9 @@ function validateOpenAIResponse(data) {
 * Process unprocessed sessions
 */
export async function processUnprocessedSessions() {
  process.stdout.write(
    "[ProcessingScheduler] Starting to process unprocessed sessions...\n"
  );

  // Find sessions that have messages but haven't been processed
  const sessionsToProcess = await prisma.session.findMany({
@@ -193,43 +207,58 @@ export async function processUnprocessedSessions() {
    },
    include: {
      messages: {
        orderBy: { order: "asc" },
      },
    },
    take: 10, // Process in batches to avoid overloading the system
  });

  // Filter to only sessions that have messages
  const sessionsWithMessages = sessionsToProcess.filter(
    (session) => session.messages.length > 0
  );

  if (sessionsWithMessages.length === 0) {
    process.stdout.write(
      "[ProcessingScheduler] No sessions found requiring processing.\n"
    );
    return;
  }

  process.stdout.write(
    `[ProcessingScheduler] Found ${sessionsWithMessages.length} sessions to process.\n`
  );
  let successCount = 0;
  let errorCount = 0;

  for (const session of sessionsWithMessages) {
    if (session.messages.length === 0) {
      process.stderr.write(
        `[ProcessingScheduler] Session ${session.id} has no messages, skipping.\n`
      );
      continue;
    }

    process.stdout.write(
      `[ProcessingScheduler] Processing messages for session ${session.id}...\n`
    );
    try {
      // Convert messages back to transcript format for OpenAI processing
      const transcript = session.messages
        .map(
          (msg) =>
            `[${new Date(msg.timestamp)
              .toLocaleString("en-GB", {
                day: "2-digit",
                month: "2-digit",
                year: "numeric",
                hour: "2-digit",
                minute: "2-digit",
                second: "2-digit",
              })
              .replace(",", "")}] ${msg.role}: ${msg.content}`
        )
        .join("\n");

      const processedData = await processTranscriptWithOpenAI(
        session.id,
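As a quick illustration of the en-GB timestamp formatting used when rebuilding the transcript above: two-digit day/month/hour/minute/second, numeric year, and the locale's comma stripped. The `timeZone: "UTC"` option is an addition for this sketch only (so the sample is deterministic); the scheduler itself formats in the server's local time.

```javascript
// Illustrative run of the transcript timestamp formatting:
// en-GB locale -> dd/mm/yyyy, hh:mm:ss, then the comma is removed.
const stamp = new Date("2025-06-01T14:05:09Z")
  .toLocaleString("en-GB", {
    day: "2-digit",
    month: "2-digit",
    year: "numeric",
    hour: "2-digit",
    minute: "2-digit",
    second: "2-digit",
    timeZone: "UTC", // pinned here (sketch only) so the output is stable
  })
  .replace(",", "");
console.log(`[${stamp}] user: Hello`);
```

The resulting line has the `[dd/mm/yyyy hh:mm:ss] role: content` shape that the OpenAI processing step expects as its transcript format.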
@@ -260,17 +289,25 @@ export async function processUnprocessedSessions() {
        },
      });

      process.stdout.write(
        `[ProcessingScheduler] Successfully processed session ${session.id}.\n`
      );
      successCount++;
    } catch (error) {
      process.stderr.write(
        `[ProcessingScheduler] Error processing session ${session.id}: ${error}\n`
      );
      errorCount++;
    }
  }

  process.stdout.write("[ProcessingScheduler] Session processing complete.\n");
  process.stdout.write(
    `[ProcessingScheduler] Successfully processed: ${successCount} sessions.\n`
  );
  process.stdout.write(
    `[ProcessingScheduler] Failed to process: ${errorCount} sessions.\n`
  );
}

/**
@@ -282,9 +319,13 @@ export function startProcessingScheduler() {
    try {
      await processUnprocessedSessions();
    } catch (error) {
      process.stderr.write(
        `[ProcessingScheduler] Error in scheduler: ${error}\n`
      );
    }
  });

  process.stdout.write(
    "[ProcessingScheduler] Started processing scheduler (runs hourly).\n"
  );
}
@@ -103,7 +103,7 @@ async function processTranscriptWithOpenAI(
    throw new Error(`OpenAI API error: ${response.status} - ${errorText}`);
  }

  const data = (await response.json()) as any;
  const processedData = JSON.parse(data.choices[0].message.content);

  // Validate the response against our expected schema
@@ -120,7 +120,9 @@ async function processTranscriptWithOpenAI(
 * Validates the OpenAI response against our expected schema
 * @param data The data to validate
 */
function validateOpenAIResponse(
  data: any
): asserts data is OpenAIProcessedData {
  // Check required fields
  const requiredFields = [
    "language",
@@ -142,7 +144,9 @@ function validateOpenAIResponse(data: any): asserts data is OpenAIProcessedData

  // Validate field types
  if (typeof data.language !== "string" || !/^[a-z]{2}$/.test(data.language)) {
    throw new Error(
      "Invalid language format. Expected ISO 639-1 code (e.g., 'en')"
    );
  }

  if (typeof data.messages_sent !== "number" || data.messages_sent < 0) {
@@ -150,7 +154,9 @@ function validateOpenAIResponse(data: any): asserts data is OpenAIProcessedData
  }

  if (!["positive", "neutral", "negative"].includes(data.sentiment)) {
    throw new Error(
      "Invalid sentiment. Expected 'positive', 'neutral', or 'negative'"
    );
  }

  if (typeof data.escalated !== "boolean") {
@@ -178,15 +184,23 @@ function validateOpenAIResponse(data: any): asserts data is OpenAIProcessedData
  ];

  if (!validCategories.includes(data.category)) {
    throw new Error(
      `Invalid category. Expected one of: ${validCategories.join(", ")}`
    );
  }

  if (!Array.isArray(data.questions)) {
    throw new Error("Invalid questions. Expected array of strings");
  }

  if (
    typeof data.summary !== "string" ||
    data.summary.length < 10 ||
    data.summary.length > 300
  ) {
    throw new Error(
      "Invalid summary. Expected string between 10-300 characters"
    );
  }

  if (typeof data.session_id !== "string") {
@ -198,7 +212,9 @@ function validateOpenAIResponse(data: any): asserts data is OpenAIProcessedData
|
|||||||
* Process unprocessed sessions
|
* Process unprocessed sessions
|
||||||
*/
|
*/
|
||||||
async function processUnprocessedSessions() {
|
async function processUnprocessedSessions() {
|
||||||
process.stdout.write("[ProcessingScheduler] Starting to process unprocessed sessions...\n");
|
process.stdout.write(
|
||||||
|
"[ProcessingScheduler] Starting to process unprocessed sessions...\n"
|
||||||
|
);
|
||||||
|
|
||||||
// Find sessions that have transcript content but haven't been processed
|
// Find sessions that have transcript content but haven't been processed
|
||||||
const sessionsToProcess = await prisma.session.findMany({
|
const sessionsToProcess = await prisma.session.findMany({
|
||||||
@ -217,22 +233,30 @@ async function processUnprocessedSessions() {
|
|||||||
});
|
});
|
||||||
|
|
||||||
if (sessionsToProcess.length === 0) {
|
if (sessionsToProcess.length === 0) {
|
||||||
process.stdout.write("[ProcessingScheduler] No sessions found requiring processing.\n");
|
process.stdout.write(
|
||||||
|
"[ProcessingScheduler] No sessions found requiring processing.\n"
|
||||||
|
);
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
|
||||||
process.stdout.write(`[ProcessingScheduler] Found ${sessionsToProcess.length} sessions to process.\n`);
|
process.stdout.write(
|
||||||
|
`[ProcessingScheduler] Found ${sessionsToProcess.length} sessions to process.\n`
|
||||||
|
);
|
||||||
let successCount = 0;
|
let successCount = 0;
|
||||||
let errorCount = 0;
|
let errorCount = 0;
|
||||||
|
|
||||||
for (const session of sessionsToProcess) {
|
for (const session of sessionsToProcess) {
|
||||||
if (!session.transcriptContent) {
|
if (!session.transcriptContent) {
|
||||||
// Should not happen due to query, but good for type safety
|
// Should not happen due to query, but good for type safety
|
||||||
process.stderr.write(`[ProcessingScheduler] Session ${session.id} has no transcript content, skipping.\n`);
|
process.stderr.write(
|
||||||
|
`[ProcessingScheduler] Session ${session.id} has no transcript content, skipping.\n`
|
||||||
|
);
|
||||||
continue;
|
continue;
|
||||||
}
|
}
|
||||||
|
|
||||||
process.stdout.write(`[ProcessingScheduler] Processing transcript for session ${session.id}...\n`);
|
process.stdout.write(
|
||||||
|
`[ProcessingScheduler] Processing transcript for session ${session.id}...\n`
|
||||||
|
);
|
||||||
try {
|
try {
|
||||||
const processedData = await processTranscriptWithOpenAI(
|
const processedData = await processTranscriptWithOpenAI(
|
||||||
session.id,
|
session.id,
|
||||||
@ -263,17 +287,25 @@ async function processUnprocessedSessions() {
|
|||||||
},
|
},
|
||||||
});
|
});
|
||||||
|
|
||||||
process.stdout.write(`[ProcessingScheduler] Successfully processed session ${session.id}.\n`);
|
process.stdout.write(
|
||||||
|
`[ProcessingScheduler] Successfully processed session ${session.id}.\n`
|
||||||
|
);
|
||||||
successCount++;
|
successCount++;
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
process.stderr.write(`[ProcessingScheduler] Error processing session ${session.id}: ${error}\n`);
|
process.stderr.write(
|
||||||
|
`[ProcessingScheduler] Error processing session ${session.id}: ${error}\n`
|
||||||
|
);
|
||||||
errorCount++;
|
errorCount++;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
process.stdout.write("[ProcessingScheduler] Session processing complete.\n");
|
process.stdout.write("[ProcessingScheduler] Session processing complete.\n");
|
||||||
process.stdout.write(`[ProcessingScheduler] Successfully processed: ${successCount} sessions.\n`);
|
process.stdout.write(
|
||||||
process.stdout.write(`[ProcessingScheduler] Failed to process: ${errorCount} sessions.\n`);
|
`[ProcessingScheduler] Successfully processed: ${successCount} sessions.\n`
|
||||||
|
);
|
||||||
|
process.stdout.write(
|
||||||
|
`[ProcessingScheduler] Failed to process: ${errorCount} sessions.\n`
|
||||||
|
);
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
@ -285,9 +317,13 @@ export function startProcessingScheduler() {
|
|||||||
try {
|
try {
|
||||||
await processUnprocessedSessions();
|
await processUnprocessedSessions();
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
process.stderr.write(`[ProcessingScheduler] Error in scheduler: ${error}\n`);
|
process.stderr.write(
|
||||||
|
`[ProcessingScheduler] Error in scheduler: ${error}\n`
|
||||||
|
);
|
||||||
}
|
}
|
||||||
});
|
});
|
||||||
|
|
||||||
process.stdout.write("[ProcessingScheduler] Started processing scheduler (runs hourly).\n");
|
process.stdout.write(
|
||||||
|
"[ProcessingScheduler] Started processing scheduler (runs hourly).\n"
|
||||||
|
);
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -31,5 +31,7 @@ export function startScheduler() {
     }
   });

-  console.log("[Scheduler] Started session refresh scheduler (runs every 15 minutes).");
+  console.log(
+    "[Scheduler] Started session refresh scheduler (runs every 15 minutes)."
+  );
 }
@@ -10,17 +10,20 @@ const prisma = new PrismaClient();
  */
 export function parseChatLogToJSON(logString) {
   // Convert to string if it's not already
-  const stringData = typeof logString === 'string' ? logString : String(logString);
+  const stringData =
+    typeof logString === "string" ? logString : String(logString);

   // Split by lines and filter out empty lines
-  const lines = stringData.split('\n').filter(line => line.trim() !== '');
+  const lines = stringData.split("\n").filter((line) => line.trim() !== "");

   const messages = [];
   let currentMessage = null;

   for (const line of lines) {
     // Check if line starts with a timestamp pattern [DD.MM.YYYY HH:MM:SS]
-    const timestampMatch = line.match(/^\[(\d{2}\.\d{2}\.\d{4} \d{2}:\d{2}:\d{2})\] (.+?): (.*)$/);
+    const timestampMatch = line.match(
+      /^\[(\d{2}\.\d{2}\.\d{4} \d{2}:\d{2}:\d{2})\] (.+?): (.*)$/
+    );

     if (timestampMatch) {
       // If we have a previous message, push it to the array
@@ -32,9 +35,9 @@ export function parseChatLogToJSON(logString) {
       const [, timestamp, sender, content] = timestampMatch;

       // Convert DD.MM.YYYY HH:MM:SS to ISO format
-      const [datePart, timePart] = timestamp.split(' ');
-      const [day, month, year] = datePart.split('.');
-      const [hour, minute, second] = timePart.split(':');
+      const [datePart, timePart] = timestamp.split(" ");
+      const [day, month, year] = datePart.split(".");
+      const [hour, minute, second] = timePart.split(":");

       const dateObject = new Date(year, month - 1, day, hour, minute, second);

@@ -42,11 +45,11 @@ export function parseChatLogToJSON(logString) {
       currentMessage = {
         timestamp: dateObject.toISOString(),
         role: sender,
-        content: content
+        content: content,
       };
     } else if (currentMessage) {
       // This is a continuation of the previous message (multiline)
-      currentMessage.content += '\n' + line;
+      currentMessage.content += "\n" + line;
     }
   }

@@ -67,7 +70,7 @@ export function parseChatLogToJSON(logString) {
       // This puts "User" before "Assistant" when timestamps are the same
       return b.role.localeCompare(a.role);
     }),
-    totalMessages: messages.length
+    totalMessages: messages.length,
   };
 }

@@ -80,7 +83,7 @@ export async function storeMessagesForSession(sessionId, messages) {
   try {
     // First, delete any existing messages for this session
     await prisma.message.deleteMany({
-      where: { sessionId }
+      where: { sessionId },
     });

     // Then insert the new messages
@@ -89,19 +92,23 @@ export async function storeMessagesForSession(sessionId, messages) {
       timestamp: new Date(message.timestamp),
       role: message.role,
       content: message.content,
-      order: index
+      order: index,
     }));

     if (messageData.length > 0) {
       await prisma.message.createMany({
-        data: messageData
+        data: messageData,
       });
     }

-    process.stdout.write(`[TranscriptParser] Stored ${messageData.length} messages for session ${sessionId}\n`);
+    process.stdout.write(
+      `[TranscriptParser] Stored ${messageData.length} messages for session ${sessionId}\n`
+    );
     return messageData.length;
   } catch (error) {
-    process.stderr.write(`[TranscriptParser] Error storing messages for session ${sessionId}: ${error}\n`);
+    process.stderr.write(
+      `[TranscriptParser] Error storing messages for session ${sessionId}: ${error}\n`
+    );
     throw error;
   }
 }
@@ -112,9 +119,12 @@ export async function storeMessagesForSession(sessionId, messages) {
  * @param {string} transcriptContent - Raw transcript content
  * @returns {Promise<Object>} Processing result with message count
  */
-export async function processTranscriptForSession(sessionId, transcriptContent) {
-  if (!transcriptContent || transcriptContent.trim() === '') {
-    throw new Error('No transcript content provided');
+export async function processTranscriptForSession(
+  sessionId,
+  transcriptContent
+) {
+  if (!transcriptContent || transcriptContent.trim() === "") {
+    throw new Error("No transcript content provided");
   }

   try {
@@ -122,16 +132,21 @@ export async function processTranscriptForSession(sessionId, transcriptContent)
     const parsed = parseChatLogToJSON(transcriptContent);

     // Store messages in database
-    const messageCount = await storeMessagesForSession(sessionId, parsed.messages);
+    const messageCount = await storeMessagesForSession(
+      sessionId,
+      parsed.messages
+    );

     return {
       sessionId,
       messageCount,
       totalMessages: parsed.totalMessages,
-      success: true
+      success: true,
     };
   } catch (error) {
-    process.stderr.write(`[TranscriptParser] Error processing transcript for session ${sessionId}: ${error}\n`);
+    process.stderr.write(
+      `[TranscriptParser] Error processing transcript for session ${sessionId}: ${error}\n`
+    );
     throw error;
   }
 }
@@ -140,7 +155,9 @@ export async function processTranscriptForSession(sessionId, transcriptContent)
  * Processes transcripts for all sessions that have transcript content but no parsed messages
  */
 export async function processAllUnparsedTranscripts() {
-  process.stdout.write("[TranscriptParser] Starting to process unparsed transcripts...\n");
+  process.stdout.write(
+    "[TranscriptParser] Starting to process unparsed transcripts...\n"
+  );

   try {
     // Find sessions with transcript content but no messages
@@ -149,42 +166,58 @@ export async function processAllUnparsedTranscripts() {
         AND: [
           { transcriptContent: { not: null } },
           { transcriptContent: { not: "" } },
-        ]
+        ],
       },
       include: {
-        messages: true
-      }
+        messages: true,
+      },
     });

     // Filter to only sessions without messages
-    const unparsedSessions = sessionsToProcess.filter(session => session.messages.length === 0);
+    const unparsedSessions = sessionsToProcess.filter(
+      (session) => session.messages.length === 0
+    );

     if (unparsedSessions.length === 0) {
-      process.stdout.write("[TranscriptParser] No unparsed transcripts found.\n");
+      process.stdout.write(
+        "[TranscriptParser] No unparsed transcripts found.\n"
+      );
       return { processed: 0, errors: 0 };
     }

-    process.stdout.write(`[TranscriptParser] Found ${unparsedSessions.length} sessions with unparsed transcripts.\n`);
+    process.stdout.write(
+      `[TranscriptParser] Found ${unparsedSessions.length} sessions with unparsed transcripts.\n`
+    );

     let successCount = 0;
     let errorCount = 0;

     for (const session of unparsedSessions) {
       try {
-        const result = await processTranscriptForSession(session.id, session.transcriptContent);
-        process.stdout.write(`[TranscriptParser] Processed session ${session.id}: ${result.messageCount} messages\n`);
+        const result = await processTranscriptForSession(
+          session.id,
+          session.transcriptContent
+        );
+        process.stdout.write(
+          `[TranscriptParser] Processed session ${session.id}: ${result.messageCount} messages\n`
+        );
         successCount++;
       } catch (error) {
-        process.stderr.write(`[TranscriptParser] Failed to process session ${session.id}: ${error}\n`);
+        process.stderr.write(
+          `[TranscriptParser] Failed to process session ${session.id}: ${error}\n`
+        );
         errorCount++;
       }
     }

-    process.stdout.write(`[TranscriptParser] Completed processing. Success: ${successCount}, Errors: ${errorCount}\n`);
+    process.stdout.write(
+      `[TranscriptParser] Completed processing. Success: ${successCount}, Errors: ${errorCount}\n`
+    );
     return { processed: successCount, errors: errorCount };

   } catch (error) {
-    process.stderr.write(`[TranscriptParser] Error in processAllUnparsedTranscripts: ${error}\n`);
+    process.stderr.write(
+      `[TranscriptParser] Error in processAllUnparsedTranscripts: ${error}\n`
+    );
     throw error;
   }
 }
@@ -198,12 +231,14 @@ export async function getMessagesForSession(sessionId) {
   try {
     const messages = await prisma.message.findMany({
       where: { sessionId },
-      orderBy: { order: 'asc' }
+      orderBy: { order: "asc" },
     });

     return messages;
   } catch (error) {
-    process.stderr.write(`[TranscriptParser] Error getting messages for session ${sessionId}: ${error}\n`);
+    process.stderr.write(
+      `[TranscriptParser] Error getting messages for session ${sessionId}: ${error}\n`
+    );
     throw error;
   }
 }
@@ -21,9 +21,9 @@ export default async function handler(
       where: { id },
       include: {
         messages: {
-          orderBy: { order: 'asc' }
-        }
-      }
+          orderBy: { order: "asc" },
+        },
+      },
     });

     if (!prismaSession) {
@@ -63,15 +63,16 @@ export default async function handler(
       processed: prismaSession.processed ?? null, // New field
       questions: prismaSession.questions ?? null, // New field
       summary: prismaSession.summary ?? null, // New field
-      messages: prismaSession.messages?.map(msg => ({
-        id: msg.id,
-        sessionId: msg.sessionId,
-        timestamp: new Date(msg.timestamp),
-        role: msg.role,
-        content: msg.content,
-        order: msg.order,
-        createdAt: new Date(msg.createdAt)
-      })) ?? [], // New field - parsed messages
+      messages:
+        prismaSession.messages?.map((msg) => ({
+          id: msg.id,
+          sessionId: msg.sessionId,
+          timestamp: new Date(msg.timestamp),
+          role: msg.role,
+          content: msg.content,
+          order: msg.order,
+          createdAt: new Date(msg.createdAt),
+        })) ?? [], // New field - parsed messages
     };

     return res.status(200).json({ session });
@@ -48,7 +48,7 @@ async function triggerProcessingScheduler() {
   });

   console.log(`Found ${sessionsToProcess.length} sessions to process:`);
-  sessionsToProcess.forEach(session => {
+  sessionsToProcess.forEach((session) => {
     console.log(`- Session ${session.id}: processed=${session.processed}`);
   });

@@ -58,7 +58,9 @@ async function triggerProcessingScheduler() {
   }

   // Import and run the processing function
-  const { processUnprocessedSessions } = await import("../lib/processingScheduler.js");
+  const { processUnprocessedSessions } = await import(
+    "../lib/processingScheduler.js"
+  );
   await processUnprocessedSessions();

   console.log("✅ Processing scheduler completed");
@@ -74,7 +76,9 @@ async function triggerTranscriptParsing() {
   console.log("=== Manual Transcript Parsing Trigger ===");
   try {
     const result = await processAllUnparsedTranscripts();
-    console.log(`✅ Transcript parsing completed: ${result.processed} processed, ${result.errors} errors`);
+    console.log(
+      `✅ Transcript parsing completed: ${result.processed} processed, ${result.errors} errors`
+    );
   } catch (error) {
     console.error("❌ Transcript parsing failed:", error);
   }
@@ -89,25 +93,22 @@ async function showProcessingStatus() {
   try {
     const totalSessions = await prisma.session.count();
     const processedSessions = await prisma.session.count({
-      where: { processed: true }
+      where: { processed: true },
     });
     const unprocessedSessions = await prisma.session.count({
-      where: { processed: { not: true } }
+      where: { processed: { not: true } },
     });
     const withMessages = await prisma.session.count({
       where: {
         messages: {
-          some: {}
-        }
-      }
+          some: {},
+        },
+      },
     });
     const readyForProcessing = await prisma.session.count({
       where: {
-        AND: [
-          { messages: { some: {} } },
-          { processed: { not: true } }
-        ]
-      }
+        AND: [{ messages: { some: {} } }, { processed: { not: true } }],
+      },
     });

     console.log(`📊 Total sessions: ${totalSessions}`);
@@ -121,24 +122,22 @@ async function showProcessingStatus() {
       console.log("\n📋 Sample unprocessed sessions:");
       const samples = await prisma.session.findMany({
         where: {
-          AND: [
-            { messages: { some: {} } },
-            { processed: { not: true } }
-          ]
+          AND: [{ messages: { some: {} } }, { processed: { not: true } }],
         },
         select: {
           id: true,
           processed: true,
           startTime: true,
         },
-        take: 3
+        take: 3,
       });

-      samples.forEach(session => {
-        console.log(`- ${session.id} (${session.startTime.toISOString()}) - processed: ${session.processed}`);
+      samples.forEach((session) => {
+        console.log(
+          `- ${session.id} (${session.startTime.toISOString()}) - processed: ${session.processed}`
+        );
       });
     }

   } catch (error) {
     console.error("❌ Failed to get processing status:", error);
   }
@@ -148,24 +147,24 @@ async function showProcessingStatus() {
 const command = process.argv[2];

 switch (command) {
-  case 'refresh':
+  case "refresh":
     await triggerSessionRefresh();
     break;
-  case 'process':
+  case "process":
     await triggerProcessingScheduler();
     break;
-  case 'parse':
+  case "parse":
     await triggerTranscriptParsing();
     break;
-  case 'status':
+  case "status":
     await showProcessingStatus();
     break;
-  case 'both':
+  case "both":
     await triggerSessionRefresh();
     console.log("\n" + "=".repeat(50) + "\n");
     await triggerProcessingScheduler();
     break;
-  case 'all':
+  case "all":
     await triggerSessionRefresh();
     console.log("\n" + "=".repeat(50) + "\n");
     await triggerTranscriptParsing();
@@ -175,9 +174,13 @@ switch (command) {
   default:
     console.log("Usage: node scripts/manual-triggers.js [command]");
     console.log("Commands:");
-    console.log(" refresh - Trigger session refresh (fetch new sessions from CSV)");
+    console.log(
+      " refresh - Trigger session refresh (fetch new sessions from CSV)"
+    );
     console.log(" parse - Parse transcripts into structured messages");
-    console.log(" process - Trigger processing scheduler (process unprocessed sessions)");
+    console.log(
+      " process - Trigger processing scheduler (process unprocessed sessions)"
+    );
     console.log(" status - Show current processing status");
     console.log(" both - Run both refresh and processing");
     console.log(" all - Run refresh, parse, and processing in sequence");
@@ -125,7 +125,9 @@ function validateOpenAIResponse(data) {

   // Validate field types
   if (typeof data.language !== "string" || !/^[a-z]{2}$/.test(data.language)) {
-    throw new Error("Invalid language format. Expected ISO 639-1 code (e.g., 'en')");
+    throw new Error(
+      "Invalid language format. Expected ISO 639-1 code (e.g., 'en')"
+    );
   }

   if (typeof data.messages_sent !== "number" || data.messages_sent < 0) {
@@ -133,7 +135,9 @@ function validateOpenAIResponse(data) {
   }

   if (!["positive", "neutral", "negative"].includes(data.sentiment)) {
-    throw new Error("Invalid sentiment. Expected 'positive', 'neutral', or 'negative'");
+    throw new Error(
+      "Invalid sentiment. Expected 'positive', 'neutral', or 'negative'"
+    );
   }

   if (typeof data.escalated !== "boolean") {
@@ -161,15 +165,23 @@ function validateOpenAIResponse(data) {
   ];

   if (!validCategories.includes(data.category)) {
-    throw new Error(`Invalid category. Expected one of: ${validCategories.join(", ")}`);
+    throw new Error(
+      `Invalid category. Expected one of: ${validCategories.join(", ")}`
+    );
   }

   if (!Array.isArray(data.questions)) {
     throw new Error("Invalid questions. Expected array of strings");
   }

-  if (typeof data.summary !== "string" || data.summary.length < 10 || data.summary.length > 300) {
-    throw new Error("Invalid summary. Expected string between 10-300 characters");
+  if (
+    typeof data.summary !== "string" ||
+    data.summary.length < 10 ||
+    data.summary.length > 300
+  ) {
+    throw new Error(
+      "Invalid summary. Expected string between 10-300 characters"
+    );
   }

   if (typeof data.session_id !== "string") {
@@ -210,7 +222,9 @@ async function processUnprocessedSessions() {
   for (const session of sessionsToProcess) {
     if (!session.transcriptContent) {
       // Should not happen due to query, but good for type safety
-      console.warn(`Session ${session.id} has no transcript content, skipping.`);
+      console.warn(
+        `Session ${session.id} has no transcript content, skipping.`
+      );
       continue;
     }

@ -101,7 +101,7 @@ async function processTranscriptWithOpenAI(
|
|||||||
throw new Error(`OpenAI API error: ${response.status} - ${errorText}`);
|
throw new Error(`OpenAI API error: ${response.status} - ${errorText}`);
|
||||||
}
|
}
|
||||||
|
|
||||||
const data = await response.json() as any;
|
const data = (await response.json()) as any;
|
||||||
const processedData = JSON.parse(data.choices[0].message.content);
|
const processedData = JSON.parse(data.choices[0].message.content);
|
||||||
|
|
||||||
// Validate the response against our expected schema
|
// Validate the response against our expected schema
|
||||||
@ -118,7 +118,9 @@ async function processTranscriptWithOpenAI(
|
|||||||
 * Validates the OpenAI response against our expected schema
 * @param data The data to validate
 */
-function validateOpenAIResponse(data: any): asserts data is OpenAIProcessedData {
+function validateOpenAIResponse(
+  data: any
+): asserts data is OpenAIProcessedData {
   // Check required fields
   const requiredFields = [
     "language",
@@ -140,7 +142,9 @@ function validateOpenAIResponse(data: any): asserts data is OpenAIProcessedData
   // Validate field types
   if (typeof data.language !== "string" || !/^[a-z]{2}$/.test(data.language)) {
-    throw new Error("Invalid language format. Expected ISO 639-1 code (e.g., 'en')");
+    throw new Error(
+      "Invalid language format. Expected ISO 639-1 code (e.g., 'en')"
+    );
   }
 
   if (typeof data.messages_sent !== "number" || data.messages_sent < 0) {
@@ -148,7 +152,9 @@ function validateOpenAIResponse(data: any): asserts data is OpenAIProcessedData
   }
 
   if (!["positive", "neutral", "negative"].includes(data.sentiment)) {
-    throw new Error("Invalid sentiment. Expected 'positive', 'neutral', or 'negative'");
+    throw new Error(
+      "Invalid sentiment. Expected 'positive', 'neutral', or 'negative'"
+    );
   }
 
   if (typeof data.escalated !== "boolean") {
@@ -176,15 +182,23 @@ function validateOpenAIResponse(data: any): asserts data is OpenAIProcessedData
   ];
 
   if (!validCategories.includes(data.category)) {
-    throw new Error(`Invalid category. Expected one of: ${validCategories.join(", ")}`);
+    throw new Error(
+      `Invalid category. Expected one of: ${validCategories.join(", ")}`
+    );
   }
 
   if (!Array.isArray(data.questions)) {
     throw new Error("Invalid questions. Expected array of strings");
   }
 
-  if (typeof data.summary !== "string" || data.summary.length < 10 || data.summary.length > 300) {
-    throw new Error("Invalid summary. Expected string between 10-300 characters");
+  if (
+    typeof data.summary !== "string" ||
+    data.summary.length < 10 ||
+    data.summary.length > 300
+  ) {
+    throw new Error(
+      "Invalid summary. Expected string between 10-300 characters"
+    );
   }
 
   if (typeof data.session_id !== "string") {
@@ -225,7 +239,9 @@ async function processUnprocessedSessions() {
   for (const session of sessionsToProcess) {
     if (!session.transcriptContent) {
       // Should not happen due to query, but good for type safety
-      console.warn(`Session ${session.id} has no transcript content, skipping.`);
+      console.warn(
+        `Session ${session.id} has no transcript content, skipping.`
+      );
       continue;
     }

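The hunks above only reflow long `throw` statements; the underlying technique is a TypeScript assertion function (`asserts data is T`) that narrows an untyped value after it passes validation. A minimal, self-contained sketch of that pattern — the `ProcessedData` interface and field checks here are illustrative stand-ins, not the project's full `OpenAIProcessedData` schema:

```typescript
// Illustrative subset of the validated schema (hypothetical, for the sketch).
interface ProcessedData {
  language: string;
  sentiment: "positive" | "neutral" | "negative";
}

// Assertion function: throws on invalid input, otherwise narrows the type.
function assertProcessedData(data: any): asserts data is ProcessedData {
  if (typeof data.language !== "string" || !/^[a-z]{2}$/.test(data.language)) {
    throw new Error("Invalid language format. Expected ISO 639-1 code (e.g., 'en')");
  }
  if (!["positive", "neutral", "negative"].includes(data.sentiment)) {
    throw new Error("Invalid sentiment. Expected 'positive', 'neutral', or 'negative'");
  }
}

const raw: unknown = JSON.parse('{"language":"en","sentiment":"positive"}');
assertProcessedData(raw);
// After the assertion call, TypeScript treats `raw` as ProcessedData.
console.log(raw.language);
```

Because the return type is `asserts data is ProcessedData`, callers get compile-time narrowing for free: any code after the call can use the validated fields without casts.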
server.js (22 changed lines)
@@ -1,12 +1,12 @@
 // Custom Next.js server with scheduler initialization
-const { createServer } = require('http');
-const { parse } = require('url');
-const next = require('next');
-const { startScheduler } = require('./lib/scheduler');
-const { startProcessingScheduler } = require('./lib/processingScheduler');
+const { createServer } = require("http");
+const { parse } = require("url");
+const next = require("next");
+const { startScheduler } = require("./lib/scheduler");
+const { startProcessingScheduler } = require("./lib/processingScheduler");
 
-const dev = process.env.NODE_ENV !== 'production';
-const hostname = 'localhost';
+const dev = process.env.NODE_ENV !== "production";
+const hostname = "localhost";
 const port = process.env.PORT || 3000;
 
 // Initialize Next.js
@@ -15,10 +15,10 @@ const handle = app.getRequestHandler();
 
 app.prepare().then(() => {
   // Initialize schedulers when the server starts
-  console.log('Starting schedulers...');
+  console.log("Starting schedulers...");
   startScheduler();
   startProcessingScheduler();
-  console.log('All schedulers initialized successfully');
+  console.log("All schedulers initialized successfully");
 
   createServer(async (req, res) => {
     try {
@@ -28,9 +28,9 @@ app.prepare().then(() => {
       // Let Next.js handle the request
       await handle(req, res, parsedUrl);
     } catch (err) {
-      console.error('Error occurred handling', req.url, err);
+      console.error("Error occurred handling", req.url, err);
       res.statusCode = 500;
-      res.end('Internal Server Error');
+      res.end("Internal Server Error");
     }
   }).listen(port, (err) => {
     if (err) throw err;

server.mjs (28 changed lines)
@@ -1,15 +1,15 @@
 // Custom Next.js server with scheduler initialization
-import { createServer } from 'http';
-import { parse } from 'url';
-import next from 'next';
+import { createServer } from "http";
+import { parse } from "url";
+import next from "next";
 
 // We'll need to dynamically import these after they're compiled
 let startScheduler;
 let startProcessingScheduler;
 
-const dev = process.env.NODE_ENV !== 'production';
-const hostname = 'localhost';
-const port = parseInt(process.env.PORT || '3000', 10);
+const dev = process.env.NODE_ENV !== "production";
+const hostname = "localhost";
+const port = parseInt(process.env.PORT || "3000", 10);
 
 // Initialize Next.js
 const app = next({ dev, hostname, port });
@@ -18,37 +18,37 @@ const handle = app.getRequestHandler();
 async function init() {
   try {
     // Dynamically import the schedulers
-    const scheduler = await import('./lib/scheduler.js');
-    const processingScheduler = await import('./lib/processingScheduler.js');
+    const scheduler = await import("./lib/scheduler.js");
+    const processingScheduler = await import("./lib/processingScheduler.js");
 
     startScheduler = scheduler.startScheduler;
     startProcessingScheduler = processingScheduler.startProcessingScheduler;
 
     app.prepare().then(() => {
       // Initialize schedulers when the server starts
-      console.log('Starting schedulers...');
+      console.log("Starting schedulers...");
       startScheduler();
       startProcessingScheduler();
-      console.log('All schedulers initialized successfully');
+      console.log("All schedulers initialized successfully");
 
       createServer(async (req, res) => {
         try {
           // Parse the URL
-          const parsedUrl = parse(req.url || '', true);
+          const parsedUrl = parse(req.url || "", true);
 
           // Let Next.js handle the request
           await handle(req, res, parsedUrl);
         } catch (err) {
-          console.error('Error occurred handling', req.url, err);
+          console.error("Error occurred handling", req.url, err);
           res.statusCode = 500;
-          res.end('Internal Server Error');
+          res.end("Internal Server Error");
         }
       }).listen(port, () => {
         console.log(`> Ready on http://${hostname}:${port}`);
       });
     });
   } catch (error) {
-    console.error('Failed to initialize server:', error);
+    console.error("Failed to initialize server:", error);
     process.exit(1);
   }
 }

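Beyond the quote-style changes, `server.mjs` wraps startup in an `init()` that dynamically imports the schedulers and exits the process if loading fails. A minimal runnable sketch of that load-then-wire-up shape, using the built-in `node:path` module as a stand-in for the project's compiled `./lib/scheduler.js`:

```javascript
// Sketch of the init() pattern from server.mjs: dynamically import a module,
// bind the export you need, and fail fast if initialization breaks.
// "node:path" is a placeholder for the real "./lib/scheduler.js" import.
async function init() {
  try {
    const scheduler = await import("node:path");
    // In server.mjs this would be: startScheduler = scheduler.startScheduler;
    const joinPaths = scheduler.join;
    console.log("Module loaded:", joinPaths("lib", "scheduler.js"));
  } catch (error) {
    console.error("Failed to initialize server:", error);
    process.exit(1);
  }
}

init();
```

Deferring the import this way matters when the target files are emitted by a build step: the server entry point can start before compilation finishes, and the failure path is an explicit `process.exit(1)` rather than an unhandled rejection.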
server.ts (26 changed lines)
@@ -1,13 +1,13 @@
 // Custom Next.js server with scheduler initialization
-import { createServer } from 'http';
-import { parse } from 'url';
-import next from 'next';
-import { startScheduler } from './lib/scheduler.js';
-import { startProcessingScheduler } from './lib/processingScheduler.js';
+import { createServer } from "http";
+import { parse } from "url";
+import next from "next";
+import { startScheduler } from "./lib/scheduler.js";
+import { startProcessingScheduler } from "./lib/processingScheduler.js";
 
-const dev = process.env.NODE_ENV !== 'production';
-const hostname = 'localhost';
-const port = parseInt(process.env.PORT || '3000', 10);
+const dev = process.env.NODE_ENV !== "production";
+const hostname = "localhost";
+const port = parseInt(process.env.PORT || "3000", 10);
 
 // Initialize Next.js
 const app = next({ dev, hostname, port });
@@ -15,22 +15,22 @@ const handle = app.getRequestHandler();
 
 app.prepare().then(() => {
   // Initialize schedulers when the server starts
-  console.log('Starting schedulers...');
+  console.log("Starting schedulers...");
   startScheduler();
   startProcessingScheduler();
-  console.log('All schedulers initialized successfully');
+  console.log("All schedulers initialized successfully");
 
   createServer(async (req, res) => {
     try {
       // Parse the URL
-      const parsedUrl = parse(req.url || '', true);
+      const parsedUrl = parse(req.url || "", true);
 
       // Let Next.js handle the request
       await handle(req, res, parsedUrl);
     } catch (err) {
-      console.error('Error occurred handling', req.url, err);
+      console.error("Error occurred handling", req.url, err);
       res.statusCode = 500;
-      res.end('Internal Server Error');
+      res.end("Internal Server Error");
     }
   }).listen(port, () => {
     console.log(`> Ready on http://${hostname}:${port}`);