Compare commits
32 Commits (f68e57ce3b ... main)
| SHA1 |
|---|
| 16eb688607 |
| 2132c130a3 |
| dffbc31432 |
| 151ecaa98f |
| b917aa5077 |
| 24f5f91704 |
| 00c9584d03 |
| c94a3b41c9 |
| e25cdc4441 |
| 1f95d57717 |
| d2fb9b8071 |
| 08abd96751 |
| eb570679ba |
| 8713ed9686 |
| 595076033b |
| 0c874575d4 |
| 71c293320b |
| 46805fb196 |
| 51f2679732 |
| 6b0f936f40 |
| 0eb2a457f7 |
| 0cf0bad6b1 |
| c7d9acead0 |
| 193127b86a |
| bf2beda390 |
| 942da18d85 |
| a3fa8f9b91 |
| 0e3b3933f0 |
| 445c07a8d8 |
| 3f8e62fbb8 |
| 248106a239 |
| 587d4d66f8 |
.claude/commands/context.md (new file, 3 lines)
@@ -0,0 +1,3 @@
Read the project context file at `.claude/project-context.md` to quickly understand the DiunaBI project structure, architecture, key components, and recent development focus. This will bootstrap your knowledge without needing to explore the entire codebase.

After reading the context file, briefly acknowledge what you've learned and ask the user what they need help with.
.claude/commands/updateContext.md (new file, 27 lines)
@@ -0,0 +1,27 @@
Update the `.claude/project-context.md` file by ONLY appending changes made during THIS session to the "RECENT CHANGES (This Session)" section at the top of the file.

**DO NOT re-scan or re-explore the entire codebase** - this wastes tokens and time.

**What to do:**

1. Review the conversation history to identify what was changed/added/fixed in THIS session
2. Read the current `.claude/project-context.md` file
3. Update ONLY the "RECENT CHANGES (This Session)" section at the top with:
   - Date of changes (today's date)
   - Brief bullet points describing what was modified
   - Files that were changed, with brief descriptions
   - Any new functionality added
   - Bug fixes completed
4. Leave the rest of the file unchanged

**Format for session changes:**

```markdown
## RECENT CHANGES (This Session)

**[Feature/Fix Name] ([Date]):**
- ✅ Brief description of change 1
- ✅ Brief description of change 2
- Files modified: [file1.cs](path/to/file1.cs), [file2.cs](path/to/file2.cs)
```

When done, provide a brief summary of what session changes were documented.
.claude/project-context.md (new file, 819 lines)
@@ -0,0 +1,819 @@
# DiunaBI Project Context

> This file is auto-generated for Claude Code to quickly understand the project structure.
> Last updated: 2025-12-08

## RECENT CHANGES (This Session)

**SignalR Real-Time Updates & UI Consistency (Dec 8, 2025):**
- ✅ **Removed Manual Refresh Button** - Removed refresh button from Jobs/Index.razor (SignalR auto-refresh eliminates the need for it)
- ✅ **SignalR on Layers List** - Added real-time updates to Layers/Index with EntityChangeHubService subscription
- ✅ **SignalR on DataInbox List** - Added real-time updates to DataInbox/Index with EntityChangeHubService subscription
- ✅ **SignalR on Layer Details** - Added real-time updates to Layers/Details for both layer and record changes
- ✅ **Consistent UI Behavior** - All lists now have uniform SignalR-based real-time updates
- ✅ **Proper Cleanup** - Implemented the IDisposable pattern to unsubscribe from SignalR events on all pages
- ✅ **Jobs Sorting Fix** - Changed sorting from Priority → JobType → CreatedAt DESC to CreatedAt DESC → Priority ASC (newest jobs first, then by priority)
- ✅ **Faster Job Processing** - Reduced JobWorkerService poll interval from 10 seconds to 5 seconds
- Files modified:
  - [Jobs/Index.razor](DiunaBI.UI.Shared/Pages/Jobs/Index.razor) - removed refresh button
  - [Layers/Index.razor](DiunaBI.UI.Shared/Pages/Layers/Index.razor), [Layers/Index.razor.cs](DiunaBI.UI.Shared/Pages/Layers/Index.razor.cs) - added SignalR + IDisposable
  - [DataInbox/Index.razor](DiunaBI.UI.Shared/Pages/DataInbox/Index.razor), [DataInbox/Index.razor.cs](DiunaBI.UI.Shared/Pages/DataInbox/Index.razor.cs) - added SignalR + IDisposable
  - [Layers/Details.razor](DiunaBI.UI.Shared/Pages/Layers/Details.razor), [Layers/Details.razor.cs](DiunaBI.UI.Shared/Pages/Layers/Details.razor.cs) - added SignalR + IDisposable
  - [JobsController.cs](DiunaBI.API/Controllers/JobsController.cs) - fixed sorting logic
  - [JobWorkerService.cs](DiunaBI.Infrastructure/Services/JobWorkerService.cs) - reduced poll interval to 5 seconds
- Status: All lists have consistent real-time behavior, no manual refresh needed, jobs sorted by date first
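The new ordering described above (newest first, ties broken by ascending priority, where 0 is the highest priority) can be sketched language-agnostically. This Python snippet is illustrative only, not the actual JobsController code:

```python
from datetime import datetime

# Illustrative (created_at, priority) pairs; priority 0 = highest.
jobs = [
    (datetime(2025, 12, 8, 10, 0), 2),
    (datetime(2025, 12, 8, 12, 0), 5),
    (datetime(2025, 12, 8, 12, 0), 1),
]

# CreatedAt DESC first, then Priority ASC within the same timestamp.
ordered = sorted(jobs, key=lambda j: (-j[0].timestamp(), j[1]))
```

The two 12:00 jobs sort ahead of the 10:00 job, and the priority-1 job sorts ahead of the priority-5 job at the same timestamp.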

---

**Job Scheduler Race Condition Fix (Dec 8, 2025):**
- ✅ **In-Memory Deduplication** - Added `HashSet<Guid>` to track LayerIds scheduled within the same batch
- ✅ **Prevents Duplicate Jobs** - Fixed race condition where the same layer could be scheduled multiple times during a single "Run All Jobs" operation
- ✅ **Two-Level Protection** - In-memory check (HashSet) runs before the database check for O(1) performance
- ✅ **Applied to Both Methods** - Fixed both ScheduleImportJobsAsync and ScheduleProcessJobsAsync
- ✅ **Better Logging** - Added debug log message "Job already scheduled in this batch" for transparency
- Root cause: When multiple layers had the same ID in query results, or import plugins created new layers during the scheduling loop, the database check couldn't detect duplicates added in the same batch before SaveChangesAsync()
- Solution: Track scheduled LayerIds in a HashSet during loop iteration to prevent within-batch duplicates
- Files modified: [JobSchedulerService.cs](DiunaBI.Infrastructure/Services/JobSchedulerService.cs)
- Status: Race condition resolved, duplicate job creation prevented
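The two-level guard can be sketched as follows. This is a minimal Python illustration of the scheme, not the C# implementation; the names are invented:

```python
def schedule_jobs(layer_ids, already_in_db):
    """Create jobs for layer_ids, skipping duplicates at two levels."""
    scheduled_this_batch = set()  # in-memory HashSet equivalent, O(1) lookups
    created = []
    for layer_id in layer_ids:
        if layer_id in scheduled_this_batch:
            continue  # duplicate within this batch (the race being fixed)
        if layer_id in already_in_db:
            continue  # duplicate already persisted to the database
        scheduled_this_batch.add(layer_id)
        created.append(layer_id)
    return created
```

The in-memory set catches duplicates that appear before the batch is saved, which the database check alone cannot see.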

---

**Blazor Server Reconnection UI Customization (Dec 8, 2025):**
- ✅ **Custom Reconnection Modal** - Replaced default Blazor "Rejoin failed..." dialog with a custom-styled modal
- ✅ **Theme-Matched Styling** - Changed loader and button colors from blue to the app's primary red (#e7163d), matching the navbar
- ✅ **Timer with Elapsed Seconds** - Added real-time timer showing elapsed reconnection time (0s, 1s, 2s...)
- ✅ **CSS Classes Integration** - Used Blazor's built-in `.components-reconnect-show/failed/rejected` classes for state management
- ✅ **MutationObserver Timer** - JavaScript watches for CSS class changes to start/stop the elapsed-time counter
- ✅ **Professional Design** - Modal backdrop blur, spinner animation, red reload button with hover effects
- Files modified: [App.razor](DiunaBI.UI.Web/Components/App.razor), [app.css](DiunaBI.UI.Web/wwwroot/app.css)
- Files created: [reconnect.js](DiunaBI.UI.Web/wwwroot/js/reconnect.js)
- Status: Blazor reconnection UI now matches the app theme with a timer indicator

**Jobs List Sorting and Multi-Select Filtering (Dec 8, 2025):**
- ✅ **Fixed Job Sorting** - Changed from single CreatedAt DESC to Priority ASC → JobType → CreatedAt DESC
- ✅ **Multi-Select Status Filter** - Replaced the single status dropdown with a multi-select supporting multiple JobStatus values
- ✅ **Auto-Refresh on Filter Change** - Filters now automatically trigger data reload without requiring a manual button click
- ✅ **API Updates** - JobsController GetAll endpoint accepts `List<JobStatus>? statuses` instead of a single status
- ✅ **JobService Updates** - Sends status values as integers in the query string for multi-select support
- Files modified: [JobsController.cs](DiunaBI.API/Controllers/JobsController.cs), [JobService.cs](DiunaBI.UI.Shared/Services/JobService.cs), [Index.razor](DiunaBI.UI.Shared/Pages/Jobs/Index.razor), [Index.razor.cs](DiunaBI.UI.Shared/Pages/Jobs/Index.razor.cs)
- Status: Jobs list now sortable by priority/type/date with working multi-select filters

**User Timezone Support (Dec 8, 2025):**
- ✅ **DateTimeHelper Service** - Created JS Interop service to detect the user's browser timezone
- ✅ **UTC to Local Conversion** - All date displays now show the user's local timezone instead of UTC
- ✅ **Database Consistency** - Database continues to store UTC (correct); conversion is for display only
- ✅ **Updated Pages** - Applied timezone conversion to all date fields in:
  - Jobs Index and Details pages
  - Layers Details page (CreatedAt, ModifiedAt, record history)
  - DataInbox Index page
- ✅ **Service Registration** - Registered DateTimeHelper as a scoped service in the DI container
- Files created: [DateTimeHelper.cs](DiunaBI.UI.Shared/Services/DateTimeHelper.cs)
- Files modified: [ServiceCollectionExtensions.cs](DiunaBI.UI.Shared/Extensions/ServiceCollectionExtensions.cs), [Jobs/Index.razor.cs](DiunaBI.UI.Shared/Pages/Jobs/Index.razor.cs), [Jobs/Details.razor](DiunaBI.UI.Shared/Pages/Jobs/Details.razor), [Layers/Details.razor](DiunaBI.UI.Shared/Pages/Layers/Details.razor), [Layers/Details.razor.cs](DiunaBI.UI.Shared/Pages/Layers/Details.razor.cs), [DataInbox/Index.razor.cs](DiunaBI.UI.Shared/Pages/DataInbox/Index.razor.cs)
- Status: All dates display in the user's local timezone with format "yyyy-MM-dd HH:mm:ss"
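The display-side conversion can be sketched as below. This is a Python stand-in for the C# service; the browser-supplied offset parameter is hypothetical (in the real app the timezone comes from JS interop):

```python
from datetime import datetime, timedelta, timezone

def to_local_display(utc_dt, offset_minutes):
    """Convert a stored UTC timestamp to the user's local time for display.
    Storage stays UTC; only the rendered string changes."""
    local = utc_dt.replace(tzinfo=timezone.utc).astimezone(
        timezone(timedelta(minutes=offset_minutes)))
    return local.strftime("%Y-%m-%d %H:%M:%S")
```

For a user at UTC+1, a stored value of 12:00 UTC renders as 13:00 local time.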

**QueueJob Model Cleanup and AutoImport User (Dec 8, 2025):**
- ✅ **Removed Duplicate Fields** - Removed CreatedAtUtc and ModifiedAtUtc from QueueJob (they duplicated CreatedAt/ModifiedAt)
- ✅ **Added ModifiedAt Field** - Was missing; now tracks the job modification timestamp
- ✅ **AutoImport User ID** - Created User.AutoImportUserId constant: `f392209e-123e-4651-a5a4-0b1d6cf9ff9d`
- ✅ **System Operations** - All system-created/modified jobs now use AutoImportUserId for CreatedById and ModifiedById
- ✅ **Database Migration** - Created migration: RemoveQueueJobDuplicateUTCFields
- Files modified: [QueueJob.cs](DiunaBI.Domain/Entities/QueueJob.cs), [User.cs](DiunaBI.Domain/Entities/User.cs), [JobWorkerService.cs](DiunaBI.Infrastructure/Services/JobWorkerService.cs), [JobSchedulerService.cs](DiunaBI.Infrastructure/Services/JobSchedulerService.cs), [AppDbContext.cs](DiunaBI.Infrastructure/Data/AppDbContext.cs), [JobsController.cs](DiunaBI.API/Controllers/JobsController.cs)
- Files created: [20251208205202_RemoveQueueJobDuplicateUTCFields.cs](DiunaBI.Infrastructure/Migrations/20251208205202_RemoveQueueJobDuplicateUTCFields.cs)
- Status: QueueJob model cleaned up, all automated operations tracked with the AutoImport user ID

**Job Scheduling UI with JWT Authorization (Dec 8, 2025):**
- ✅ **New JWT Endpoints** - Created UI-specific endpoints at `/jobs/ui/schedule/*` with JWT authorization (parallel to the API key endpoints)
- ✅ **Three Scheduling Options** - MudMenu dropdown in Jobs Index with:
  - Run All Jobs - schedules all import and process jobs
  - Run All Imports - schedules import jobs only
  - Run All Processes - schedules process jobs only
- ✅ **JobService Methods** - Added three scheduling methods returning (success, jobsCreated, message) tuples
- ✅ **Auto-Refresh** - Jobs list automatically reloads after scheduling, with success/failure notifications
- ✅ **Dual Authorization** - Existing `/jobs/schedule` endpoints for automation (API key), new `/jobs/ui/schedule` endpoints for UI users (JWT)
- Files modified: [JobsController.cs](DiunaBI.API/Controllers/JobsController.cs), [JobService.cs](DiunaBI.UI.Shared/Services/JobService.cs), [Index.razor](DiunaBI.UI.Shared/Pages/Jobs/Index.razor), [Index.razor.cs](DiunaBI.UI.Shared/Pages/Jobs/Index.razor.cs)
- Status: UI users can now schedule jobs directly from the Jobs page using JWT authentication

---

**API Key Authorization Fix for Cron Jobs (Dec 6, 2025):**
- ✅ **Fixed 401 Unauthorized on API Key Endpoints** - Cron jobs calling `/jobs/schedule` endpoints were getting rejected despite valid API keys
- ✅ **Added [AllowAnonymous] Attribute** - Bypasses the controller-level `[Authorize]` to let the `[ApiKeyAuth]` filter handle authorization
- ✅ **Three Endpoints Fixed** - Applied the fix to all job scheduling endpoints:
  - `POST /jobs/schedule` - Schedule all jobs (imports + processes)
  - `POST /jobs/schedule/imports` - Schedule import jobs only
  - `POST /jobs/schedule/processes` - Schedule process jobs only
- Root cause: The controller-level `[Authorize]` attribute required JWT Bearer auth for all endpoints, blocking API key authentication
- Solution: Add `[AllowAnonymous]` so the `[ApiKeyAuth]` filter can validate the X-API-Key header
- Files modified: [JobsController.cs](DiunaBI.API/Controllers/JobsController.cs)
- Status: Cron jobs can now authenticate with an API key via the X-API-Key header

**SignalR Authentication Token Flow Fix (Dec 6, 2025):**
- ✅ **TokenProvider Population** - Fixed `TokenProvider.Token` never being set with the JWT, which caused 401 Unauthorized on SignalR connections
- ✅ **AuthService Token Management** - Injected `TokenProvider` into `AuthService` and set the token in 3 key places:
  - `ValidateWithBackendAsync()` - on fresh Google login
  - `CheckAuthenticationAsync()` - on session restore from localStorage
  - `ClearAuthenticationAsync()` - clear the token on logout
- ✅ **SignalR Initialization Timing** - Moved SignalR initialization from `MainLayout.OnInitializedAsync` to after authentication completes
- ✅ **Event-Driven Architecture** - `MainLayout` now subscribes to the `AuthenticationStateChanged` event to initialize SignalR when the user authenticates
- ✅ **Session Restore Support** - `CheckAuthenticationAsync()` now fires the `AuthenticationStateChanged` event to initialize SignalR on page refresh
- Root cause: SignalR was initialized before authentication, so the JWT token was empty during connection setup
- Solution: Initialize SignalR only after the token is available, via event subscription
- Files modified: [AuthService.cs](DiunaBI.UI.Shared/Services/AuthService.cs), [MainLayout.razor](DiunaBI.UI.Shared/Components/Layout/MainLayout.razor)
- Status: SignalR authentication works for both fresh logins and restored sessions

**SignalR Authentication DI Fix (Dec 6, 2025):**
- ✅ **TokenProvider Registration** - Added missing `TokenProvider` service registration in the DI container
- ✅ **EntityChangeHubService Scope Fix** - Changed from singleton to scoped to support user-specific JWT tokens
- ✅ **Bug Fix** - Resolved the `InvalidOperationException` that prevented the app from starting after SignalR authentication was added
- Root cause: A singleton service (`EntityChangeHubService`) cannot depend on a scoped service (`TokenProvider`) in DI
- Solution: Made `EntityChangeHubService` scoped so each user session has its own authenticated SignalR connection
- Files modified: [ServiceCollectionExtensions.cs](DiunaBI.UI.Shared/Extensions/ServiceCollectionExtensions.cs)

---

**Security Audit & Hardening (Dec 5, 2025):**
- ✅ **JWT Token Validation** - Enabled issuer/audience validation in [Program.cs](DiunaBI.API/Program.cs), fixed config key mismatch in [JwtTokenService.cs](DiunaBI.API/Services/JwtTokenService.cs)
- ✅ **API Key Security** - Created [ApiKeyAuthAttribute.cs](DiunaBI.API/Attributes/ApiKeyAuthAttribute.cs) with X-API-Key header auth and constant-time comparison
- ✅ **Job Endpoints** - Migrated 3 job scheduling endpoints in [JobsController.cs](DiunaBI.API/Controllers/JobsController.cs) from URL-based to header-based API keys
- ✅ **Stack Trace Exposure** - Fixed 20 instances across 3 controllers ([JobsController.cs](DiunaBI.API/Controllers/JobsController.cs), [LayersController.cs](DiunaBI.API/Controllers/LayersController.cs), [DataInboxController.cs](DiunaBI.API/Controllers/DataInboxController.cs)) - now returns generic error messages
- ✅ **SignalR Authentication** - Added [Authorize] to [EntityChangeHub.cs](DiunaBI.API/Hubs/EntityChangeHub.cs), configured JWT token in [EntityChangeHubService.cs](DiunaBI.UI.Shared/Services/EntityChangeHubService.cs)
- ✅ **Rate Limiting** - Implemented ASP.NET Core rate limiting: 100 req/min general, 10 req/min auth in [Program.cs](DiunaBI.API/Program.cs)
- ✅ **Security Headers** - Added XSS, clickjacking, and MIME-sniffing protection middleware in [Program.cs](DiunaBI.API/Program.cs)
- ✅ **Input Validation** - Added pagination limits (1-1000) to GetAll endpoints in 3 controllers
- ✅ **User Enumeration** - Switched to a generic auth error in [GoogleAuthService.cs](DiunaBI.API/Services/GoogleAuthService.cs)
- ✅ **Sensitive Data Logging** - Made conditional on the development environment only in [Program.cs](DiunaBI.API/Program.cs)
- ✅ **Base64 Size Limit** - Added a 10MB limit to DataInbox in [DataInboxController.cs](DiunaBI.API/Controllers/DataInboxController.cs)
- Files modified: 12 files (API: Program.cs, 4 controllers, 3 services, 1 hub, 1 new attribute; UI: EntityChangeHubService.cs, ServiceCollectionExtensions.cs)
- Security status: 5/5 CRITICAL fixed, 3/3 HIGH fixed, 4/4 MEDIUM fixed
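The constant-time comparison mentioned under "API Key Security" can be illustrated in any language; a minimal Python sketch (not the C# attribute itself) is:

```python
import hmac

def api_key_valid(provided: str, expected: str) -> bool:
    # compare_digest runs in time independent of where the strings first
    # differ, closing the timing side channel that a naive == leaks.
    return hmac.compare_digest(provided.encode(), expected.encode())
```

A plain `==` comparison returns earlier the sooner the strings diverge, which an attacker can measure to recover the key byte by byte.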

**Seq Removal - Logging Cleanup (Dec 5, 2025):**
- ✅ Removed Seq logging sink to eliminate commercial licensing concerns
- ✅ Removed `Serilog.Sinks.Seq` NuGet package from DiunaBI.API.csproj
- ✅ Removed Seq sink configuration from appsettings.Development.json
- ✅ Kept Serilog (free, open-source) with Console + File sinks for production-ready logging
- ✅ Build verified - no errors after Seq removal
- Files modified: [DiunaBI.API.csproj](DiunaBI.API/DiunaBI.API.csproj), [appsettings.Development.json](DiunaBI.API/appsettings.Development.json)
- Manual step required: Remove the `seq` service from docker-compose.yml and add Docker log rotation config

**UI Reorganization (Dec 5, 2025):**
- ✅ Moved pages to feature-based folders: `Pages/Layers/`, `Pages/Jobs/`, `Pages/DataInbox/`
- ✅ Organized components: `Components/Layout/` (MainLayout, EmptyLayout, Routes), `Components/Auth/` (AuthGuard, LoginCard)
- ✅ Removed obsolete wrapper files (LayerListPage, JobListPage, DataInboxListPage, etc.)
- ✅ Removed duplicate component files (LayerListComponent, JobListComponent, DataInboxListComponent)
- ✅ Standardized code-behind: `.razor.cs` for complex logic, inline `@code` for simple pages
- ✅ Updated `_Imports.razor` with new namespaces: `DiunaBI.UI.Shared.Components.Layout`, `DiunaBI.UI.Shared.Components.Auth`
- ✅ All routes unchanged - backward compatible

---

## PROJECT TYPE & TECH STACK

**Application Type:** Full-stack Business Intelligence (BI) platform with multi-tier architecture, real-time capabilities, and a plugin system

**Core Stack:**
- Backend: ASP.NET Core 10.0 Web API
- Frontend: Blazor Server + MAUI Mobile
- Database: SQL Server + EF Core 10.0
- UI: MudBlazor 8.0
- Real-time: SignalR (EntityChangeHub)
- Google: Sheets API, Drive API, OAuth
- Logging: Serilog (Console, File)
- Auth: JWT Bearer + Google OAuth

---

## SOLUTION STRUCTURE (10 Projects)

```
DiunaBI.API (Web API)
├── Controllers: Auth, Layers, Jobs, DataInbox
├── Hubs: EntityChangeHub (SignalR real-time updates)
└── Services: GoogleAuth, JwtToken

DiunaBI.Domain (Entities)
└── User, Layer, Record, RecordHistory, QueueJob, DataInbox, ProcessSource

DiunaBI.Application (DTOs)
└── LayerDto, RecordDto, UserDto, RecordHistoryDto, PagedResult, JobDto

DiunaBI.Infrastructure (Data + Services)
├── Data: AppDbContext, Migrations (47 total)
├── Interceptors: EntityChangeInterceptor (auto-broadcasts DB changes)
├── Services: PluginManager, JobScheduler, JobWorker, GoogleSheets/Drive
├── Plugins: BaseDataImporter, BaseDataProcessor, BaseDataExporter
└── Interfaces: IPlugin, IDataProcessor, IDataImporter, IDataExporter

DiunaBI.UI.Web (Blazor Server)
└── Server-side Blazor web application

DiunaBI.UI.Mobile (MAUI)
└── iOS, Android, Windows, macOS support

DiunaBI.UI.Shared (Blazor Component Library - Reorganized)
├── Pages/
│   ├── Layers/ (Index.razor, Details.razor)
│   ├── Jobs/ (Index.razor, Details.razor)
│   ├── DataInbox/ (Index.razor, Details.razor)
│   └── Dashboard.razor, Login.razor, Index.razor
├── Components/
│   ├── Layout/ (MainLayout, EmptyLayout, Routes)
│   └── Auth/ (AuthGuard, LoginCard)
└── Services/
    ├── LayerService, JobService, DataInboxService
    ├── EntityChangeHubService (SignalR client)
    ├── FilterStateServices (remember filters)
    └── AuthService, TokenProvider

DiunaBI.Plugins.Morska (Feature Plugin)
├── Importers: Standard, D1, D3, FK2 (4 total)
├── Processors: D6, T1, T3, T4, T5 variants (12 total)
└── Exporters: Google Sheets export (1)

DiunaBI.Plugins.PedrolloPL (Feature Plugin - NEW)
└── Importers: B3 (1 total)

DiunaBI.Tests (Testing)
└── Unit and integration tests
```

---
## CORE FUNCTIONALITY

**Purpose:** BI platform for data import, processing, and transformation via a modular plugin architecture. Multi-layer workflows with audit trails, real-time notifications, and scheduled job processing.

**Main Features:**
1. **Layer Management** - 4 types (Import/Processed/Admin/Dictionary), parent-child relationships, soft deletes
2. **Data Records** - 32 numeric columns (Value1-32) + description, hierarchical, full audit trail
3. **Plugin Architecture** - Dynamic assembly loading, base classes in Infrastructure, 3 types (Importers/Processors/Exporters)
4. **Job Queue System** - Background worker with retry logic (30s → 2m → 5m), priority-based, auto-scheduling
5. **External Data** - DataInbox API, Google Sheets read/write, Google Drive integration
6. **Real-time Updates** - SignalR broadcasts entity changes (create/update/delete) to all connected clients
7. **Audit Trail** - RecordHistory tracks all record changes with field-level diffs and JSON summaries
8. **Filter Persistence** - UI filter states saved across sessions (LayerFilterStateService, DataInboxFilterStateService)

---
## KEY ENTITIES

**Layer**
- Id, Number, Name, Type (Import/Processed/Administration/Dictionary)
- CreatedAt/ModifiedAt, CreatedBy/ModifiedBy (with user relations)
- IsDeleted (soft delete), IsCancelled (processing control), ParentId
- Relations: Records (1-to-many), ProcessSources (1-to-many)

**Record**
- Id, Code (unique identifier), LayerId
- Value1-Value32 (double?), Desc1 (string, max 10000 chars)
- CreatedAt/ModifiedAt, CreatedBy/ModifiedBy, IsDeleted
- Audit: Full history tracked in RecordHistory table

**RecordHistory** (NEW - Migration 47)
- RecordId, LayerId, ChangedAt, ChangedById
- ChangeType (Created/Updated/Deleted)
- Code, Desc1 (snapshot at time of change)
- ChangedFields (comma-separated field names)
- ChangesSummary (JSON with old/new values)
- Indexes: (RecordId, ChangedAt), (LayerId, ChangedAt) for performance
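The ChangedFields/ChangesSummary pair can be sketched as a field-level diff between two record snapshots. This Python function is illustrative only; the real diffing lives in the C# change-tracking code:

```python
import json

def diff_record(old: dict, new: dict):
    """Return (ChangedFields, ChangesSummary) for two record snapshots:
    a comma-separated field list and a JSON map of old/new values."""
    changed = [field for field in new if old.get(field) != new.get(field)]
    summary = {f: {"old": old.get(f), "new": new[f]} for f in changed}
    return ",".join(changed), json.dumps(summary)
```

For example, changing only Value1 yields `ChangedFields == "Value1"` and a one-entry JSON summary with its old and new values.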

**QueueJob**
- LayerId, LayerName, PluginName
- JobType (Import/Process)
- Priority (0 = highest), Status (Pending/Running/Completed/Failed/Retrying)
- RetryCount, MaxRetries (default 5)
- CreatedAt, LastAttemptAt, CompletedAt
- LastError (detailed error message)

**DataInbox**
- Id, Name, Source (identifiers)
- Data (base64-encoded JSON array)
- CreatedAt
- Used by importers to stage incoming data
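The base64-encoded JSON staging described for the Data column works like this round trip (a Python sketch with invented sample records, not the actual importer code):

```python
import base64
import json

# Staging side: a JSON array of records is base64-encoded into the Data column.
payload = [{"code": "A1", "value": 42}]          # illustrative records
encoded = base64.b64encode(json.dumps(payload).encode()).decode()

# Import side: an importer decodes Data back into records.
decoded = json.loads(base64.b64decode(encoded))
```

Base64 keeps arbitrary JSON safe to store and transport as plain text; the 10MB limit noted in the security audit bounds the decoded size.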

**User**
- Id (Guid), Email, UserName
- CreatedAt, LastLoginAt
- Google OAuth identity

**ProcessSource**
- Id, SourceLayerId, TargetLayerId
- Defines layer processing relationships

---
## API ENDPOINTS

**Base:** `/` (ApiController routes)

### AuthController (/auth)
- `POST /auth/apiToken` - Exchange Google ID token for JWT (AllowAnonymous)
- `POST /auth/refresh` - Refresh expired JWT token

### LayersController (/layers)
- `GET /layers?page=1&pageSize=10&search=&type=` - List layers (paged, filterable)
- `GET /layers/{id}` - Get layer details with records
- `POST /layers` - Create new layer
- `PUT /layers/{id}` - Update layer
- `DELETE /layers/{id}` - Soft delete layer
- `POST /layers/{id}/records` - Add/update records
- `PUT /layers/{layerId}/records/{recordId}` - Update specific record
- `DELETE /layers/{layerId}/records/{recordId}` - Delete record
- `GET /layers/{layerId}/records/{recordId}/history` - Get record history
- `GET /layers/{layerId}/deleted-records` - Get deleted records with history

### JobsController (/jobs) - NEW
- `GET /jobs?page=1&pageSize=50&status=&jobType=` - List jobs (paged, filterable)
- `GET /jobs/{id}` - Get job details
- `GET /jobs/stats` - Get job statistics (counts by status)
- `POST /jobs/schedule` - Schedule all jobs from layer configs (X-API-Key header)
- `POST /jobs/schedule/imports` - Schedule import jobs only (X-API-Key header)
- `POST /jobs/schedule/processes` - Schedule process jobs only (X-API-Key header)
- `POST /jobs/create-for-layer/{layerId}` - Create job for specific layer (manual trigger)
- `POST /jobs/{id}/retry` - Retry failed job (resets to Pending)
- `DELETE /jobs/{id}` - Cancel pending/retrying job

### DataInboxController (/datainbox)
- `GET /datainbox?page=1&pageSize=10&search=` - List inbox items (paged, filterable)
- `GET /datainbox/{id}` - Get inbox item with decoded data
- `POST /datainbox` - Create inbox item
- `PUT /datainbox/Add/{apiKey}` - Add data (API key + Basic Auth)
- `DELETE /datainbox/{id}` - Delete inbox item

### SignalR Hub
- `/hubs/entitychanges` - SignalR hub for real-time entity change notifications
- Event: `EntityChanged(module, id, operation)` - broadcasts to all clients
- Modules: QueueJobs, Layers, Records, RecordHistory

---
## AUTHENTICATION & SECURITY

**Flow:**
1. Client exchanges Google ID token → `/auth/apiToken`
2. GoogleAuthService validates the token with Google and maps it to an internal User
3. Returns JWT (7-day expiration, HS256 signing)
4. JWT required on all protected endpoints (except /auth/apiToken, /health)
5. UserId extraction middleware sets X-UserId header for audit trails
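Step 3's HS256-signed JWT can be sketched with nothing but an HMAC over the encoded header and payload. This is a minimal Python illustration of the token format, not the server's JwtTokenService (real code would use a JWT library and also validate expiry, issuer, and audience):

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    # JWT uses unpadded URL-safe base64 for all three segments.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    sig = hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest()
    return f"{header}.{body}.{_b64url(sig)}"

def verify_jwt(token: str, secret: bytes) -> bool:
    header, body, sig = token.split(".")
    expected = hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest()
    return hmac.compare_digest(_b64url(expected), sig)
```

Only a holder of the server secret can produce a signature that verifies, which is what lets step 4 trust the token without a database lookup.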

**Security:**
- Google OAuth 2.0 for identity verification
- JWT Bearer tokens for API access
- API key + Basic Auth for DataInbox external endpoints
- CORS configured for:
  - http://localhost:4200
  - https://diuna.bim-it.pl
  - https://morska.diunabi.com

---

## KEY SERVICES

### Infrastructure Services

**PluginManager**
- Location: `DiunaBI.Infrastructure/Services/PluginManager.cs`
- Loads plugin assemblies from the `bin/Plugins/` directory at startup
- Registers IDataProcessor, IDataImporter, IDataExporter implementations
- Provides plugin discovery and execution

**JobSchedulerService**
- Location: `DiunaBI.Infrastructure/Services/JobSchedulerService.cs`
- Creates QueueJob entries from Administration layer configs
- Reads layer.Records with Code="Plugin", Code="Priority", Code="MaxRetries"
- Methods: ScheduleImportJobsAsync, ScheduleProcessJobsAsync, ScheduleAllJobsAsync

**JobWorkerService** (BackgroundService)
- Location: `DiunaBI.Infrastructure/Services/JobWorkerService.cs`
- Polls the QueueJobs table every 5 seconds (reduced from 10 seconds on Dec 8, 2025)
- Executes jobs via PluginManager (Import/Process)
- Retry logic with exponential backoff: 30s → 2m → 5m delays
- Rate limiting: 5-second delay after imports (Google Sheets API quota)
- Updates job status in real-time (triggers SignalR broadcasts)
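The 30s → 2m → 5m backoff ladder can be expressed as a simple lookup. This Python sketch is illustrative; capping attempts beyond the third at 5 minutes is an assumption of the sketch, not something the notes above state:

```python
def retry_delay_seconds(retry_count: int) -> int:
    """Delay before the next attempt: 30s, then 2m, then 5m.
    Later attempts reuse the last step (an assumption of this sketch)."""
    schedule = [30, 120, 300]
    return schedule[min(retry_count, len(schedule) - 1)]
```

With MaxRetries defaulting to 5, a persistently failing job would wait 30s, 2m, 5m, 5m, 5m under this scheme before being marked Failed.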

**EntityChangeInterceptor**
- Location: `DiunaBI.Infrastructure/Interceptors/EntityChangeInterceptor.cs`
- EF Core SaveChangesInterceptor
- Captures entity changes: Added, Modified, Deleted
- Broadcasts changes via SignalR EntityChangeHub after successful save
- Uses reflection to avoid circular dependencies with IHubContext

**GoogleSheetsHelper**
- Location: `DiunaBI.Infrastructure/Helpers/GoogleSheetsHelper.cs`
- Google Sheets API v4 integration
- Methods: ReadRange, WriteRange, CreateSpreadsheet, UpdateSpreadsheet

**GoogleDriveHelper**
- Location: `DiunaBI.Infrastructure/Helpers/GoogleDriveHelper.cs`
- Google Drive API v3 integration
- Methods: UploadFile, ListFiles, MoveFile

**GoogleAuthService / JwtTokenService**
- Authentication and token management
- JWT generation and validation

### UI Services

**EntityChangeHubService**
- Location: `DiunaBI.UI.Shared/Services/EntityChangeHubService.cs`
- Scoped service for the SignalR client connection (changed from singleton on Dec 6, 2025, so each user session has its own authenticated connection)
- Auto-reconnect enabled
- Event: `EntityChanged` - UI components subscribe for real-time updates
- Initialized after authentication completes, via the `AuthenticationStateChanged` event in MainLayout (no longer in `MainLayout.OnInitializedAsync`)

**LayerService / JobService / DataInboxService**
- HTTP clients for API communication
- DTO serialization/deserialization
- Paged result handling

**LayerFilterStateService / DataInboxFilterStateService**
- Persist filter state across navigation
- Singleton services remember search, type, and page selections

---
## DATABASE SCHEMA

**Total Migrations:** 47

**Latest Migrations:**

**Migration 47: RecordHistory (Dec 1, 2025)**
- **NEW Table: RecordHistory**
- Tracks all record changes (Created, Updated, Deleted)
- Fields: Id, RecordId, LayerId, ChangedAt, ChangedById, ChangeType, Code, Desc1, ChangedFields, ChangesSummary
- Indexes: IX_RecordHistory_RecordId_ChangedAt, IX_RecordHistory_LayerId_ChangedAt
- Foreign key: RecordHistory.ChangedById → Users.Id

**Migration 46: FixLayerDefaultValues (Nov 20, 2025)**
- Set default value: Layers.IsDeleted = false

**Migration 45: UpdateModel (Nov 19, 2025)**
- Added GETUTCDATE() defaults for all timestamp fields
- Changed foreign key constraints from CASCADE to RESTRICT:
  - Layers → Users (CreatedById, ModifiedById)
  - Records → Users (CreatedById, ModifiedById)
- Added FK_ProcessSources_Layers_LayerId

**Core Tables:**
- Users (authentication, audit)
- Layers (4 types, soft deletes, parent-child)
- Records (32 Value fields + Desc1, audit, soft deletes)
- RecordHistory (change tracking, field diffs, JSON summaries)
- QueueJobs (job queue, retry logic, status tracking)
- DataInbox (incoming data staging, base64 encoded)
- ProcessSources (layer relationships)

---
## PLUGIN SYSTEM

### Base Classes (Infrastructure/Plugins/)

**BaseDataImporter** (`DiunaBI.Infrastructure/Plugins/BaseDataImporter.cs`)

- Abstract base for all importers
- Methods: ImportAsync(layerId, jobId), ValidateConfiguration()
- Access: AppDbContext, PluginManager, GoogleSheetsHelper, GoogleDriveHelper

**BaseDataProcessor** (`DiunaBI.Infrastructure/Plugins/BaseDataProcessor.cs`)

- Abstract base for all processors
- Methods: ProcessAsync(layerId, jobId), ValidateConfiguration()
- Access: AppDbContext, PluginManager

**BaseDataExporter** (`DiunaBI.Infrastructure/Plugins/BaseDataExporter.cs`)

- Abstract base for all exporters
- Methods: ExportAsync(layerId, jobId), ValidateConfiguration()
- Access: AppDbContext, GoogleSheetsHelper, GoogleDriveHelper

### Morska Plugin (DiunaBI.Plugins.Morska)

**Importers (4):**

- MorskaStandardImporter - Generic CSV/Excel import
- MorskaD1Importer - D1 data format
- MorskaD3Importer - D3 data format
- MorskaFK2Importer - FK2 data format

**Processors (12):**

- MorskaD6Processor
- MorskaT1R1Processor
- MorskaT1R3Processor
- MorskaT3SingleSourceProcessor
- MorskaT3SourceYearSummaryProcessor
- MorskaT3MultiSourceSummaryProcessor
- MorskaT3MultiSourceYearSummaryProcessor
- MorskaT4R2Processor
- MorskaT4SingleSourceProcessor
- MorskaT5LastValuesProcessor
- MorskaT3MultiSourceCopySelectedCodesProcessor-TO_REMOVE (deprecated)
- MorskaT3MultiSourceCopySelectedCodesYearSummaryProcessor-TO_REMOVE (deprecated)

**Exporters (1):**

- googleSheet.export.cs - Google Sheets export

**Total:** ~6,566 lines of code

### PedrolloPL Plugin (DiunaBI.Plugins.PedrolloPL) - NEW

**Importers (1):**

- **PedrolloPLImportB3** (`DiunaBI.Plugins.PedrolloPL/Importers/PedrolloPLImportB3.cs`)
  - Imports B3 data from DataInbox
  - Uses the L1-D-B3-CODES dictionary layer for region code mapping
  - Creates 12 monthly records per region (Value1-Value12)
  - Generates Import layers: L{Number}-I-B3-{Year}-{Timestamp}
  - Handles base64 JSON data decoding
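The payload handling described above can be sketched as follows (Python for illustration; the payload shape and field names are assumptions — the real importer is C#):

```python
import base64
import json

def decode_inbox_payload(data_b64: str) -> list[dict]:
    """DataInbox stores the raw payload base64-encoded; decode, then parse JSON."""
    return json.loads(base64.b64decode(data_b64))

def to_monthly_record(region_code: str, monthly_values: list[float]) -> dict:
    """Spread 12 monthly values into Value1..Value12, one record per region."""
    assert len(monthly_values) == 12
    record = {"Code": region_code}
    for month, value in enumerate(monthly_values, start=1):
        record[f"Value{month}"] = value
    return record

# Round-trip demo with a hypothetical single-region payload.
payload = base64.b64encode(json.dumps(
    [{"region": "PL-MAZ", "values": [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]}]
).encode()).decode()

rows = decode_inbox_payload(payload)
record = to_monthly_record(rows[0]["region"], rows[0]["values"])
# record["Value12"] == 12
```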

---

## UI STRUCTURE (DiunaBI.UI.Shared)

### Reorganized Structure (Dec 5, 2025)

**Pages/** (Routable pages with @page directive)

```
Pages/
├── Layers/
│   ├── Index.razor + Index.razor.cs - /layers (list with filters, pagination)
│   └── Details.razor + Details.razor.cs - /layers/{id} (detail, edit, history)
├── Jobs/
│   ├── Index.razor + Index.razor.cs - /jobs (list with filters, real-time updates)
│   └── Details.razor - /jobs/{id} (detail, retry, cancel, real-time)
├── DataInbox/
│   ├── Index.razor + Index.razor.cs - /datainbox (list with filters)
│   └── Details.razor + Details.razor.cs - /datainbox/{id} (detail, base64 decode)
├── Dashboard.razor - /dashboard (user info)
├── Login.razor - /login (Google OAuth)
└── Index.razor - / (redirects to /dashboard)
```

**Components/** (Reusable components, no routes)

```
Components/
├── Layout/
│   ├── MainLayout.razor - Main app layout with drawer, nav menu
│   ├── EmptyLayout.razor - Minimal layout for login page
│   └── Routes.razor - Router configuration
└── Auth/
    ├── AuthGuard.razor - Authentication guard wrapper
    └── LoginCard.razor - Google login button component
```

**Navigation Menu:**

- Dashboard (/dashboard) - User profile
- Layers (/layers) - Layer management
- Data Inbox (/datainbox) - Incoming data review
- Jobs (/jobs) - Job queue monitoring (with real-time status updates)

**Code-Behind Pattern:**

- Complex pages (50+ lines of logic): separate `.razor.cs` files
- Simple pages: inline `@code` blocks
- Namespaces: `DiunaBI.UI.Shared.Pages.{Feature}`

---

## REAL-TIME FEATURES (SignalR)

### Architecture

**Hub:** `DiunaBI.API/Hubs/EntityChangeHub.cs`

- Endpoint: `/hubs/entitychanges`
- Method: `SendEntityChange(string module, string id, string operation)`
- Broadcasts: `EntityChanged` event to all connected clients

**Interceptor:** `DiunaBI.Infrastructure/Interceptors/EntityChangeInterceptor.cs`

- EF Core SaveChangesInterceptor
- Detects: Added, Modified, Deleted entities
- Broadcasts: after successful SaveChanges
- Modules: QueueJobs, Layers, Records, RecordHistory

**UI Service:** `DiunaBI.UI.Shared/Services/EntityChangeHubService.cs`

- Singleton initialized in MainLayout
- Auto-reconnect enabled
- Components subscribe: `HubService.EntityChanged += OnEntityChanged`

### Real-time Update Flow

1. User action → API endpoint
2. DbContext.SaveChangesAsync()
3. EntityChangeInterceptor captures the changes
4. SignalR broadcast to all clients: `EntityChanged(module, id, operation)`
5. UI components receive the event and refresh data
6. StateHasChanged() updates the UI

**Example:** Job status changes appear instantly on JobDetailPage and JobListPage
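The broadcast/subscribe flow above can be sketched generically — a plain Python observer pattern standing in for the SignalR hub and the `EntityChanged` handlers (not the actual hub API):

```python
class EntityChangeHub:
    """Minimal stand-in for the SignalR hub: interested components register a
    handler, and every saved change is fanned out to all of them."""

    def __init__(self):
        self._handlers = []

    def subscribe(self, handler):
        self._handlers.append(handler)

    def send_entity_change(self, module: str, entity_id: str, operation: str):
        for handler in self._handlers:
            handler(module, entity_id, operation)

hub = EntityChangeHub()
received = []
hub.subscribe(lambda m, i, op: received.append((m, i, op)))  # UI component side
hub.send_entity_change("QueueJobs", "42", "Modified")        # interceptor side
assert received == [("QueueJobs", "42", "Modified")]
```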

---

## JOB QUEUE SYSTEM

### Components

**Entity:** `QueueJob` (DiunaBI.Domain/Entities/QueueJob.cs)

- JobType: Import, Process
- JobStatus: Pending, Running, Completed, Failed, Retrying
- Priority: 0 = highest priority
- Retry: 30s → 2m → 5m delays, max 5 attempts

**Scheduler:** `JobSchedulerService`

- Reads Administration layer configs (Type=ImportWorker/ProcessWorker)
- Auto-creates jobs based on the layer.Records configuration
- API endpoints: `/jobs/schedule/{apiKey}`, `/jobs/schedule/imports/{apiKey}`, `/jobs/schedule/processes/{apiKey}`

**Worker:** `JobWorkerService` (BackgroundService)

- Polls every 10 seconds
- Executes via PluginManager
- Exponential backoff on failures
- Rate limiting for the Google API quota
- Real-time status updates via SignalR

**UI:** `Pages/Jobs/`

- Index.razor - Job list with filters, real-time updates
- Details.razor - Job detail with retry/cancel, real-time status

### Job Lifecycle

1. **Creation** - JobSchedulerService or manual via API
2. **Queued** - Status: Pending, sorted by Priority
3. **Execution** - JobWorkerService picks it up, Status: Running
4. **Completion** - Status: Completed or Failed
5. **Retry** - On failure, Status: Retrying with exponential backoff
6. **Real-time** - All status changes broadcast via SignalR

**Statistics Endpoint:** `GET /jobs/stats`

```json
{
  "pending": 5,
  "running": 2,
  "completed": 150,
  "failed": 3,
  "retrying": 1,
  "total": 161
}
```
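The stats payload is a count-by-status over the queue; a sketch of the aggregation (Python for illustration):

```python
from collections import Counter

def job_stats(statuses: list[str]) -> dict:
    """Aggregate job statuses into the /jobs/stats shape shown above."""
    counts = Counter(status.lower() for status in statuses)
    stats = {key: counts.get(key, 0)
             for key in ("pending", "running", "completed", "failed", "retrying")}
    stats["total"] = len(statuses)
    return stats

assert job_stats(["Pending", "Running", "Pending", "Completed"]) == {
    "pending": 2, "running": 1, "completed": 1, "failed": 0,
    "retrying": 0, "total": 4,
}
```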

---

## RECENT DEVELOPMENT

**Recent Commits (Dec 2-5, 2025):**

- **193127b:** SignalR for realtime entitychanges (Dec 4)
- **bf2beda, 942da18:** Build fixes (Dec 4)
- **a3fa8f9:** B3 import is working (Dec 4)
- **0e3b393:** WIP: b3 plugin (Dec 3)
- **445c07a:** Morska plugins refactor (Dec 2)
- **3f8e62f:** WIP: queue engine (Dec 2)
- **248106a:** Plugins little refactor (Dec 2)
- **587d4d6:** Pedrollo plugins (Dec 2)
- **e70a8dd:** Remember list filters (Dec 2)
- **89859cd:** Record history is working (Dec 1)

**Development Focus (Last 30 Days):**

1. ✅ Real-time updates (SignalR integration)
2. ✅ Job queue system (background worker, retry logic)
3. ✅ PedrolloPL plugin (B3 importer)
4. ✅ Record history tracking (audit trail)
5. ✅ UI reorganization (feature-based folders)
6. ✅ Plugin refactoring (base classes in Infrastructure)
7. ✅ Filter persistence (UI state management)

**Major Features Added:**

- SignalR real-time entity change notifications
- Background job processing with retry logic
- Record history with field-level diffs
- PedrolloPL B3 data importer
- UI reorganization (Pages/Layers, Pages/Jobs, Pages/DataInbox)
- Filter state persistence across sessions

---

## CONFIGURATION

**Key Settings (appsettings.Development.json):**

- ConnectionStrings:SQLDatabase - SQL Server (localhost:21433, DB: DiunaBI-PedrolloPL)
- JwtSettings:SecurityKey, ExpiryDays (7)
- GoogleAuth:ClientId, RedirectUri
- apiKey, apiUser, apiPass - DataInbox API security
- exportDirectory - Google Drive folder ID for exports
- apiLocalUrl - localhost:5400
- InstanceName - DEV/PROD environment identifier

**Logging Configuration:**

```json
"Serilog": {
  "MinimumLevel": {
    "Default": "Information",
    "Override": {
      "Microsoft.AspNetCore": "Warning",
      "Microsoft.EntityFrameworkCore.Database.Command": "Warning",
      "Microsoft.EntityFrameworkCore.Infrastructure": "Warning",
      "System.Net.Http.HttpClient": "Warning",
      "Google.Apis": "Warning",
      "DiunaBI.Core.Services.PluginManager": "Information"
    }
  }
}
```

**CORS Origins:**

- http://localhost:4200 (development)
- https://diuna.bim-it.pl (production)
- https://morska.diunabi.com (production)

---

## PATTERNS & ARCHITECTURE

**Design Patterns:**

- Clean Architecture (Domain → Application → Infrastructure → API)
- Plugin Pattern (dynamic loading, base classes, interface contracts)
- Interceptor Pattern (EF Core SaveChangesInterceptor for change tracking)
- Hub Pattern (SignalR for real-time notifications)
- Service Pattern (dependency injection throughout)
- Repository Pattern (EF Core DbContext as repository)
- Background Service Pattern (JobWorkerService for async processing)

**Tech Versions:**

- .NET 10.0 (upgraded from .NET 8.0)
- EF Core 10.0
- C# 13.0
- Blazor Server (net10.0)
- MAUI (net10.0-ios/android/windows/macos)
- MudBlazor 8.0

**Architectural Decisions:**

- Plugin base classes in Infrastructure for reusability
- SignalR for real-time updates (no polling)
- Background service for job processing (no external scheduler)
- Soft deletes with audit trails
- Foreign key RESTRICT to prevent accidental cascades
- Feature-based folder structure in the UI

---

## QUICK REFERENCE

**Database:**

- SQL Server with 47 EF Core migrations
- Auto-timestamps via GETUTCDATE() defaults
- Soft deletes (IsDeleted flag)
- Audit trails (CreatedBy, ModifiedBy, RecordHistory table)

**Build Process:**

- MSBuild target copies plugin DLLs to `bin/Plugins/` after build
- Plugins: DiunaBI.Plugins.Morska.dll, DiunaBI.Plugins.PedrolloPL.dll

**SignalR:**

- Hub: `/hubs/entitychanges`
- Broadcasts: `EntityChanged(module, id, operation)`
- Auto-reconnect enabled in the UI
- Real-time updates for QueueJobs, Layers, Records

**Job Queue:**

- Auto-scheduling from layer configs (Type=ImportWorker/ProcessWorker)
- Background processing every 10 seconds
- Retry logic: 30s → 2m → 5m (max 5 retries)
- Priority-based execution (0 = highest)
- Real-time status updates via SignalR

**Plugins:**

- **Morska:** 4 importers, 12 processors, 1 exporter (~6,566 LOC)
- **PedrolloPL:** 1 importer (B3 data)
- Base classes: BaseDataImporter, BaseDataProcessor, BaseDataExporter
- Dynamic loading from `bin/Plugins/` at startup

**UI Structure:**

- Feature-based folders: Pages/Layers, Pages/Jobs, Pages/DataInbox
- Separate code-behind for complex logic (.razor.cs files)
- Inline @code for simple pages
- Organized components: Layout/, Auth/
- Filter state persistence across navigation

---

## FILE PATHS REFERENCE

**Key Configuration:**

- API: `/Users/mz/Projects/Diuna/DiunaBI/DiunaBI.API/appsettings.json`
- API Startup: `/Users/mz/Projects/Diuna/DiunaBI/DiunaBI.API/Program.cs`

**SignalR:**

- Hub: `/Users/mz/Projects/Diuna/DiunaBI/DiunaBI.API/Hubs/EntityChangeHub.cs`
- Interceptor: `/Users/mz/Projects/Diuna/DiunaBI/DiunaBI.Infrastructure/Interceptors/EntityChangeInterceptor.cs`
- UI Service: `/Users/mz/Projects/Diuna/DiunaBI/DiunaBI.UI.Shared/Services/EntityChangeHubService.cs`

**Job System:**

- Controller: `/Users/mz/Projects/Diuna/DiunaBI/DiunaBI.API/Controllers/JobsController.cs`
- Scheduler: `/Users/mz/Projects/Diuna/DiunaBI/DiunaBI.Infrastructure/Services/JobSchedulerService.cs`
- Worker: `/Users/mz/Projects/Diuna/DiunaBI/DiunaBI.Infrastructure/Services/JobWorkerService.cs`
- UI Pages: `/Users/mz/Projects/Diuna/DiunaBI/DiunaBI.UI.Shared/Pages/Jobs/`

**Plugins:**

- Base Classes: `/Users/mz/Projects/Diuna/DiunaBI/DiunaBI.Infrastructure/Plugins/`
- Morska: `/Users/mz/Projects/Diuna/DiunaBI/DiunaBI.Plugins.Morska/`
- PedrolloPL: `/Users/mz/Projects/Diuna/DiunaBI/DiunaBI.Plugins.PedrolloPL/`

**Migrations:**

- Latest: `/Users/mz/Projects/Diuna/DiunaBI/DiunaBI.Infrastructure/Migrations/20251201165810_RecordHistory.cs`

**UI Components:**

- Pages: `/Users/mz/Projects/Diuna/DiunaBI/DiunaBI.UI.Shared/Pages/`
- Components: `/Users/mz/Projects/Diuna/DiunaBI/DiunaBI.UI.Shared/Components/`
- Services: `/Users/mz/Projects/Diuna/DiunaBI/DiunaBI.UI.Shared/Services/`
@@ -13,6 +13,13 @@ concurrency:
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        customer:
          - name: Morska
            plugin_project: DiunaBI.Plugins.Morska
          - name: PedrolloPL
            plugin_project: DiunaBI.Plugins.PedrolloPL
    steps:
      - name: Checkout
        uses: https://github.com/actions/checkout@v4
@@ -25,22 +32,23 @@ jobs:
      - name: Restore dependencies
        working-directory: .
        run: |
          dotnet restore ${{ matrix.customer.plugin_project }}/${{ matrix.customer.plugin_project }}.csproj
          dotnet restore DiunaBI.API/DiunaBI.API.csproj
          dotnet restore DiunaBI.UI.Web/DiunaBI.UI.Web.csproj
          dotnet restore DiunaBI.Plugins.Morska/DiunaBI.Plugins.Morska.csproj
          dotnet restore DiunaBI.Tests/DiunaBI.Tests.csproj

      - name: Build solution and prepare plugins
        working-directory: .
        run: |
          set -e
          # Build only required projects — skip DiunaBI.UI.Mobile
          dotnet build DiunaBI.API/DiunaBI.API.csproj --configuration Release
          dotnet build DiunaBI.UI.Web/DiunaBI.UI.Web.csproj --configuration Release
          dotnet build DiunaBI.Plugins.Morska/DiunaBI.Plugins.Morska.csproj --configuration Release
          # Build plugin first to avoid missing dependency issues
          dotnet build ${{ matrix.customer.plugin_project }}/${{ matrix.customer.plugin_project }}.csproj --configuration Release --no-restore
          # Skip automatic plugin copy in API build since we only have one plugin restored
          dotnet build DiunaBI.API/DiunaBI.API.csproj --configuration Release --no-restore -p:SkipPluginCopy=true
          dotnet build DiunaBI.UI.Web/DiunaBI.UI.Web.csproj --configuration Release --no-restore

          mkdir -p DiunaBI.Tests/bin/Release/net10.0/Plugins
          cp DiunaBI.Plugins.Morska/bin/Release/net10.0/DiunaBI.Plugins.Morska.dll DiunaBI.Tests/bin/Release/net10.0/Plugins/ || true
          cp ${{ matrix.customer.plugin_project }}/bin/Release/net10.0/${{ matrix.customer.plugin_project }}.dll DiunaBI.Tests/bin/Release/net10.0/Plugins/ || true
          ls -la DiunaBI.Tests/bin/Release/net10.0/Plugins/ || true

      - name: Run Tests
@@ -49,7 +57,7 @@ jobs:
          dotnet test DiunaBI.Tests/DiunaBI.Tests.csproj \
            --configuration Release \
            --no-restore \
            --logger "trx;LogFileName=test-results.trx" \
            --logger "trx;LogFileName=test-results-${{ matrix.customer.name }}.trx" \
            --collect:"XPlat Code Coverage" \
            --filter "Category!=LocalOnly" || true

@@ -57,7 +65,7 @@ jobs:
        uses: https://github.com/actions/upload-artifact@v3
        if: success() || failure()
        with:
          name: test-results
          name: test-results-${{ matrix.customer.name }}
          path: |
            DiunaBI.Tests/TestResults/*.trx
            DiunaBI.Tests/TestResults/**/coverage.cobertura.xml
@@ -67,6 +75,15 @@ jobs:
    runs-on: ubuntu-latest
    needs: test
    if: success() || failure()
    strategy:
      matrix:
        customer:
          - name: Morska
            plugin_project: DiunaBI.Plugins.Morska
            image_suffix: morska
          - name: PedrolloPL
            plugin_project: DiunaBI.Plugins.PedrolloPL
            image_suffix: pedrollopl

    steps:
      - name: Debug secrets
@@ -93,9 +110,10 @@ jobs:
          docker buildx build \
            --platform linux/amd64 \
            --label "org.opencontainers.image.source=https://code.bim-it.pl/mz/DiunaBI" \
            --build-arg PLUGIN_PROJECT=${{ matrix.customer.plugin_project }} \
            -f DiunaBI.API/Dockerfile \
            -t code.bim-it.pl/mz/diunabi-api:latest \
            -t code.bim-it.pl/mz/diunabi-api:build-${{ github.run_id }} \
            -t code.bim-it.pl/mz/diunabi-api-${{ matrix.customer.image_suffix }}:latest \
            -t code.bim-it.pl/mz/diunabi-api-${{ matrix.customer.image_suffix }}:build-${{ github.run_id }} \
            --push \
            .

@@ -106,25 +124,26 @@ jobs:
            --platform linux/amd64 \
            --label "org.opencontainers.image.source=https://code.bim-it.pl/mz/DiunaBI" \
            -f DiunaBI.UI.Web/Dockerfile \
            -t code.bim-it.pl/mz/diunabi-ui:latest \
            -t code.bim-it.pl/mz/diunabi-ui:build-${{ github.run_id }} \
            -t code.bim-it.pl/mz/diunabi-ui-${{ matrix.customer.image_suffix }}:latest \
            -t code.bim-it.pl/mz/diunabi-ui-${{ matrix.customer.image_suffix }}:build-${{ github.run_id }} \
            --push \
            .

      - name: Output build info
        run: |
          echo "## 🐳 Docker Images Built" >> $GITHUB_STEP_SUMMARY
          echo "## 🐳 Docker Images Built - ${{ matrix.customer.name }}" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "**Build ID:** ${{ github.run_id }}" >> $GITHUB_STEP_SUMMARY
          echo "**Commit:** ${{ github.sha }}" >> $GITHUB_STEP_SUMMARY
          echo "**Customer:** ${{ matrix.customer.name }}" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "### Images pushed:" >> $GITHUB_STEP_SUMMARY
          echo '```bash' >> $GITHUB_STEP_SUMMARY
          echo "# Latest (for release)" >> $GITHUB_STEP_SUMMARY
          echo "docker pull code.bim-it.pl/mz/diunabi-api:latest" >> $GITHUB_STEP_SUMMARY
          echo "docker pull code.bim-it.pl/mz/diunabi-ui:latest" >> $GITHUB_STEP_SUMMARY
          echo "docker pull code.bim-it.pl/mz/diunabi-api-${{ matrix.customer.image_suffix }}:latest" >> $GITHUB_STEP_SUMMARY
          echo "docker pull code.bim-it.pl/mz/diunabi-ui-${{ matrix.customer.image_suffix }}:latest" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "# Specific build (for rollback)" >> $GITHUB_STEP_SUMMARY
          echo "docker pull code.bim-it.pl/mz/diunabi-api:build-${{ github.run_id }}" >> $GITHUB_STEP_SUMMARY
          echo "docker pull code.bim-it.pl/mz/diunabi-ui:build-${{ github.run_id }}" >> $GITHUB_STEP_SUMMARY
          echo "docker pull code.bim-it.pl/mz/diunabi-api-${{ matrix.customer.image_suffix }}:build-${{ github.run_id }}" >> $GITHUB_STEP_SUMMARY
          echo "docker pull code.bim-it.pl/mz/diunabi-ui-${{ matrix.customer.image_suffix }}:build-${{ github.run_id }}" >> $GITHUB_STEP_SUMMARY
          echo '```' >> $GITHUB_STEP_SUMMARY
7
.gitignore
vendored
@@ -563,3 +563,10 @@ coverage/
##
tmp/
temp/

##
## LocalDB Development Files
##
DevTools/LocalDB/backups/*.bak
DevTools/LocalDB/backups/*.bacpac
DevTools/LocalDB/data/
11
.vscode/launch.json
vendored
@@ -30,17 +30,6 @@
      },
      "env": {
        "ASPNETCORE_ENVIRONMENT": "Development"
      },
      "launchBrowser": {
        "enabled": true,
        "args": "${auto-detect-url}",
        "browser": [
          {
            "osx": "Google Chrome",
            "linux": "chrome",
            "windows": "chrome"
          }
        ]
      }
    }
  ]
@@ -1,9 +1 @@
PUT https://pedrollopl.diunabi.com/api/DataInbox/Add/8kL2mN4pQ6rojshf8704i34p4eim1hs
Content-Type: application/json
Authorization: Basic cGVkcm9sbG9wbDo0MjU4dlc2eFk4TjRwUQ==

{
  "Source": "morska.import",
  "Name": "morska.d3.importer",
  "Data": "eyJrZXkiOiAidmFsdWUifQ=="
}
POST http://localhost:5400/jobs/schedule/10763478CB738D4ecb2h76g803478CB738D4e
101
DevTools/sql-scripts/PedrolloPL/PedrolloImport.sql
Normal file
@@ -0,0 +1,101 @@
DECLARE @JustForDebug TINYINT = 0;

-- FIX DATAINBOX!

-- SETUP VARIABLES
DECLARE @Year INT = 2025;
DECLARE @Type NVARCHAR(5) = 'B3';
DECLARE @StartDate NVARCHAR(10) = '2025.01.02';
DECLARE @EndDate NVARCHAR(10) = '2026.12.31'

DECLARE @Number INT = (SELECT COUNT(id) + 1 FROM [DiunaBI-PedrolloPL].[dbo].[Layers]);
DECLARE @CurrentTimestamp NVARCHAR(14) = FORMAT(GETDATE(), 'yyyyMMddHHmm');
DECLARE @Name NVARCHAR(50) = CONCAT(
    'L', @Number, '-A-IW_', @Type, '-', @Year,'-', @CurrentTimestamp
);
DECLARE @Plugin NVARCHAR(100);
SET @Plugin =
    CASE @Type
        WHEN 'B3' THEN 'PedrolloPL.Import.B3'
        ELSE NULL -- If @Type doesn't match, set it to NULL
    END;

DECLARE @DataInboxName NVARCHAR(100);
SET @DataInboxName =
    CASE @Type
        WHEN 'B3' THEN 'P2_2025'
        ELSE NULL -- If @Type doesn't match, set it to NULL
    END;

DECLARE @DataInboxSource NVARCHAR(100);
SET @DataInboxSource =
    CASE @Type
        WHEN 'B3' THEN 'Comarch'
        ELSE NULL -- If @Type doesn't match, set it to NULL
    END;

DECLARE @LayerId UNIQUEIDENTIFIER = NEWID();

SELECT @Name AS Name, @StartDate AS StartDate, @EndDate AS EndDate, @Type AS Type, @Year AS Year, @Plugin AS Plugin,
       @DataInboxName AS DataInboxName, @DataInboxSource AS DataInboxSource;

IF @JustForDebug = 1
BEGIN
    SELECT 'Just for debug' AS Logger;
    RETURN;
END;

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Layers]
([Id], [Number], [Name], [CreatedAt], [ModifiedAt], [IsDeleted], [IsCancelled], [CreatedById], [ModifiedById], [Type])
VALUES (@LayerId, @Number, @Name, GETDATE(), GETDATE(), 0, 0, '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 2);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'StartDate', @StartDate, GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'EndDate', @EndDate, GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'Source', 'DataInbox', GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'ImportName', @Type, GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'ImportYear', @Year, GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'Type', 'ImportWorker', GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'Plugin', @Plugin, GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'IsEnabled', 'True', GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'DataInboxName', @DataInboxName, GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'DataInboxSource', @DataInboxSource, GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'Priority', '10', GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'MaxRetries', '3', GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);
71
DevTools/sql-scripts/PedrolloPL/PedrolloProcessP2.sql
Normal file
@@ -0,0 +1,71 @@
DECLARE @JustForDebug TINYINT = 0;

-- SETUP VARIABLES
DECLARE @Year INT = 2025;

DECLARE @Number INT = (SELECT COUNT(id) + 1 FROM [DiunaBI-PedrolloPL].[dbo].[Layers]);
DECLARE @CurrentTimestamp NVARCHAR(14) = FORMAT(GETDATE(), 'yyyyMMddHHmm');
DECLARE @Name NVARCHAR(50) = CONCAT(
    'L', @Number, '-A-PW_P2-', @Year, '-', @CurrentTimestamp
);
DECLARE @SourceNameFilter NVARCHAR(50) = CONCAT('%-A-IW_B3', '-', @Year, '-%');
DECLARE @SourceLayer NVARCHAR(50) = (SELECT TOP 1 [Name] FROM [DiunaBI-PedrolloPL].[dbo].[Layers] WHERE [Name] LIKE @SourceNameFilter);
IF @SourceLayer IS NULL
BEGIN
    SELECT 'SourceLayer is NULL' AS Logger;
    RETURN;
END;
DECLARE @LayerId UNIQUEIDENTIFIER = NEWID();

SELECT @Name AS Name, @SourceLayer AS SourceLayer;

IF @JustForDebug = 1
BEGIN
    SELECT 'Just for debug' AS Logger;
    RETURN;
END;

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Layers]
([Id], [Number], [Name], [CreatedAt], [ModifiedAt], [IsDeleted], [IsCancelled], [CreatedById], [ModifiedById], [Type])
VALUES (@LayerId, @Number, @Name, GETDATE(), GETDATE(), 0, 0, '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 2);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'Source', 'B3', GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'SourceLayer', @SourceLayer, GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'Type', 'ProcessWorker', GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'IsEnabled', 'True', GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'Year', @Year, GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'Plugin', 'PedrolloPL.Process.P2', GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'Priority', '110', GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);
--
INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'GoogleSheetId', '1jI-3QrlBADm5slEl2Balf29cKmHwkYi4pboaHY-gRqc', GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'GoogleSheetTab', 'P2_Export_DiunaBI', GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
VALUES ((SELECT NEWID()), 'GoogleSheetRange', 'C32:O48', GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);
@@ -2,9 +2,9 @@
 DECLARE @JustForDebug TINYINT = 0;

 -- SETUP VARIABLES
-DECLARE @Number INT = (SELECT COUNT(id) + 1 FROM [diunabi-morska].[dbo].[Layers]);
+DECLARE @Number INT = (SELECT COUNT(id) + 1 FROM [DiunaBI-PedrolloPL].[dbo].[Layers]);
 DECLARE @Name NVARCHAR(50) = CONCAT(
-    'L', @Number, '-D-D6-SELL-CODES'
+    'L', @Number, 'D-P2-CODES'
 );
 DECLARE @LayerId UNIQUEIDENTIFIER = NEWID();

@@ -16,7 +16,7 @@ BEGIN
     RETURN;
 END;

-INSERT INTO [diunabi-morska].[dbo].[Layers]
+INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Layers]
 ([Id], [Number], [Name], [CreatedAt], [ModifiedAt], [IsDeleted], [CreatedById], [ModifiedById], [Type])
 VALUES (@LayerId, @Number, @Name, GETDATE(), GETDATE(), 0, '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 3);

@@ -27,16 +27,23 @@ DECLARE @Array TABLE (

 INSERT INTO @Array (Code, Desc1)
 VALUES
-('1002', '1102'),
-('1003','1202'),
-('1008','1302'),
-('1009','1302'),
-('9085','1203'),
-('1010','1304'),
-('9086','1005'),
-('1021','1206'),
-('9089','1207'),
-('9091','1208')
+('01','<nieznany>'),
+('02','DOLNOŚLĄSKIE'),
+('03','KUJAWSKO-POMORSKIE'),
+('04','LUBELSKIE'),
+('05','LUBUSKIE'),
+('06','ŁÓDZKIE'),
+('07','MAŁOPOLSKIE'),
+('08','MAZOWIECKIE'),
+('09','OPOLSKIE'),
+('10','PODKARPACKIE'),
+('11','PODLASKIE'),
+('12','POMORSKIE'),
+('13','ŚLĄSKIE'),
+('14','ŚWIĘTOKRZYSKIE'),
+('15','WARMIŃSKO-MAZURSKIE'),
+('16','WIELKOPOLSKIE'),
+('17','ZACHODNIOPOMORSKIE');

 -- Loop through the array and insert into the target table
 DECLARE @Code NVARCHAR(50);

@@ -51,7 +58,7 @@ FETCH NEXT FROM CursorArray INTO @Code, @Desc1;

 WHILE @@FETCH_STATUS = 0
 BEGIN
-    INSERT INTO [diunabi-morska].[dbo].[Records]
+    INSERT INTO [DiunaBI-PedrolloPL].[dbo].[Records]
     ([Id], [Code], [Desc1], [CreatedAt], [ModifiedAt], [CreatedById], [ModifiedById], [IsDeleted], [LayerId])
     VALUES (NEWID(), @Code, @Desc1, GETDATE(), GETDATE(), '117be4f0-b5d1-41a1-a962-39dc30cce368', '117be4f0-b5d1-41a1-a962-39dc30cce368', 0, @LayerId);

63	DiunaBI.API/Attributes/ApiKeyAuthAttribute.cs	Normal file
@@ -0,0 +1,63 @@
using System.Security.Cryptography;
using System.Text;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Filters;

namespace DiunaBI.API.Attributes;

/// <summary>
/// Authorization attribute that validates the API key from the X-API-Key header.
/// Uses constant-time comparison to prevent timing attacks.
/// </summary>
[AttributeUsage(AttributeTargets.Class | AttributeTargets.Method)]
public class ApiKeyAuthAttribute : Attribute, IAuthorizationFilter
{
    private const string ApiKeyHeaderName = "X-API-Key";

    public void OnAuthorization(AuthorizationFilterContext context)
    {
        var configuration = context.HttpContext.RequestServices.GetRequiredService<IConfiguration>();
        var logger = context.HttpContext.RequestServices.GetRequiredService<ILogger<ApiKeyAuthAttribute>>();

        // Get expected API key from configuration
        var expectedApiKey = configuration["apiKey"];
        if (string.IsNullOrEmpty(expectedApiKey))
        {
            logger.LogError("API key not configured in appsettings");
            context.Result = new StatusCodeResult(StatusCodes.Status500InternalServerError);
            return;
        }

        // Get API key from header
        if (!context.HttpContext.Request.Headers.TryGetValue(ApiKeyHeaderName, out var extractedApiKey))
        {
            logger.LogWarning("API key missing from request header");
            context.Result = new UnauthorizedObjectResult(new { error = "API key is required" });
            return;
        }

        // Constant-time comparison to prevent timing attacks
        if (!IsApiKeyValid(extractedApiKey!, expectedApiKey))
        {
            logger.LogWarning("Invalid API key provided from {RemoteIp}", context.HttpContext.Connection.RemoteIpAddress);
            context.Result = new UnauthorizedObjectResult(new { error = "Invalid API key" });
            return;
        }

        // API key is valid - allow the request to proceed
    }

    /// <summary>
    /// Constant-time string comparison to prevent timing attacks.
    /// </summary>
    private static bool IsApiKeyValid(string providedKey, string expectedKey)
    {
        if (providedKey == null || expectedKey == null)
            return false;

        var providedBytes = Encoding.UTF8.GetBytes(providedKey);
        var expectedBytes = Encoding.UTF8.GetBytes(expectedKey);

        return CryptographicOperations.FixedTimeEquals(providedBytes, expectedBytes);
    }
}
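The `FixedTimeEquals` call above compares every byte regardless of where the first mismatch occurs, so response timing reveals nothing about how much of the key was correct. The same idea can be sketched with Python's standard-library `hmac.compare_digest` (an illustration only, not part of this repository):

```python
import hmac

def is_api_key_valid(provided: str, expected: str) -> bool:
    # Constant-time comparison, analogous to CryptographicOperations.FixedTimeEquals:
    # the runtime does not depend on the position of the first differing byte.
    return hmac.compare_digest(provided.encode("utf-8"), expected.encode("utf-8"))

print(is_api_key_valid("s3cret", "s3cret"))  # True
print(is_api_key_valid("s3cret", "s3creT"))  # False
```

A naive `provided == expected` short-circuits at the first mismatch, which is exactly the leak both helpers avoid.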
@@ -2,6 +2,7 @@ using DiunaBI.API.Services;
 using DiunaBI.Domain.Entities;
 using Microsoft.AspNetCore.Authorization;
 using Microsoft.AspNetCore.Mvc;
+using Microsoft.AspNetCore.RateLimiting;

 namespace DiunaBI.API.Controllers;

@@ -15,6 +16,7 @@ public class AuthController(
     : ControllerBase
 {
     [HttpPost("apiToken")]
+    [EnableRateLimiting("auth")]
     public async Task<IActionResult> ApiToken([FromBody] string idToken)
     {
         try

@@ -64,11 +64,21 @@ public class DataInboxController : Controller
             }

             // check if datainbox.data is base64 encoded value
-            if (!string.IsNullOrEmpty(dataInbox.Data) && !IsBase64String(dataInbox.Data))
+            if (!string.IsNullOrEmpty(dataInbox.Data))
             {
+                // Limit data size to 10MB to prevent DoS
+                if (dataInbox.Data.Length > 10_000_000)
+                {
+                    _logger.LogWarning("DataInbox: Data too large for source {Source}, size {Size}", dataInbox.Source, dataInbox.Data.Length);
+                    return BadRequest("Data too large (max 10MB)");
+                }
+
+                if (!IsBase64String(dataInbox.Data))
+                {
                     _logger.LogWarning("DataInbox: Invalid data format - not base64 encoded for source {Source}", dataInbox.Source);
                     return BadRequest("Invalid data format - not base64 encoded");
+                }
             }

             dataInbox.Id = Guid.NewGuid();
             dataInbox.CreatedAt = DateTime.UtcNow;

@@ -87,7 +97,7 @@ public class DataInboxController : Controller
         catch (Exception e)
         {
             _logger.LogError(e, "DataInbox: Insert error for source {Source}, name {Name}", dataInbox.Source, dataInbox.Name);
-            return BadRequest(e.ToString());
+            return BadRequest("An error occurred processing your request");
         }
     }

@@ -97,6 +107,16 @@ public class DataInboxController : Controller
     {
         try
         {
+            // Validate pagination parameters
+            if (limit <= 0 || limit > 1000)
+            {
+                return BadRequest("Limit must be between 1 and 1000");
+            }
+            if (start < 0)
+            {
+                return BadRequest("Start must be non-negative");
+            }
+
             var query = _db.DataInbox.AsQueryable();

             if (!string.IsNullOrEmpty(search))

@@ -137,7 +157,7 @@ public class DataInboxController : Controller
         catch (Exception e)
         {
             _logger.LogError(e, "GetAll: Error retrieving data inbox items");
-            return BadRequest(e.ToString());
+            return BadRequest("An error occurred processing your request");
         }
     }

@@ -172,7 +192,7 @@ public class DataInboxController : Controller
         catch (Exception e)
         {
             _logger.LogError(e, "Get: Error retrieving data inbox item {Id}", id);
-            return BadRequest(e.ToString());
+            return BadRequest("An error occurred processing your request");
         }
     }

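The new guard rejects oversized payloads before attempting base64 validation, so a hostile 100MB string is refused with one cheap length check. A rough Python equivalent of that ordering (a hypothetical helper for illustration, not code from the repository):

```python
import base64
import binascii
from typing import Optional

MAX_CHARS = 10_000_000  # mirrors the controller's 10MB guard on string length

def validate_inbox_data(data: str) -> Optional[str]:
    """Return an error message, or None when the payload is acceptable."""
    if not data:
        return None  # empty payload passes through, as in the controller
    if len(data) > MAX_CHARS:
        return "Data too large (max 10MB)"  # cheap check first
    try:
        # validate=True rejects any character outside the base64 alphabet
        base64.b64decode(data, validate=True)
    except binascii.Error:
        return "Invalid data format - not base64 encoded"
    return None

print(validate_inbox_data("aGVsbG8="))      # None
print(validate_inbox_data("not base64!!"))  # Invalid data format - not base64 encoded
```

Checking length before decoding keeps the worst-case cost of a rejected request proportional to nothing more than a string-length read.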
507	DiunaBI.API/Controllers/JobsController.cs	Normal file
@@ -0,0 +1,507 @@
using DiunaBI.API.Attributes;
using DiunaBI.Application.DTOModels.Common;
using DiunaBI.Domain.Entities;
using DiunaBI.Infrastructure.Data;
using DiunaBI.Infrastructure.Services;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;

namespace DiunaBI.API.Controllers;

[Authorize]
[ApiController]
[Route("[controller]")]
public class JobsController : Controller
{
    private readonly AppDbContext _db;
    private readonly JobSchedulerService _jobScheduler;
    private readonly IConfiguration _configuration;
    private readonly ILogger<JobsController> _logger;

    public JobsController(
        AppDbContext db,
        JobSchedulerService jobScheduler,
        IConfiguration configuration,
        ILogger<JobsController> logger)
    {
        _db = db;
        _jobScheduler = jobScheduler;
        _configuration = configuration;
        _logger = logger;
    }

    [HttpGet]
    [Route("")]
    public async Task<IActionResult> GetAll(
        [FromQuery] int start = 0,
        [FromQuery] int limit = 50,
        [FromQuery] List<JobStatus>? statuses = null,
        [FromQuery] JobType? jobType = null,
        [FromQuery] Guid? layerId = null)
    {
        try
        {
            // Validate pagination parameters
            if (limit <= 0 || limit > 1000)
            {
                return BadRequest("Limit must be between 1 and 1000");
            }
            if (start < 0)
            {
                return BadRequest("Start must be non-negative");
            }

            var query = _db.QueueJobs.AsQueryable();

            if (statuses != null && statuses.Count > 0)
            {
                query = query.Where(j => statuses.Contains(j.Status));
            }

            if (jobType.HasValue)
            {
                query = query.Where(j => j.JobType == jobType.Value);
            }

            if (layerId.HasValue)
            {
                query = query.Where(j => j.LayerId == layerId.Value);
            }

            var totalCount = await query.CountAsync();

            // Sort by: CreatedAt DESC (newest first), then Priority ASC (0=highest)
            var items = await query
                .OrderByDescending(j => j.CreatedAt)
                .ThenBy(j => j.Priority)
                .Skip(start)
                .Take(limit)
                .AsNoTracking()
                .ToListAsync();

            var pagedResult = new PagedResult<QueueJob>
            {
                Items = items,
                TotalCount = totalCount,
                Page = (start / limit) + 1,
                PageSize = limit
            };

            _logger.LogDebug("GetAll: Retrieved {Count} of {TotalCount} jobs", items.Count, totalCount);

            return Ok(pagedResult);
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "GetAll: Error retrieving jobs");
            return BadRequest("An error occurred while retrieving jobs");
        }
    }

    [HttpGet]
    [Route("{id:guid}")]
    public async Task<IActionResult> Get(Guid id)
    {
        try
        {
            var job = await _db.QueueJobs
                .AsNoTracking()
                .FirstOrDefaultAsync(j => j.Id == id);

            if (job == null)
            {
                _logger.LogWarning("Get: Job {JobId} not found", id);
                return NotFound("Job not found");
            }

            _logger.LogDebug("Get: Retrieved job {JobId}", id);
            return Ok(job);
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "Get: Error retrieving job {JobId}", id);
            return BadRequest("An error occurred processing your request");
        }
    }

    [HttpPost]
    [Route("schedule")]
    [AllowAnonymous] // Bypass controller-level [Authorize] to allow API key auth
    [ApiKeyAuth]
    public async Task<IActionResult> ScheduleJobs([FromQuery] string? nameFilter = null)
    {
        try
        {
            var jobsCreated = await _jobScheduler.ScheduleAllJobsAsync(nameFilter);

            _logger.LogInformation("ScheduleJobs: Created {Count} jobs", jobsCreated);

            return Ok(new
            {
                success = true,
                jobsCreated,
                message = $"Successfully scheduled {jobsCreated} jobs"
            });
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "ScheduleJobs: Error scheduling jobs");
            return BadRequest("An error occurred processing your request");
        }
    }

    [HttpPost]
    [Route("schedule/imports")]
    [AllowAnonymous] // Bypass controller-level [Authorize] to allow API key auth
    [ApiKeyAuth]
    public async Task<IActionResult> ScheduleImportJobs([FromQuery] string? nameFilter = null)
    {
        try
        {
            var jobsCreated = await _jobScheduler.ScheduleImportJobsAsync(nameFilter);

            _logger.LogInformation("ScheduleImportJobs: Created {Count} import jobs", jobsCreated);

            return Ok(new
            {
                success = true,
                jobsCreated,
                message = $"Successfully scheduled {jobsCreated} import jobs"
            });
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "ScheduleImportJobs: Error scheduling import jobs");
            return BadRequest("An error occurred processing your request");
        }
    }

    [HttpPost]
    [Route("schedule/processes")]
    [AllowAnonymous] // Bypass controller-level [Authorize] to allow API key auth
    [ApiKeyAuth]
    public async Task<IActionResult> ScheduleProcessJobs()
    {
        try
        {
            var jobsCreated = await _jobScheduler.ScheduleProcessJobsAsync();

            _logger.LogInformation("ScheduleProcessJobs: Created {Count} process jobs", jobsCreated);

            return Ok(new
            {
                success = true,
                jobsCreated,
                message = $"Successfully scheduled {jobsCreated} process jobs"
            });
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "ScheduleProcessJobs: Error scheduling process jobs");
            return BadRequest("An error occurred processing your request");
        }
    }

    // UI-friendly endpoints (JWT auth)
    [HttpPost]
    [Route("ui/schedule")]
    public async Task<IActionResult> ScheduleJobsUI([FromQuery] string? nameFilter = null)
    {
        try
        {
            var jobsCreated = await _jobScheduler.ScheduleAllJobsAsync(nameFilter);

            _logger.LogInformation("ScheduleJobsUI: Created {Count} jobs by user {UserId}", jobsCreated, User.Identity?.Name);

            return Ok(new
            {
                success = true,
                jobsCreated,
                message = $"Successfully scheduled {jobsCreated} jobs"
            });
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "ScheduleJobsUI: Error scheduling jobs");
            return BadRequest("An error occurred processing your request");
        }
    }

    [HttpPost]
    [Route("ui/schedule/imports")]
    public async Task<IActionResult> ScheduleImportJobsUI([FromQuery] string? nameFilter = null)
    {
        try
        {
            var jobsCreated = await _jobScheduler.ScheduleImportJobsAsync(nameFilter);

            _logger.LogInformation("ScheduleImportJobsUI: Created {Count} import jobs by user {UserId}", jobsCreated, User.Identity?.Name);

            return Ok(new
            {
                success = true,
                jobsCreated,
                message = $"Successfully scheduled {jobsCreated} import jobs"
            });
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "ScheduleImportJobsUI: Error scheduling import jobs");
            return BadRequest("An error occurred processing your request");
        }
    }

    [HttpPost]
    [Route("ui/schedule/processes")]
    public async Task<IActionResult> ScheduleProcessJobsUI()
    {
        try
        {
            var jobsCreated = await _jobScheduler.ScheduleProcessJobsAsync();

            _logger.LogInformation("ScheduleProcessJobsUI: Created {Count} process jobs by user {UserId}", jobsCreated, User.Identity?.Name);

            return Ok(new
            {
                success = true,
                jobsCreated,
                message = $"Successfully scheduled {jobsCreated} process jobs"
            });
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "ScheduleProcessJobsUI: Error scheduling process jobs");
            return BadRequest("An error occurred processing your request");
        }
    }

    [HttpPost]
    [Route("{id:guid}/retry")]
    public async Task<IActionResult> RetryJob(Guid id)
    {
        try
        {
            var job = await _db.QueueJobs.FirstOrDefaultAsync(j => j.Id == id);

            if (job == null)
            {
                _logger.LogWarning("RetryJob: Job {JobId} not found", id);
                return NotFound("Job not found");
            }

            if (job.Status != JobStatus.Failed)
            {
                _logger.LogWarning("RetryJob: Job {JobId} is not in Failed status (current: {Status})", id, job.Status);
                return BadRequest($"Job is not in Failed status (current: {job.Status})");
            }

            job.Status = JobStatus.Pending;
            job.RetryCount = 0;
            job.LastError = null;
            job.ModifiedAt = DateTime.UtcNow;
            job.ModifiedById = DiunaBI.Domain.Entities.User.AutoImportUserId;

            await _db.SaveChangesAsync();

            _logger.LogInformation("RetryJob: Job {JobId} reset to Pending status", id);

            return Ok(new
            {
                success = true,
                message = "Job reset to Pending status and will be retried"
            });
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "RetryJob: Error retrying job {JobId}", id);
            return BadRequest("An error occurred processing your request");
        }
    }

    [HttpDelete]
    [Route("{id:guid}")]
    public async Task<IActionResult> CancelJob(Guid id)
    {
        try
        {
            var job = await _db.QueueJobs.FirstOrDefaultAsync(j => j.Id == id);

            if (job == null)
            {
                _logger.LogWarning("CancelJob: Job {JobId} not found", id);
                return NotFound("Job not found");
            }

            if (job.Status == JobStatus.Running)
            {
                _logger.LogWarning("CancelJob: Cannot cancel running job {JobId}", id);
                return BadRequest("Cannot cancel a job that is currently running");
            }

            if (job.Status == JobStatus.Completed)
            {
                _logger.LogWarning("CancelJob: Cannot cancel completed job {JobId}", id);
                return BadRequest("Cannot cancel a completed job");
            }

            job.Status = JobStatus.Failed;
            job.LastError = "Cancelled by user";
            job.ModifiedAt = DateTime.UtcNow;
            job.ModifiedById = DiunaBI.Domain.Entities.User.AutoImportUserId;

            await _db.SaveChangesAsync();

            _logger.LogInformation("CancelJob: Job {JobId} cancelled", id);

            return Ok(new
            {
                success = true,
                message = "Job cancelled successfully"
            });
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "CancelJob: Error cancelling job {JobId}", id);
            return BadRequest("An error occurred processing your request");
        }
    }

    [HttpGet]
    [Route("stats")]
    public async Task<IActionResult> GetStats()
    {
        try
        {
            var stats = new
            {
                pending = await _db.QueueJobs.CountAsync(j => j.Status == JobStatus.Pending),
                running = await _db.QueueJobs.CountAsync(j => j.Status == JobStatus.Running),
                completed = await _db.QueueJobs.CountAsync(j => j.Status == JobStatus.Completed),
                failed = await _db.QueueJobs.CountAsync(j => j.Status == JobStatus.Failed),
                retrying = await _db.QueueJobs.CountAsync(j => j.Status == JobStatus.Retrying),
                total = await _db.QueueJobs.CountAsync()
            };

            _logger.LogDebug("GetStats: Retrieved job statistics");

            return Ok(stats);
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "GetStats: Error retrieving job statistics");
            return BadRequest("An error occurred processing your request");
        }
    }

    [HttpPost]
    [Route("create-for-layer/{layerId:guid}")]
    public async Task<IActionResult> CreateJobForLayer(Guid layerId)
    {
        try
        {
            var layer = await _db.Layers
                .Include(x => x.Records)
                .FirstOrDefaultAsync(l => l.Id == layerId);

            if (layer == null)
            {
                _logger.LogWarning("CreateJobForLayer: Layer {LayerId} not found", layerId);
                return NotFound($"Layer {layerId} not found");
            }

            if (layer.Type != LayerType.Administration)
            {
                _logger.LogWarning("CreateJobForLayer: Layer {LayerId} is not an Administration layer", layerId);
                return BadRequest("Only Administration layers can be run as jobs");
            }

            // Get the Type record to determine if it's ImportWorker or ProcessWorker
            var typeRecord = layer.Records?.FirstOrDefault(x => x.Code == "Type");
            if (typeRecord?.Desc1 != "ImportWorker" && typeRecord?.Desc1 != "ProcessWorker")
            {
                _logger.LogWarning("CreateJobForLayer: Layer {LayerId} is not a valid worker type", layerId);
                return BadRequest("Layer must be an ImportWorker or ProcessWorker");
            }

            // Check if enabled
            var isEnabledRecord = layer.Records?.FirstOrDefault(x => x.Code == "IsEnabled");
            if (isEnabledRecord?.Desc1 != "True")
            {
                _logger.LogWarning("CreateJobForLayer: Layer {LayerId} is not enabled", layerId);
                return BadRequest("Layer is not enabled");
            }

            // Get plugin name
            var pluginRecord = layer.Records?.FirstOrDefault(x => x.Code == "Plugin");
            if (string.IsNullOrEmpty(pluginRecord?.Desc1))
            {
                _logger.LogWarning("CreateJobForLayer: Layer {LayerId} has no Plugin configured", layerId);
                return BadRequest("Layer has no Plugin configured");
            }

            // Get priority and max retries
            var priorityRecord = layer.Records?.FirstOrDefault(x => x.Code == "Priority");
            var maxRetriesRecord = layer.Records?.FirstOrDefault(x => x.Code == "MaxRetries");

            var priority = int.TryParse(priorityRecord?.Desc1, out var p) ? p : 0;
            var maxRetries = int.TryParse(maxRetriesRecord?.Desc1, out var m) ? m : 3;

            var jobType = typeRecord.Desc1 == "ImportWorker" ? JobType.Import : JobType.Process;

            // Check if there's already a pending/running job for this layer
            var existingJob = await _db.QueueJobs
                .Where(j => j.LayerId == layer.Id &&
                            (j.Status == JobStatus.Pending || j.Status == JobStatus.Running))
                .FirstOrDefaultAsync();

            if (existingJob != null)
            {
                _logger.LogInformation("CreateJobForLayer: Job already exists for layer {LayerId}, returning existing job", layerId);
                return Ok(new
                {
                    success = true,
                    jobId = existingJob.Id,
                    message = "Job already exists for this layer",
                    existing = true
                });
            }

            // Create the job
            var job = new QueueJob
            {
                Id = Guid.NewGuid(),
                LayerId = layer.Id,
                LayerName = layer.Name ?? "Unknown",
                PluginName = pluginRecord.Desc1,
                JobType = jobType,
                Priority = priority,
                MaxRetries = maxRetries,
                Status = JobStatus.Pending,
                CreatedAt = DateTime.UtcNow,
                ModifiedAt = DateTime.UtcNow,
                CreatedById = DiunaBI.Domain.Entities.User.AutoImportUserId,
                ModifiedById = DiunaBI.Domain.Entities.User.AutoImportUserId
            };

            _db.QueueJobs.Add(job);
            await _db.SaveChangesAsync();

            _logger.LogInformation("CreateJobForLayer: Created job {JobId} for layer {LayerName} ({LayerId})",
                job.Id, layer.Name, layerId);

            return Ok(new
            {
                success = true,
                jobId = job.Id,
                message = "Job created successfully",
                existing = false
            });
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "CreateJobForLayer: Error creating job for layer {LayerId}", layerId);
            return BadRequest("An error occurred processing your request");
        }
    }
}
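JobsController, DataInboxController, and LayersController all clamp paging input the same way and derive `PagedResult.Page` as `(start / limit) + 1` with integer division. A small Python sketch of that arithmetic (illustrative only, names are not from the repository):

```python
def page_info(start: int, limit: int, total_count: int) -> dict:
    # Same validation as the controllers: 1 <= limit <= 1000, start >= 0
    if limit <= 0 or limit > 1000:
        raise ValueError("Limit must be between 1 and 1000")
    if start < 0:
        raise ValueError("Start must be non-negative")
    # Floor division mirrors C#'s integer (start / limit) + 1
    return {"page": start // limit + 1, "page_size": limit, "total_count": total_count}

print(page_info(0, 50, 123))    # {'page': 1, 'page_size': 50, 'total_count': 123}
print(page_info(100, 50, 123))  # {'page': 3, 'page_size': 50, 'total_count': 123}
```

Note that a `start` which is not a multiple of `limit` (say 75 with limit 50) still maps to a whole page number (2), since the division floors.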
@@ -48,6 +48,16 @@ public class LayersController : Controller
     {
         try
         {
+            // Validate pagination parameters
+            if (limit <= 0 || limit > 1000)
+            {
+                return BadRequest("Limit must be between 1 and 1000");
+            }
+            if (start < 0)
+            {
+                return BadRequest("Start must be non-negative");
+            }
+
             var query = _db.Layers.Where(x => !x.IsDeleted);

             if (name != null)

@@ -99,7 +109,7 @@ public class LayersController : Controller
         catch (Exception e)
         {
             _logger.LogError(e, "GetAll: Error retrieving layers");
-            return BadRequest(e.ToString());
+            return BadRequest("An error occurred processing your request");
         }
     }
     [HttpGet]

@@ -119,7 +129,7 @@ public class LayersController : Controller
         catch (Exception e)
         {
             _logger.LogError(e, "Get: Error retrieving layer {LayerId}", id);
-            return BadRequest(e.ToString());
+            return BadRequest("An error occurred processing your request");
         }
     }
     [HttpGet]

@@ -396,7 +406,7 @@ public class LayersController : Controller
         catch (Exception e)
         {
             _logger.LogError(e, "AutoImport: Process error");
-            return BadRequest(e.ToString());
+            return BadRequest("An error occurred processing your request");
         }
     }

@@ -808,7 +818,7 @@ public class LayersController : Controller
         catch (Exception e)
         {
             _logger.LogError(e, "CreateRecord: Error creating record in layer {LayerId}", layerId);
-            return BadRequest(e.ToString());
+            return BadRequest("An error occurred processing your request");
         }
     }

@@ -889,7 +899,7 @@ public class LayersController : Controller
         catch (Exception e)
         {
             _logger.LogError(e, "UpdateRecord: Error updating record {RecordId} in layer {LayerId}", recordId, layerId);
-            return BadRequest(e.ToString());
+            return BadRequest("An error occurred processing your request");
         }
     }

@@ -944,7 +954,7 @@ public class LayersController : Controller
         catch (Exception e)
         {
             _logger.LogError(e, "DeleteRecord: Error deleting record {RecordId} from layer {LayerId}", recordId, layerId);
-            return BadRequest(e.ToString());
+            return BadRequest("An error occurred processing your request");
         }
     }

@@ -983,7 +993,7 @@ public class LayersController : Controller
         catch (Exception e)
        {
             _logger.LogError(e, "GetRecordHistory: Error retrieving history for record {RecordId}", recordId);
-            return BadRequest(e.ToString());
+            return BadRequest("An error occurred processing your request");
         }
     }

@@ -1033,7 +1043,7 @@ public class LayersController : Controller
         catch (Exception e)
         {
             _logger.LogError(e, "GetDeletedRecords: Error retrieving deleted records for layer {LayerId}", layerId);
-            return BadRequest(e.ToString());
+            return BadRequest("An error occurred processing your request");
         }
     }

@@ -20,7 +20,6 @@
     <PackageReference Include="Serilog.AspNetCore" Version="9.0.0" />
     <PackageReference Include="Serilog.Enrichers.Environment" Version="3.0.1" />
     <PackageReference Include="Serilog.Sinks.File" Version="7.0.0" />
     <PackageReference Include="Serilog.Sinks.Seq" Version="9.0.0" />
     <PackageReference Include="System.Configuration.ConfigurationManager" Version="10.0.0" />
   </ItemGroup>

@@ -37,11 +36,13 @@
     </Content>
   </ItemGroup>

-  <Target Name="CopyPlugins" AfterTargets="Build">
+  <Target Name="CopyPlugins" AfterTargets="Build" Condition="'$(SkipPluginCopy)' != 'true'">
     <MSBuild Projects="../DiunaBI.Plugins.Morska/DiunaBI.Plugins.Morska.csproj" Properties="Configuration=$(Configuration);TargetFramework=$(TargetFramework)" />
+    <MSBuild Projects="../DiunaBI.Plugins.PedrolloPL/DiunaBI.Plugins.PedrolloPL.csproj" Properties="Configuration=$(Configuration);TargetFramework=$(TargetFramework)" />
+
     <ItemGroup>
       <PluginFiles Include="../DiunaBI.Plugins.Morska/bin/$(Configuration)/$(TargetFramework)/DiunaBI.Plugins.Morska.dll" />
+      <PluginFiles Include="../DiunaBI.Plugins.PedrolloPL/bin/$(Configuration)/$(TargetFramework)/DiunaBI.Plugins.PedrolloPL.dll" />
     </ItemGroup>
     <MakeDir Directories="$(OutputPath)Plugins" />
     <Copy SourceFiles="@(PluginFiles)" DestinationFolder="$(OutputPath)Plugins" />

@@ -1,6 +1,7 @@

 # Stage 1: Build
 FROM mcr.microsoft.com/dotnet/sdk:10.0 AS build
+ARG PLUGIN_PROJECT=DiunaBI.Plugins.Morska
 WORKDIR /

 # Copy solution and all project files for restore

@@ -9,7 +10,7 @@ COPY DiunaBI.API/DiunaBI.API.csproj DiunaBI.API/
 COPY DiunaBI.Domain/DiunaBI.Domain.csproj DiunaBI.Domain/
 COPY DiunaBI.Application/DiunaBI.Application.csproj DiunaBI.Application/
 COPY DiunaBI.Infrastructure/DiunaBI.Infrastructure.csproj DiunaBI.Infrastructure/
-COPY DiunaBI.Plugins.Morska/DiunaBI.Plugins.Morska.csproj DiunaBI.Plugins.Morska/
+COPY ${PLUGIN_PROJECT}/${PLUGIN_PROJECT}.csproj ${PLUGIN_PROJECT}/

 # Restore dependencies
 RUN dotnet restore DiunaBI.API/DiunaBI.API.csproj

@@ -18,16 +19,16 @@ RUN dotnet restore DiunaBI.API/DiunaBI.API.csproj
 COPY . .

 # Build plugin first
-WORKDIR /DiunaBI.Plugins.Morska
+WORKDIR /${PLUGIN_PROJECT}
 RUN dotnet build -c Release

-# Build and publish API
+# Build and publish API (skip automatic plugin copy since we handle it manually)
 WORKDIR /DiunaBI.API
-RUN dotnet publish -c Release -o /app/publish --no-restore
+RUN dotnet publish -c Release -o /app/publish --no-restore -p:SkipPluginCopy=true

 # Copy plugin DLL to publish output
 RUN mkdir -p /app/publish/Plugins && \
-    cp /DiunaBI.Plugins.Morska/bin/Release/net10.0/DiunaBI.Plugins.Morska.dll /app/publish/Plugins/
+    cp /${PLUGIN_PROJECT}/bin/Release/net10.0/${PLUGIN_PROJECT}.dll /app/publish/Plugins/

 # Stage 2: Runtime
 FROM mcr.microsoft.com/dotnet/aspnet:10.0 AS runtime

15 DiunaBI.API/Hubs/EntityChangeHub.cs Normal file
@@ -0,0 +1,15 @@
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.SignalR;

namespace DiunaBI.API.Hubs;

/// <summary>
/// SignalR hub for broadcasting entity change notifications to authenticated clients.
/// Clients can only listen - broadcasting is done server-side by EntityChangeInterceptor.
/// </summary>
[Authorize]
public class EntityChangeHub : Hub
{
    // No public methods - clients can only listen for "EntityChanged" events
    // Broadcasting is handled server-side by EntityChangeInterceptor via IHubContext
}
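The hub above intentionally exposes no methods, so a browser client only registers a listener for the server-pushed `EntityChanged` event. As an illustration of how such a client might route those notifications, here is a minimal sketch; the payload shape `{ module, id, operation }` is inferred from the interceptor's anonymous object, while `EntityChangeRouter` and its method names are hypothetical, not part of this repository.

```typescript
// Payload shape broadcast by EntityChangeInterceptor (inferred from the diff:
// new { module, id, operation }).
interface EntityChange {
  module: string;    // table name, e.g. "Records"
  id: string;        // primary key rendered as a string
  operation: "created" | "updated" | "deleted" | "unknown";
}

type ChangeHandler = (change: EntityChange) => void;

// Hypothetical dispatcher: routes each change to per-module callbacks so UI
// stores can refresh only the data that actually changed.
class EntityChangeRouter {
  private handlers = new Map<string, ChangeHandler[]>();

  on(module: string, handler: ChangeHandler): void {
    const list = this.handlers.get(module) ?? [];
    list.push(handler);
    this.handlers.set(module, list);
  }

  // Wire as: connection.on("EntityChanged", c => router.dispatch(c))
  dispatch(change: EntityChange): number {
    const list = this.handlers.get(change.module) ?? [];
    for (const h of list) h(change);
    return list.length; // number of handlers notified
  }
}
```

With a SignalR connection (e.g. via `@microsoft/signalr`), the router would be attached once after the connection starts; modules with no registered handler are simply ignored.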
@@ -1,11 +1,15 @@
using Microsoft.AspNetCore.Authentication.JwtBearer;
using Microsoft.AspNetCore.RateLimiting;
using Microsoft.EntityFrameworkCore;
using Microsoft.IdentityModel.Tokens;
using System.IdentityModel.Tokens.Jwt;
using System.Reflection;
using System.Text;
using System.Threading.RateLimiting;
using DiunaBI.API.Hubs;
using DiunaBI.API.Services;
using DiunaBI.Infrastructure.Data;
using DiunaBI.Infrastructure.Interceptors;
using DiunaBI.Infrastructure.Services;
using Google.Apis.Sheets.v4;
using Serilog;
@@ -29,10 +33,22 @@ if (builder.Environment.IsProduction())

var connectionString = builder.Configuration.GetConnectionString("SQLDatabase");

builder.Services.AddDbContext<AppDbContext>(x =>
// Register EntityChangeInterceptor
builder.Services.AddSingleton<EntityChangeInterceptor>();

builder.Services.AddDbContext<AppDbContext>((serviceProvider, options) =>
{
    x.UseSqlServer(connectionString, sqlOptions => sqlOptions.MigrationsAssembly("DiunaBI.Infrastructure"));
    x.EnableSensitiveDataLogging();
    options.UseSqlServer(connectionString, sqlOptions => sqlOptions.MigrationsAssembly("DiunaBI.Infrastructure"));

    // Only log SQL parameters in development (may contain sensitive data)
    if (builder.Environment.IsDevelopment())
    {
        options.EnableSensitiveDataLogging();
    }

    // Add EntityChangeInterceptor
    var interceptor = serviceProvider.GetRequiredService<EntityChangeInterceptor>();
    options.AddInterceptors(interceptor);
});

builder.Services.AddCors(options =>
@@ -58,6 +74,44 @@ builder.Services.AddCors(options =>

builder.Services.AddControllers();

// Rate Limiting
builder.Services.AddRateLimiter(options =>
{
    // Global API rate limit
    options.AddFixedWindowLimiter("api", config =>
    {
        config.PermitLimit = 100;
        config.Window = TimeSpan.FromMinutes(1);
        config.QueueProcessingOrder = System.Threading.RateLimiting.QueueProcessingOrder.OldestFirst;
        config.QueueLimit = 0; // No queueing
    });

    // Strict limit for authentication endpoint
    options.AddFixedWindowLimiter("auth", config =>
    {
        config.PermitLimit = 10;
        config.Window = TimeSpan.FromMinutes(1);
        config.QueueProcessingOrder = System.Threading.RateLimiting.QueueProcessingOrder.OldestFirst;
        config.QueueLimit = 0;
    });

    // Rejection response
    options.OnRejected = async (context, token) =>
    {
        context.HttpContext.Response.StatusCode = 429; // Too Many Requests
        await context.HttpContext.Response.WriteAsJsonAsync(new
        {
            error = "Too many requests. Please try again later.",
            retryAfter = context.Lease.TryGetMetadata(MetadataName.RetryAfter, out var retryAfter)
                ? (double?)retryAfter.TotalSeconds
                : (double?)null
        }, cancellationToken: token);
    };
});

// SignalR
builder.Services.AddSignalR();

builder.Services.AddAuthentication(options =>
{
    options.DefaultAuthenticateScheme = JwtBearerDefaults.AuthenticationScheme;
@@ -67,10 +121,12 @@ builder.Services.AddAuthentication(options =>
{
    options.TokenValidationParameters = new TokenValidationParameters
    {
        ValidateIssuer = false,
        ValidateAudience = false,
        ValidateIssuer = true,
        ValidateAudience = true,
        ValidateLifetime = true,
        ValidateIssuerSigningKey = true,
        ValidIssuer = builder.Configuration["JwtSettings:Issuer"],
        ValidAudience = builder.Configuration["JwtSettings:Audience"],
        IssuerSigningKey = new SymmetricSecurityKey(Encoding.UTF8.GetBytes(builder.Configuration["JwtSettings:SecurityKey"]!))
    };
});
@@ -97,6 +153,10 @@ builder.Services.AddSingleton<SpreadsheetsResource.ValuesResource>(provider =>

builder.Services.AddSingleton<PluginManager>();

// Job Queue Services
builder.Services.AddScoped<JobSchedulerService>();
builder.Services.AddHostedService<JobWorkerService>();

var app = builder.Build();

// Auto-apply migrations on startup
@@ -179,6 +239,18 @@ pluginManager.LoadPluginsFromDirectory(pluginsPath);

app.UseCors("CORSPolicy");

// Security Headers
app.Use(async (context, next) =>
{
    context.Response.Headers.Append("X-Content-Type-Options", "nosniff");
    context.Response.Headers.Append("X-Frame-Options", "DENY");
    context.Response.Headers.Append("X-XSS-Protection", "1; mode=block");
    context.Response.Headers.Append("Referrer-Policy", "strict-origin-when-cross-origin");
    await next();
});

app.UseRateLimiter();

app.UseAuthentication();
app.UseAuthorization();

@@ -230,16 +302,15 @@ app.Use(async (context, next) =>
        logger.LogError(ex, "❌ Failed to extract UserId from JWT token");
    }
}
else
{
    logger.LogWarning("❌ No valid Bearer token found");
}

await next(context);
});

app.MapControllers();

// SignalR Hub - Requires JWT authentication
app.MapHub<EntityChangeHub>("/hubs/entitychanges").RequireAuthorization();

app.MapGet("/health", () => Results.Ok(new { status = "OK", timestamp = DateTime.UtcNow }))
    .AllowAnonymous();
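The `auth` policy above admits at most 10 requests per one-minute window, with no queueing, and rejects the rest with HTTP 429. The semantics of that fixed-window algorithm can be sketched in a few lines; this is an illustration of the policy's behavior under stated assumptions, not the ASP.NET Core middleware's actual implementation.

```typescript
// Illustrative fixed-window limiter mirroring the policy's settings:
// PermitLimit requests per Window, QueueLimit = 0 (excess rejected immediately).
class FixedWindowLimiter {
  private windowStart = 0; // sketch anchors the window at t=0, not at first request
  private count = 0;

  constructor(private permitLimit: number, private windowMs: number) {}

  // Returns true if the request is admitted, false if it should get HTTP 429.
  tryAcquire(nowMs: number): boolean {
    // Start a fresh window once the current one has elapsed.
    if (nowMs - this.windowStart >= this.windowMs) {
      this.windowStart = nowMs;
      this.count = 0;
    }
    if (this.count < this.permitLimit) {
      this.count++;
      return true;
    }
    return false; // no queueing: reject at once
  }
}
```

A burst of 12 login attempts inside one window would therefore see 10 admitted and 2 rejected, matching the `PermitLimit = 10` configuration for the auth endpoint.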
@@ -36,7 +36,7 @@ public class GoogleAuthService(AppDbContext context, IConfiguration configuratio
        if (user == null)
        {
            _logger.LogError("User not found in DiunaBI database: {Email}", payload.Email);
            return (false, null, "User not found in DiunaBI database");
            return (false, null, "Authentication failed");
        }

        user.UserName = payload.Name;
@@ -52,7 +52,7 @@ public class JwtTokenService(IConfiguration configuration, ILogger<JwtTokenServi
        try
        {
            var jwtSettings = _configuration.GetSection("JwtSettings");
            var secretKey = jwtSettings["SecretKey"];
            var secretKey = jwtSettings["SecurityKey"];
            var issuer = jwtSettings["Issuer"];
            var audience = jwtSettings["Audience"];
@@ -12,6 +12,7 @@ public class QueueJob
    public JobType JobType { get; set; }
    public int Priority { get; set; } = 0; // 0 = highest priority
    public DateTime CreatedAt { get; set; } = DateTime.UtcNow;
    public DateTime ModifiedAt { get; set; } = DateTime.UtcNow;
    public int RetryCount { get; set; } = 0;
    public int MaxRetries { get; set; } = 5;
    public JobStatus Status { get; set; } = JobStatus.Pending;
@@ -19,9 +20,7 @@ public class QueueJob
    public DateTime? LastAttemptAt { get; set; }
    public DateTime? CompletedAt { get; set; }
    public Guid CreatedById { get; set; }
    public DateTime CreatedAtUtc { get; set; } = DateTime.UtcNow;
    public Guid ModifiedById { get; set; }
    public DateTime ModifiedAtUtc { get; set; } = DateTime.UtcNow;
}

public enum JobType
@@ -5,6 +5,11 @@ namespace DiunaBI.Domain.Entities;

public class User
{
    /// <summary>
    /// System user ID for automated operations (imports, scheduled jobs, etc.)
    /// </summary>
    public static readonly Guid AutoImportUserId = Guid.Parse("f392209e-123e-4651-a5a4-0b1d6cf9ff9d");

    #region Properties
    public Guid Id { get; init; }
    public string? Email { get; init; }
@@ -136,9 +136,8 @@ public class AppDbContext(DbContextOptions<AppDbContext> options) : DbContext(op
        modelBuilder.Entity<QueueJob>().Property(x => x.LastAttemptAt);
        modelBuilder.Entity<QueueJob>().Property(x => x.CompletedAt);
        modelBuilder.Entity<QueueJob>().Property(x => x.CreatedById).IsRequired();
        modelBuilder.Entity<QueueJob>().Property(x => x.CreatedAtUtc).IsRequired();
        modelBuilder.Entity<QueueJob>().Property(x => x.ModifiedById).IsRequired();
        modelBuilder.Entity<QueueJob>().Property(x => x.ModifiedAtUtc).IsRequired();
        modelBuilder.Entity<QueueJob>().Property(x => x.ModifiedAt).IsRequired();

        // Configure automatic timestamps for entities with CreatedAt/ModifiedAt
        ConfigureTimestamps(modelBuilder);
@@ -22,7 +22,10 @@
    <PackageReference Include="Microsoft.EntityFrameworkCore.SqlServer" Version="10.0.0" />
    <PackageReference Include="Google.Apis.Sheets.v4" Version="1.68.0.3525" />
    <PackageReference Include="Google.Apis.Drive.v3" Version="1.68.0.3490" />
    <PackageReference Include="Microsoft.Extensions.Configuration.Json" Version="10.0.0" />
  </ItemGroup>

  <ItemGroup>
    <FrameworkReference Include="Microsoft.AspNetCore.App" />
  </ItemGroup>

</Project>
201 DiunaBI.Infrastructure/Interceptors/EntityChangeInterceptor.cs Normal file
@@ -0,0 +1,201 @@
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Diagnostics;
using Microsoft.AspNetCore.SignalR;
using Microsoft.Extensions.Logging;

namespace DiunaBI.Infrastructure.Interceptors;

public class EntityChangeInterceptor : SaveChangesInterceptor
{
    private readonly object? _hubContext;
    private readonly ILogger<EntityChangeInterceptor>? _logger;
    private readonly List<(string Module, string Id, string Operation)> _pendingChanges = new();

    public EntityChangeInterceptor(IServiceProvider serviceProvider)
    {
        _logger = serviceProvider.GetService(typeof(ILogger<EntityChangeInterceptor>)) as ILogger<EntityChangeInterceptor>;

        // Try to get hub context - it may not be registered in some scenarios (e.g., migrations)
        try
        {
            var hubType = Type.GetType("DiunaBI.API.Hubs.EntityChangeHub, DiunaBI.API");
            if (hubType != null)
            {
                var hubContextType = typeof(IHubContext<>).MakeGenericType(hubType);
                _hubContext = serviceProvider.GetService(hubContextType);

                if (_hubContext != null)
                {
                    _logger?.LogInformation("✅ EntityChangeInterceptor: Hub context initialized");
                    Console.WriteLine("✅ EntityChangeInterceptor: Hub context initialized");
                }
                else
                {
                    _logger?.LogWarning("⚠️ EntityChangeInterceptor: Hub context is null");
                    Console.WriteLine("⚠️ EntityChangeInterceptor: Hub context is null");
                }
            }
            else
            {
                _logger?.LogWarning("⚠️ EntityChangeInterceptor: Hub type not found");
                Console.WriteLine("⚠️ EntityChangeInterceptor: Hub type not found");
            }
        }
        catch (Exception ex)
        {
            _logger?.LogError(ex, "❌ EntityChangeInterceptor: Failed to initialize hub context");
            Console.WriteLine($"❌ EntityChangeInterceptor: Failed to initialize hub context: {ex.Message}");
            _hubContext = null;
        }
    }

    public override ValueTask<InterceptionResult<int>> SavingChangesAsync(
        DbContextEventData eventData,
        InterceptionResult<int> result,
        CancellationToken cancellationToken = default)
    {
        _pendingChanges.Clear();

        Console.WriteLine($"🔍 EntityChangeInterceptor.SavingChangesAsync called. HubContext null? {_hubContext == null}, Context null? {eventData.Context == null}");

        if (_hubContext != null && eventData.Context != null)
        {
            // Capture changes BEFORE save
            var entries = eventData.Context.ChangeTracker.Entries().ToList();
            Console.WriteLine($"🔍 Found {entries.Count} total entries in ChangeTracker");

            foreach (var entry in entries)
            {
                Console.WriteLine($"🔍 Entry: {entry.Metadata.ClrType.Name}, State: {entry.State}");

                if (entry.State == EntityState.Added ||
                    entry.State == EntityState.Modified ||
                    entry.State == EntityState.Deleted)
                {
                    var module = entry.Metadata.GetTableName() ?? entry.Metadata.ClrType.Name;
                    var id = GetEntityId(entry);
                    var operation = entry.State switch
                    {
                        EntityState.Added => "created",
                        EntityState.Modified => "updated",
                        EntityState.Deleted => "deleted",
                        _ => "unknown"
                    };

                    Console.WriteLine($"🔍 Detected change: {module} {id} {operation}");

                    if (id != null)
                    {
                        _pendingChanges.Add((module, id, operation));
                        Console.WriteLine($"✅ Added to pending changes: {module} {id} {operation}");
                    }
                    else
                    {
                        Console.WriteLine($"⚠️ Skipped (id is null): {module} {operation}");
                    }
                }
            }

            Console.WriteLine($"🔍 Total pending changes: {_pendingChanges.Count}");
        }

        return base.SavingChangesAsync(eventData, result, cancellationToken);
    }

    public override async ValueTask<int> SavedChangesAsync(
        SaveChangesCompletedEventData eventData,
        int result,
        CancellationToken cancellationToken = default)
    {
        // Broadcast changes AFTER successful save
        if (_hubContext != null && result > 0 && _pendingChanges.Any())
        {
            _logger?.LogInformation("📤 Broadcasting {Count} entity changes via SignalR", _pendingChanges.Count);
            Console.WriteLine($"📤 Broadcasting {_pendingChanges.Count} entity changes via SignalR");

            foreach (var (module, id, operation) in _pendingChanges)
            {
                try
                {
                    Console.WriteLine($"📤 Broadcasting: {module} {id} {operation}");

                    // Use reflection to call hub methods since we can't reference the API project
                    var clientsProperty = _hubContext.GetType().GetProperty("Clients");
                    Console.WriteLine($"  🔍 Clients property: {clientsProperty != null}");

                    if (clientsProperty != null)
                    {
                        var clients = clientsProperty.GetValue(_hubContext);
                        Console.WriteLine($"  🔍 Clients value: {clients != null}, Type: {clients?.GetType().Name}");

                        if (clients != null)
                        {
                            var allProperty = clients.GetType().GetProperty("All");
                            Console.WriteLine($"  🔍 All property: {allProperty != null}");

                            if (allProperty != null)
                            {
                                var allClients = allProperty.GetValue(clients);
                                Console.WriteLine($"  🔍 AllClients value: {allClients != null}, Type: {allClients?.GetType().Name}");

                                if (allClients != null)
                                {
                                    // SendAsync is an extension method, so we need to find it differently
                                    // Look for the IClientProxy interface which has SendCoreAsync
                                    var sendCoreAsyncMethod = allClients.GetType().GetMethod("SendCoreAsync");
                                    Console.WriteLine($"  🔍 SendCoreAsync method found: {sendCoreAsyncMethod != null}");

                                    if (sendCoreAsyncMethod != null)
                                    {
                                        // SendCoreAsync takes (string method, object?[] args, CancellationToken cancellationToken)
                                        var task = sendCoreAsyncMethod.Invoke(allClients, new object[]
                                        {
                                            "EntityChanged",
                                            new object[] { new { module, id, operation } },
                                            cancellationToken
                                        }) as Task;

                                        Console.WriteLine($"  🔍 Task created: {task != null}");

                                        if (task != null)
                                        {
                                            await task;
                                            Console.WriteLine($"✅ Broadcast successful: {module} {id} {operation}");
                                        }
                                        else
                                        {
                                            Console.WriteLine($"❌ Task is null after invoke");
                                        }
                                    }
                                    else
                                    {
                                        Console.WriteLine($"❌ SendCoreAsync method not found");
                                    }
                                }
                            }
                        }
                    }
                }
                catch (Exception ex)
                {
                    _logger?.LogError(ex, "❌ Failed to broadcast entity change");
                    Console.WriteLine($"❌ Failed to broadcast: {ex.Message}");
                    Console.WriteLine($"❌ Stack trace: {ex.StackTrace}");
                }
            }
        }

        _pendingChanges.Clear();
        return await base.SavedChangesAsync(eventData, result, cancellationToken);
    }

    private static string? GetEntityId(Microsoft.EntityFrameworkCore.ChangeTracking.EntityEntry entry)
    {
        var keyProperty = entry.Metadata.FindPrimaryKey()?.Properties.FirstOrDefault();
        if (keyProperty == null)
            return null;

        var value = entry.Property(keyProperty.Name).CurrentValue;
        return value?.ToString();
    }
}
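The interceptor above works in two phases: it snapshots `(module, id, operation)` tuples while entity states are still known (before the save), then broadcasts them only after the save reports affected rows, so failed saves publish nothing. That pattern can be sketched language-agnostically as follows; the `ChangeBroadcaster` class and its types are illustrative stand-ins, not the EF Core API.

```typescript
// Sketch of the capture-before-save / broadcast-after-save pattern.
type Operation = "created" | "updated" | "deleted";
interface TrackedEntry { table: string; id: string | null; state: "Added" | "Modified" | "Deleted" | "Unchanged"; }
interface Change { module: string; id: string; operation: Operation; }

const stateToOperation: Record<string, Operation | undefined> = {
  Added: "created",
  Modified: "updated",
  Deleted: "deleted",
};

class ChangeBroadcaster {
  private pending: Change[] = [];

  // Mirrors SavingChangesAsync: snapshot changes while entity states are known.
  beforeSave(entries: TrackedEntry[]): void {
    this.pending = [];
    for (const e of entries) {
      const operation = stateToOperation[e.state];
      if (operation && e.id !== null) {
        this.pending.push({ module: e.table, id: e.id, operation });
      }
    }
  }

  // Mirrors SavedChangesAsync: publish only if rows were actually written,
  // then clear the pending list either way.
  afterSave(rowsAffected: number, publish: (c: Change) => void): number {
    let sent = 0;
    if (rowsAffected > 0) {
      for (const c of this.pending) { publish(c); sent++; }
    }
    this.pending = [];
    return sent;
  }
}
```

Entries that are unchanged, or whose primary key cannot be resolved, are skipped during capture, which matches the interceptor's `id != null` guard.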
489 DiunaBI.Infrastructure/Migrations/20251208205202_RemoveQueueJobDuplicateUTCFields.Designer.cs generated Normal file
@@ -0,0 +1,489 @@
// <auto-generated />
using System;
using DiunaBI.Infrastructure.Data;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Metadata;
using Microsoft.EntityFrameworkCore.Migrations;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;

#nullable disable

namespace DiunaBI.Infrastructure.Migrations
{
    [DbContext(typeof(AppDbContext))]
    [Migration("20251208205202_RemoveQueueJobDuplicateUTCFields")]
    partial class RemoveQueueJobDuplicateUTCFields
    {
        /// <inheritdoc />
        protected override void BuildTargetModel(ModelBuilder modelBuilder)
        {
#pragma warning disable 612, 618
            modelBuilder
                .HasAnnotation("ProductVersion", "10.0.0")
                .HasAnnotation("Relational:MaxIdentifierLength", 128);

            SqlServerModelBuilderExtensions.UseIdentityColumns(modelBuilder);

            modelBuilder.Entity("DiunaBI.Domain.Entities.DataInbox", b =>
                {
                    b.Property<Guid>("Id")
                        .ValueGeneratedOnAdd()
                        .HasColumnType("uniqueidentifier");

                    b.Property<DateTime>("CreatedAt")
                        .ValueGeneratedOnAdd()
                        .HasColumnType("datetime2")
                        .HasDefaultValueSql("GETUTCDATE()");

                    b.Property<string>("Data")
                        .IsRequired()
                        .HasColumnType("nvarchar(max)");

                    b.Property<string>("Name")
                        .IsRequired()
                        .HasMaxLength(50)
                        .HasColumnType("nvarchar(50)");

                    b.Property<string>("Source")
                        .IsRequired()
                        .HasMaxLength(50)
                        .HasColumnType("nvarchar(50)");

                    b.HasKey("Id");

                    b.ToTable("DataInbox");
                });

            modelBuilder.Entity("DiunaBI.Domain.Entities.Layer", b =>
                {
                    b.Property<Guid>("Id")
                        .ValueGeneratedOnAdd()
                        .HasColumnType("uniqueidentifier");

                    b.Property<DateTime>("CreatedAt")
                        .ValueGeneratedOnAdd()
                        .HasColumnType("datetime2")
                        .HasDefaultValueSql("GETUTCDATE()");

                    b.Property<Guid>("CreatedById")
                        .HasColumnType("uniqueidentifier");

                    b.Property<bool>("IsCancelled")
                        .ValueGeneratedOnAdd()
                        .HasColumnType("bit")
                        .HasDefaultValue(false);

                    b.Property<bool>("IsDeleted")
                        .ValueGeneratedOnAdd()
                        .HasColumnType("bit")
                        .HasDefaultValue(false);

                    b.Property<DateTime>("ModifiedAt")
                        .ValueGeneratedOnAdd()
                        .HasColumnType("datetime2")
                        .HasDefaultValueSql("GETUTCDATE()");

                    b.Property<Guid>("ModifiedById")
                        .HasColumnType("uniqueidentifier");

                    b.Property<string>("Name")
                        .IsRequired()
                        .HasMaxLength(50)
                        .HasColumnType("nvarchar(50)");

                    b.Property<int>("Number")
                        .HasColumnType("int");

                    b.Property<Guid?>("ParentId")
                        .HasColumnType("uniqueidentifier");

                    b.Property<int>("Type")
                        .HasColumnType("int");

                    b.HasKey("Id");

                    b.HasIndex("CreatedById");

                    b.HasIndex("ModifiedById");

                    b.ToTable("Layers");
                });

            modelBuilder.Entity("DiunaBI.Domain.Entities.ProcessSource", b =>
                {
                    b.Property<Guid>("LayerId")
                        .HasColumnType("uniqueidentifier");

                    b.Property<Guid>("SourceId")
                        .HasColumnType("uniqueidentifier");

                    b.HasKey("LayerId", "SourceId");

                    b.HasIndex("SourceId");

                    b.ToTable("ProcessSources");
                });

            modelBuilder.Entity("DiunaBI.Domain.Entities.QueueJob", b =>
                {
                    b.Property<Guid>("Id")
                        .ValueGeneratedOnAdd()
                        .HasColumnType("uniqueidentifier");

                    b.Property<DateTime?>("CompletedAt")
                        .HasColumnType("datetime2");

                    b.Property<DateTime>("CreatedAt")
                        .ValueGeneratedOnAdd()
                        .HasColumnType("datetime2")
                        .HasDefaultValueSql("GETUTCDATE()");

                    b.Property<Guid>("CreatedById")
                        .HasColumnType("uniqueidentifier");

                    b.Property<int>("JobType")
                        .HasColumnType("int");

                    b.Property<DateTime?>("LastAttemptAt")
                        .HasColumnType("datetime2");

                    b.Property<string>("LastError")
                        .HasMaxLength(1000)
                        .HasColumnType("nvarchar(1000)");

                    b.Property<Guid>("LayerId")
                        .HasColumnType("uniqueidentifier");

                    b.Property<string>("LayerName")
                        .IsRequired()
                        .HasMaxLength(200)
                        .HasColumnType("nvarchar(200)");

                    b.Property<int>("MaxRetries")
                        .HasColumnType("int");

                    b.Property<DateTime>("ModifiedAt")
                        .ValueGeneratedOnAdd()
                        .HasColumnType("datetime2")
                        .HasDefaultValueSql("GETUTCDATE()");

                    b.Property<Guid>("ModifiedById")
                        .HasColumnType("uniqueidentifier");

                    b.Property<string>("PluginName")
                        .IsRequired()
                        .HasMaxLength(100)
                        .HasColumnType("nvarchar(100)");

                    b.Property<int>("Priority")
                        .HasColumnType("int");

                    b.Property<int>("RetryCount")
                        .HasColumnType("int");

                    b.Property<int>("Status")
                        .HasColumnType("int");

                    b.HasKey("Id");

                    b.ToTable("QueueJobs");
                });

            modelBuilder.Entity("DiunaBI.Domain.Entities.Record", b =>
                {
                    b.Property<Guid>("Id")
                        .ValueGeneratedOnAdd()
                        .HasColumnType("uniqueidentifier");

                    b.Property<string>("Code")
                        .IsRequired()
                        .HasMaxLength(50)
                        .HasColumnType("nvarchar(50)");

                    b.Property<DateTime>("CreatedAt")
                        .ValueGeneratedOnAdd()
                        .HasColumnType("datetime2")
                        .HasDefaultValueSql("GETUTCDATE()");

                    b.Property<Guid>("CreatedById")
                        .HasColumnType("uniqueidentifier");

                    b.Property<string>("Desc1")
                        .HasMaxLength(10000)
                        .HasColumnType("nvarchar(max)");

                    b.Property<bool>("IsDeleted")
                        .HasColumnType("bit");

                    b.Property<Guid>("LayerId")
                        .HasColumnType("uniqueidentifier");

                    b.Property<DateTime>("ModifiedAt")
                        .ValueGeneratedOnAdd()
                        .HasColumnType("datetime2")
                        .HasDefaultValueSql("GETUTCDATE()");

                    b.Property<Guid>("ModifiedById")
                        .HasColumnType("uniqueidentifier");

                    b.Property<double?>("Value1")
                        .HasColumnType("float");

                    b.Property<double?>("Value10")
                        .HasColumnType("float");

                    b.Property<double?>("Value11")
                        .HasColumnType("float");

                    b.Property<double?>("Value12")
                        .HasColumnType("float");

                    b.Property<double?>("Value13")
                        .HasColumnType("float");

                    b.Property<double?>("Value14")
                        .HasColumnType("float");

                    b.Property<double?>("Value15")
                        .HasColumnType("float");

                    b.Property<double?>("Value16")
                        .HasColumnType("float");

                    b.Property<double?>("Value17")
                        .HasColumnType("float");

                    b.Property<double?>("Value18")
                        .HasColumnType("float");

                    b.Property<double?>("Value19")
                        .HasColumnType("float");

                    b.Property<double?>("Value2")
                        .HasColumnType("float");

                    b.Property<double?>("Value20")
                        .HasColumnType("float");

                    b.Property<double?>("Value21")
                        .HasColumnType("float");

                    b.Property<double?>("Value22")
                        .HasColumnType("float");

                    b.Property<double?>("Value23")
                        .HasColumnType("float");

                    b.Property<double?>("Value24")
                        .HasColumnType("float");

                    b.Property<double?>("Value25")
                        .HasColumnType("float");

                    b.Property<double?>("Value26")
                        .HasColumnType("float");

                    b.Property<double?>("Value27")
                        .HasColumnType("float");

                    b.Property<double?>("Value28")
                        .HasColumnType("float");

                    b.Property<double?>("Value29")
                        .HasColumnType("float");

                    b.Property<double?>("Value3")
                        .HasColumnType("float");

                    b.Property<double?>("Value30")
                        .HasColumnType("float");

                    b.Property<double?>("Value31")
                        .HasColumnType("float");

                    b.Property<double?>("Value32")
                        .HasColumnType("float");

                    b.Property<double?>("Value4")
                        .HasColumnType("float");

                    b.Property<double?>("Value5")
                        .HasColumnType("float");

                    b.Property<double?>("Value6")
                        .HasColumnType("float");

                    b.Property<double?>("Value7")
                        .HasColumnType("float");

                    b.Property<double?>("Value8")
                        .HasColumnType("float");

                    b.Property<double?>("Value9")
                        .HasColumnType("float");

                    b.HasKey("Id");

                    b.HasIndex("CreatedById");

                    b.HasIndex("LayerId");

                    b.HasIndex("ModifiedById");

                    b.ToTable("Records");
                });

            modelBuilder.Entity("DiunaBI.Domain.Entities.RecordHistory", b =>
                {
                    b.Property<Guid>("Id")
                        .ValueGeneratedOnAdd()
                        .HasColumnType("uniqueidentifier");

                    b.Property<int>("ChangeType")
                        .HasColumnType("int");

                    b.Property<DateTime>("ChangedAt")
                        .HasColumnType("datetime2");

                    b.Property<Guid>("ChangedById")
                        .HasColumnType("uniqueidentifier");

                    b.Property<string>("ChangedFields")
                        .HasMaxLength(200)
                        .HasColumnType("nvarchar(200)");

                    b.Property<string>("ChangesSummary")
                        .HasMaxLength(4000)
                        .HasColumnType("nvarchar(4000)");

                    b.Property<string>("Code")
                        .IsRequired()
                        .HasMaxLength(50)
                        .HasColumnType("nvarchar(50)");

                    b.Property<string>("Desc1")
                        .HasMaxLength(10000)
                        .HasColumnType("nvarchar(max)");

                    b.Property<Guid>("LayerId")
                        .HasColumnType("uniqueidentifier");

                    b.Property<Guid>("RecordId")
                        .HasColumnType("uniqueidentifier");

                    b.HasKey("Id");

                    b.HasIndex("ChangedById");

                    b.HasIndex("LayerId", "ChangedAt");

                    b.HasIndex("RecordId", "ChangedAt");

                    b.ToTable("RecordHistory");
                });

            modelBuilder.Entity("DiunaBI.Domain.Entities.User", b =>
                {
                    b.Property<Guid>("Id")
                        .ValueGeneratedOnAdd()
                        .HasColumnType("uniqueidentifier");

                    b.Property<DateTime>("CreatedAt")
                        .ValueGeneratedOnAdd()
                        .HasColumnType("datetime2")
                        .HasDefaultValueSql("GETUTCDATE()");

                    b.Property<string>("Email")
                        .HasMaxLength(50)
                        .HasColumnType("nvarchar(50)");

                    b.Property<string>("UserName")
                        .HasMaxLength(50)
                        .HasColumnType("nvarchar(50)");

                    b.HasKey("Id");

                    b.ToTable("Users");
                });

            modelBuilder.Entity("DiunaBI.Domain.Entities.Layer", b =>
                {
                    b.HasOne("DiunaBI.Domain.Entities.User", "CreatedBy")
                        .WithMany()
                        .HasForeignKey("CreatedById")
                        .OnDelete(DeleteBehavior.Restrict)
                        .IsRequired();

                    b.HasOne("DiunaBI.Domain.Entities.User", "ModifiedBy")
                        .WithMany()
                        .HasForeignKey("ModifiedById")
                        .OnDelete(DeleteBehavior.Restrict)
                        .IsRequired();

                    b.Navigation("CreatedBy");

                    b.Navigation("ModifiedBy");
                });

            modelBuilder.Entity("DiunaBI.Domain.Entities.ProcessSource", b =>
                {
                    b.HasOne("DiunaBI.Domain.Entities.Layer", null)
                        .WithMany()
                        .HasForeignKey("LayerId")
                        .OnDelete(DeleteBehavior.Cascade)
                        .IsRequired();

                    b.HasOne("DiunaBI.Domain.Entities.Layer", "Source")
                        .WithMany()
                        .HasForeignKey("SourceId")
                        .OnDelete(DeleteBehavior.Restrict)
                        .IsRequired();

                    b.Navigation("Source");
                });

            modelBuilder.Entity("DiunaBI.Domain.Entities.Record", b =>
                {
                    b.HasOne("DiunaBI.Domain.Entities.User", "CreatedBy")
                        .WithMany()
                        .HasForeignKey("CreatedById")
                        .OnDelete(DeleteBehavior.Restrict)
                        .IsRequired();

                    b.HasOne("DiunaBI.Domain.Entities.Layer", null)
                        .WithMany("Records")
                        .HasForeignKey("LayerId")
                        .OnDelete(DeleteBehavior.Cascade)
                        .IsRequired();

                    b.HasOne("DiunaBI.Domain.Entities.User", "ModifiedBy")
                        .WithMany()
                        .HasForeignKey("ModifiedById")
                        .OnDelete(DeleteBehavior.Restrict)
                        .IsRequired();

                    b.Navigation("CreatedBy");

                    b.Navigation("ModifiedBy");
                });

            modelBuilder.Entity("DiunaBI.Domain.Entities.RecordHistory", b =>
                {
                    b.HasOne("DiunaBI.Domain.Entities.User", "ChangedBy")
                        .WithMany()
                        .HasForeignKey("ChangedById")
                        .OnDelete(DeleteBehavior.Restrict)
                        .IsRequired();

                    b.Navigation("ChangedBy");
                });

            modelBuilder.Entity("DiunaBI.Domain.Entities.Layer", b =>
                {
                    b.Navigation("Records");
                });
#pragma warning restore 612, 618
        }
    }
}
@@ -0,0 +1,52 @@
using System;
using Microsoft.EntityFrameworkCore.Migrations;

#nullable disable

namespace DiunaBI.Infrastructure.Migrations
{
    /// <inheritdoc />
    public partial class RemoveQueueJobDuplicateUTCFields : Migration
    {
        /// <inheritdoc />
        protected override void Up(MigrationBuilder migrationBuilder)
        {
            migrationBuilder.DropColumn(
                name: "CreatedAtUtc",
                table: "QueueJobs");

            migrationBuilder.DropColumn(
                name: "ModifiedAtUtc",
                table: "QueueJobs");

            migrationBuilder.AddColumn<DateTime>(
                name: "ModifiedAt",
                table: "QueueJobs",
                type: "datetime2",
                nullable: false,
                defaultValueSql: "GETUTCDATE()");
        }

        /// <inheritdoc />
        protected override void Down(MigrationBuilder migrationBuilder)
        {
            migrationBuilder.DropColumn(
                name: "ModifiedAt",
                table: "QueueJobs");

            migrationBuilder.AddColumn<DateTime>(
                name: "CreatedAtUtc",
                table: "QueueJobs",
                type: "datetime2",
                nullable: false,
                defaultValue: new DateTime(1, 1, 1, 0, 0, 0, 0, DateTimeKind.Unspecified));

            migrationBuilder.AddColumn<DateTime>(
                name: "ModifiedAtUtc",
                table: "QueueJobs",
                type: "datetime2",
                nullable: false,
                defaultValue: new DateTime(1, 1, 1, 0, 0, 0, 0, DateTimeKind.Unspecified));
        }
    }
}
@@ -136,9 +136,6 @@ namespace DiunaBI.Infrastructure.Migrations
                        .HasColumnType("datetime2")
                        .HasDefaultValueSql("GETUTCDATE()");

                    b.Property<DateTime>("CreatedAtUtc")
                        .HasColumnType("datetime2");

                    b.Property<Guid>("CreatedById")
                        .HasColumnType("uniqueidentifier");

@@ -163,8 +160,10 @@ namespace DiunaBI.Infrastructure.Migrations
                    b.Property<int>("MaxRetries")
                        .HasColumnType("int");

                    b.Property<DateTime>("ModifiedAtUtc")
                        .HasColumnType("datetime2");
                    b.Property<DateTime>("ModifiedAt")
                        .ValueGeneratedOnAdd()
                        .HasColumnType("datetime2")
                        .HasDefaultValueSql("GETUTCDATE()");

                    b.Property<Guid>("ModifiedById")
                        .HasColumnType("uniqueidentifier");

@@ -1,11 +1,13 @@
using DiunaBI.Domain.Entities;
using DiunaBI.Infrastructure.Interfaces;

namespace DiunaBI.Plugins.Morska.Exporters;
namespace DiunaBI.Infrastructure.Plugins;

public abstract class MorskaBaseExporter : IDataExporter
public abstract class BaseDataExporter : IDataExporter
{
    public abstract string ExporterType { get; }

    public virtual bool CanExport(string exporterType) => ExporterType == exporterType;

    public abstract void Export(Layer layer);
}
@@ -1,9 +1,9 @@
using DiunaBI.Domain.Entities;
using DiunaBI.Infrastructure.Interfaces;

namespace DiunaBI.Plugins.Morska.Importers;
namespace DiunaBI.Infrastructure.Plugins;

public abstract class MorskaBaseImporter : IDataImporter
public abstract class BaseDataImporter : IDataImporter
{
    public abstract string ImporterType { get; }

@@ -1,9 +1,9 @@
using DiunaBI.Domain.Entities;
using DiunaBI.Infrastructure.Interfaces;

namespace DiunaBI.Plugins.Morska.Processors;
namespace DiunaBI.Infrastructure.Plugins;

public abstract class MorskaBaseProcessor : IDataProcessor
public abstract class BaseDataProcessor : IDataProcessor
{
    public abstract string ProcessorType { get; }

234
DiunaBI.Infrastructure/Services/JobSchedulerService.cs
Normal file
@@ -0,0 +1,234 @@
using DiunaBI.Domain.Entities;
using DiunaBI.Infrastructure.Data;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging;

namespace DiunaBI.Infrastructure.Services;

public class JobSchedulerService
{
    private readonly AppDbContext _db;
    private readonly ILogger<JobSchedulerService> _logger;

    public JobSchedulerService(AppDbContext db, ILogger<JobSchedulerService> logger)
    {
        _db = db;
        _logger = logger;
    }

    public async Task<int> ScheduleImportJobsAsync(string? nameFilter = null)
    {
        _logger.LogInformation("JobScheduler: Starting import job scheduling with filter: {NameFilter}", nameFilter ?? "none");

        var query = _db.Layers
            .Include(x => x.Records)
            .Where(x =>
                x.Records!.Any(r => r.Code == "Type" && r.Desc1 == "ImportWorker") &&
                x.Records!.Any(r => r.Code == "IsEnabled" && r.Desc1 == "True")
            );

        if (!string.IsNullOrEmpty(nameFilter))
        {
            query = query.Where(x => x.Name != null && x.Name.Contains(nameFilter));
        }

        var importWorkers = await query
            .OrderBy(x => x.CreatedAt)
            .AsNoTracking()
            .ToListAsync();

        _logger.LogInformation("JobScheduler: Found {Count} import workers to schedule", importWorkers.Count);

        var jobsCreated = 0;
        var scheduledLayerIds = new HashSet<Guid>(); // Track LayerIds scheduled in this batch

        foreach (var worker in importWorkers)
        {
            try
            {
                var plugin = worker.Records?.FirstOrDefault(r => r.Code == "Plugin")?.Desc1;
                if (string.IsNullOrEmpty(plugin))
                {
                    _logger.LogWarning("JobScheduler: Import worker {LayerName} ({LayerId}) has no Plugin configured, skipping",
                        worker.Name, worker.Id);
                    continue;
                }

                // Get priority from config (default: 50)
                var priorityStr = worker.Records?.FirstOrDefault(r => r.Code == "Priority")?.Desc1;
                var priority = int.TryParse(priorityStr, out var p) ? p : 50;

                // Get max retries from config (default: 3)
                var maxRetriesStr = worker.Records?.FirstOrDefault(r => r.Code == "MaxRetries")?.Desc1;
                var maxRetries = int.TryParse(maxRetriesStr, out var mr) ? mr : 3;

                // Check in-memory: already scheduled in this batch?
                if (scheduledLayerIds.Contains(worker.Id))
                {
                    _logger.LogDebug("JobScheduler: Job already scheduled in this batch for {LayerName} ({LayerId})",
                        worker.Name, worker.Id);
                    continue;
                }

                // Check if there's already a pending/running job for this layer in database
                var existingJob = await _db.QueueJobs
                    .Where(j => j.LayerId == worker.Id &&
                                (j.Status == JobStatus.Pending || j.Status == JobStatus.Running))
                    .FirstOrDefaultAsync();

                if (existingJob != null)
                {
                    _logger.LogDebug("JobScheduler: Job already exists for {LayerName} ({LayerId}), status: {Status}",
                        worker.Name, worker.Id, existingJob.Status);
                    continue;
                }

                var job = new QueueJob
                {
                    Id = Guid.NewGuid(),
                    LayerId = worker.Id,
                    LayerName = worker.Name ?? "Unknown",
                    PluginName = plugin,
                    JobType = JobType.Import,
                    Priority = priority,
                    MaxRetries = maxRetries,
                    Status = JobStatus.Pending,
                    CreatedAt = DateTime.UtcNow,
                    ModifiedAt = DateTime.UtcNow,
                    CreatedById = DiunaBI.Domain.Entities.User.AutoImportUserId,
                    ModifiedById = DiunaBI.Domain.Entities.User.AutoImportUserId
                };

                _db.QueueJobs.Add(job);
                scheduledLayerIds.Add(worker.Id); // Track that we've scheduled this layer
                jobsCreated++;

                _logger.LogInformation("JobScheduler: Created import job for {LayerName} ({LayerId}) with priority {Priority}",
                    worker.Name, worker.Id, priority);
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "JobScheduler: Failed to create job for {LayerName} ({LayerId})",
                    worker.Name, worker.Id);
            }
        }

        if (jobsCreated > 0)
        {
            await _db.SaveChangesAsync();
            _logger.LogInformation("JobScheduler: Successfully created {Count} import jobs", jobsCreated);
        }

        return jobsCreated;
    }

    public async Task<int> ScheduleProcessJobsAsync()
    {
        _logger.LogInformation("JobScheduler: Starting process job scheduling");

        var processWorkers = await _db.Layers
            .Include(x => x.Records)
            .Where(x =>
                x.Records!.Any(r => r.Code == "Type" && r.Desc1 == "ProcessWorker") &&
                x.Records!.Any(r => r.Code == "IsEnabled" && r.Desc1 == "True")
            )
            .OrderBy(x => x.CreatedAt)
            .AsNoTracking()
            .ToListAsync();

        _logger.LogInformation("JobScheduler: Found {Count} process workers to schedule", processWorkers.Count);

        var jobsCreated = 0;
        var scheduledLayerIds = new HashSet<Guid>(); // Track LayerIds scheduled in this batch

        foreach (var worker in processWorkers)
        {
            try
            {
                var plugin = worker.Records?.FirstOrDefault(r => r.Code == "Plugin")?.Desc1;
                if (string.IsNullOrEmpty(plugin))
                {
                    _logger.LogWarning("JobScheduler: Process worker {LayerName} ({LayerId}) has no Plugin configured, skipping",
                        worker.Name, worker.Id);
                    continue;
                }

                // Get priority from config (default: 100 for processes - higher than imports)
                var priorityStr = worker.Records?.FirstOrDefault(r => r.Code == "Priority")?.Desc1;
                var priority = int.TryParse(priorityStr, out var p) ? p : 100;

                // Get max retries from config (default: 3)
                var maxRetriesStr = worker.Records?.FirstOrDefault(r => r.Code == "MaxRetries")?.Desc1;
                var maxRetries = int.TryParse(maxRetriesStr, out var mr) ? mr : 3;

                // Check in-memory: already scheduled in this batch?
                if (scheduledLayerIds.Contains(worker.Id))
                {
                    _logger.LogDebug("JobScheduler: Job already scheduled in this batch for {LayerName} ({LayerId})",
                        worker.Name, worker.Id);
                    continue;
                }

                // Check if there's already a pending/running job for this layer in database
                var existingJob = await _db.QueueJobs
                    .Where(j => j.LayerId == worker.Id &&
                                (j.Status == JobStatus.Pending || j.Status == JobStatus.Running))
                    .FirstOrDefaultAsync();

                if (existingJob != null)
                {
                    _logger.LogDebug("JobScheduler: Job already exists for {LayerName} ({LayerId}), status: {Status}",
                        worker.Name, worker.Id, existingJob.Status);
                    continue;
                }

                var job = new QueueJob
                {
                    Id = Guid.NewGuid(),
                    LayerId = worker.Id,
                    LayerName = worker.Name ?? "Unknown",
                    PluginName = plugin,
                    JobType = JobType.Process,
                    Priority = priority,
                    MaxRetries = maxRetries,
                    Status = JobStatus.Pending,
                    CreatedAt = DateTime.UtcNow,
                    ModifiedAt = DateTime.UtcNow,
                    CreatedById = DiunaBI.Domain.Entities.User.AutoImportUserId,
                    ModifiedById = DiunaBI.Domain.Entities.User.AutoImportUserId
                };

                _db.QueueJobs.Add(job);
                scheduledLayerIds.Add(worker.Id); // Track that we've scheduled this layer
                jobsCreated++;

                _logger.LogInformation("JobScheduler: Created process job for {LayerName} ({LayerId}) with priority {Priority}",
                    worker.Name, worker.Id, priority);
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "JobScheduler: Failed to create job for {LayerName} ({LayerId})",
                    worker.Name, worker.Id);
            }
        }

        if (jobsCreated > 0)
        {
            await _db.SaveChangesAsync();
            _logger.LogInformation("JobScheduler: Successfully created {Count} process jobs", jobsCreated);
        }

        return jobsCreated;
    }

    public async Task<int> ScheduleAllJobsAsync(string? nameFilter = null)
    {
        var importCount = await ScheduleImportJobsAsync(nameFilter);
        var processCount = await ScheduleProcessJobsAsync();

        _logger.LogInformation("JobScheduler: Scheduled {ImportCount} import jobs and {ProcessCount} process jobs",
            importCount, processCount);

        return importCount + processCount;
    }
}
202
DiunaBI.Infrastructure/Services/JobWorkerService.cs
Normal file
@@ -0,0 +1,202 @@
using DiunaBI.Domain.Entities;
using DiunaBI.Infrastructure.Data;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

namespace DiunaBI.Infrastructure.Services;

public class JobWorkerService : BackgroundService
{
    private readonly IServiceProvider _serviceProvider;
    private readonly ILogger<JobWorkerService> _logger;
    private readonly TimeSpan _pollInterval = TimeSpan.FromSeconds(5);
    private readonly TimeSpan _rateLimitDelay = TimeSpan.FromSeconds(5);

    public JobWorkerService(IServiceProvider serviceProvider, ILogger<JobWorkerService> logger)
    {
        _serviceProvider = serviceProvider;
        _logger = logger;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        _logger.LogInformation("JobWorker: Service started");

        while (!stoppingToken.IsCancellationRequested)
        {
            try
            {
                await ProcessNextJobAsync(stoppingToken);
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "JobWorker: Unexpected error in main loop");
            }

            await Task.Delay(_pollInterval, stoppingToken);
        }

        _logger.LogInformation("JobWorker: Service stopped");
    }

    private async Task ProcessNextJobAsync(CancellationToken stoppingToken)
    {
        using var scope = _serviceProvider.CreateScope();
        var db = scope.ServiceProvider.GetRequiredService<AppDbContext>();
        var pluginManager = scope.ServiceProvider.GetRequiredService<PluginManager>();

        // Get next pending job (ordered by priority, then creation time)
        var job = await db.QueueJobs
            .Where(j => j.Status == JobStatus.Pending || j.Status == JobStatus.Retrying)
            .OrderBy(j => j.Priority)
            .ThenBy(j => j.CreatedAt)
            .FirstOrDefaultAsync(stoppingToken);

        if (job == null)
        {
            // No jobs to process
            return;
        }

        _logger.LogInformation("JobWorker: Processing job {JobId} - {LayerName} ({JobType}) - Current RetryCount: {RetryCount}, MaxRetries: {MaxRetries}, Status: {Status}",
            job.Id, job.LayerName, job.JobType, job.RetryCount, job.MaxRetries, job.Status);

        // Mark job as running
        job.Status = JobStatus.Running;
        job.LastAttemptAt = DateTime.UtcNow;
        job.ModifiedAt = DateTime.UtcNow;
        job.ModifiedById = DiunaBI.Domain.Entities.User.AutoImportUserId;
        await db.SaveChangesAsync(stoppingToken);

        try
        {
            // Load the layer with its configuration
            var layer = await db.Layers
                .Include(l => l.Records)
                .AsNoTracking()
                .FirstOrDefaultAsync(l => l.Id == job.LayerId, stoppingToken);

            if (layer == null)
            {
                throw new Exception($"Layer {job.LayerId} not found");
            }

            // Execute the job based on type
            if (job.JobType == JobType.Import)
            {
                var importer = pluginManager.GetImporter(job.PluginName);
                if (importer == null)
                {
                    throw new Exception($"Importer '{job.PluginName}' not found");
                }

                _logger.LogInformation("JobWorker: Executing import for {LayerName} using {PluginName}",
                    job.LayerName, job.PluginName);

                importer.Import(layer);
            }
            else if (job.JobType == JobType.Process)
            {
                var processor = pluginManager.GetProcessor(job.PluginName);
                if (processor == null)
                {
                    throw new Exception($"Processor '{job.PluginName}' not found");
                }

                _logger.LogInformation("JobWorker: Executing process for {LayerName} using {PluginName}",
                    job.LayerName, job.PluginName);

                processor.Process(layer);
            }

            // Job completed successfully
            job.Status = JobStatus.Completed;
            job.CompletedAt = DateTime.UtcNow;
            job.LastError = null;
            job.ModifiedAt = DateTime.UtcNow;
            job.ModifiedById = DiunaBI.Domain.Entities.User.AutoImportUserId;

            _logger.LogInformation("JobWorker: Job {JobId} completed successfully", job.Id);

            // Rate limiting delay (for Google Sheets API quota)
            if (job.JobType == JobType.Import)
            {
                _logger.LogDebug("JobWorker: Applying rate limit delay of {Delay} seconds", _rateLimitDelay.TotalSeconds);
                await Task.Delay(_rateLimitDelay, stoppingToken);
            }
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "JobWorker: Job {JobId} failed - {LayerName}", job.Id, job.LayerName);

            // Capture full error details including inner exceptions
            job.LastError = GetFullErrorMessage(ex);
            job.ModifiedAt = DateTime.UtcNow;
            job.ModifiedById = DiunaBI.Domain.Entities.User.AutoImportUserId;

            if (job.RetryCount >= job.MaxRetries)
            {
                job.Status = JobStatus.Failed;
                _logger.LogWarning("JobWorker: Job {JobId} marked as Failed - no more retries available (RetryCount: {RetryCount}, MaxRetries: {MaxRetries})",
                    job.Id, job.RetryCount, job.MaxRetries);
            }
            else
            {
                job.Status = JobStatus.Pending;

                // Exponential backoff: wait before retrying
                var backoffDelay = GetBackoffDelay(job.RetryCount + 1);

                _logger.LogInformation("JobWorker: Job {JobId} will retry in {Delay} (retry {RetryNumber} of {MaxRetries})",
                    job.Id, backoffDelay, job.RetryCount + 1, job.MaxRetries);

                // Save current state with error message
                await db.SaveChangesAsync(stoppingToken);

                // Wait before next attempt
                await Task.Delay(backoffDelay, stoppingToken);

                // Increment retry count for next attempt
                job.RetryCount++;
                job.ModifiedAt = DateTime.UtcNow;
                job.ModifiedById = DiunaBI.Domain.Entities.User.AutoImportUserId;
            }
        }
        finally
        {
            await db.SaveChangesAsync(stoppingToken);
        }
    }

    public override async Task StopAsync(CancellationToken stoppingToken)
    {
        _logger.LogInformation("JobWorker: Stopping service...");
        await base.StopAsync(stoppingToken);
    }

    private static TimeSpan GetBackoffDelay(int retryCount)
    {
        return retryCount switch
        {
            1 => TimeSpan.FromSeconds(30), // 1st retry: 30 seconds
            2 => TimeSpan.FromMinutes(2),  // 2nd retry: 2 minutes
            _ => TimeSpan.FromMinutes(5)   // 3rd+ retry: 5 minutes
        };
    }

    private static string GetFullErrorMessage(Exception ex)
    {
        var messages = new List<string>();
        var currentException = ex;

        while (currentException != null)
        {
            messages.Add(currentException.Message);
            currentException = currentException.InnerException;
        }

        return string.Join(" → ", messages);
    }
}
@@ -11,7 +11,7 @@ public class PluginManager
    private readonly IServiceProvider _serviceProvider;
    private readonly List<Type> _processorTypes = new();
    private readonly List<Type> _importerTypes = new();
    private readonly List<IDataExporter> _exporters = new();
    private readonly List<Type> _exporterTypes = new();
    private readonly List<IPlugin> _plugins = new();

    public PluginManager(ILogger<PluginManager> logger, IServiceProvider serviceProvider)
@@ -42,10 +42,11 @@ public class PluginManager
            }
        }

        _logger.LogInformation("Loaded {ProcessorCount} processors and {ImporterCount} importers from {AssemblyCount} assemblies",
        _logger.LogInformation("Loaded {ProcessorCount} processors, {ImporterCount} importers, and {ExporterCount} exporters from {AssemblyCount} assemblies",
            _processorTypes.Count,
            _importerTypes.Count,
            dllFiles.Length); // Change from _plugins.Count to assemblyFiles.Length
            _exporterTypes.Count,
            dllFiles.Length);
    }

    private void LoadPluginFromAssembly(string assemblyPath)
@@ -70,6 +71,12 @@ public class PluginManager
                _importerTypes.Add(type);
                _logger.LogDebug("Registered importer: {Type}", type.Name); // Information -> Debug
            }

            if (typeof(IDataExporter).IsAssignableFrom(type) && !type.IsInterface && !type.IsAbstract)
            {
                _exporterTypes.Add(type);
                _logger.LogDebug("Registered exporter: {Type}", type.Name);
            }
        }
    }
    catch (Exception ex)
@@ -84,14 +91,15 @@ public class PluginManager
    {
        try
        {
            using var scope = _serviceProvider.CreateScope();
            var scope = _serviceProvider.CreateScope();
            var instance = (IDataProcessor)ActivatorUtilities.CreateInstance(scope.ServiceProvider, type);

            if (instance.CanProcess(processorType))
            {
                var scopedProvider = _serviceProvider.CreateScope().ServiceProvider;
                return (IDataProcessor)ActivatorUtilities.CreateInstance(scopedProvider, type);
                return instance;
            }

            scope.Dispose();
        }
        catch (Exception ex)
        {
@@ -107,14 +115,15 @@ public class PluginManager
    {
        try
        {
            using var scope = _serviceProvider.CreateScope();
            var scope = _serviceProvider.CreateScope();
            var instance = (IDataImporter)ActivatorUtilities.CreateInstance(scope.ServiceProvider, type);

            if (instance.CanImport(importerType))
            {
                var scopedProvider = _serviceProvider.CreateScope().ServiceProvider;
                return (IDataImporter)ActivatorUtilities.CreateInstance(scopedProvider, type);
                return instance;
            }

            scope.Dispose();
        }
        catch (Exception ex)
        {
@@ -126,7 +135,27 @@ public class PluginManager

    public IDataExporter? GetExporter(string exporterType)
    {
        return _exporters.FirstOrDefault(e => e.CanExport(exporterType));
        foreach (var type in _exporterTypes)
        {
            try
            {
                var scope = _serviceProvider.CreateScope();
                var instance = (IDataExporter)ActivatorUtilities.CreateInstance(scope.ServiceProvider, type);

                if (instance.CanExport(exporterType))
                {
                    return instance;
                }

                scope.Dispose();
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "Failed to create exporter instance of type {Type}", type.Name);
            }
        }
        return null;
    }

    public int GetPluginsCount() => _processorTypes.Count + _importerTypes.Count + _exporters.Count;
    public int GetPluginsCount() => _processorTypes.Count + _importerTypes.Count + _exporterTypes.Count;
}
@@ -1,5 +1,6 @@
|
||||
using System.Globalization;
|
||||
using DiunaBI.Domain.Entities;
|
||||
using DiunaBI.Infrastructure.Plugins;
|
||||
using DiunaBI.Infrastructure.Services;
|
||||
using Google.Apis.Sheets.v4;
|
||||
using Google.Apis.Sheets.v4.Data;
|
||||
@@ -7,7 +8,7 @@ using Microsoft.Extensions.Configuration;
|
||||
|
||||
namespace DiunaBI.Plugins.Morska.Exporters;
|
||||
|
||||
public class GoogleSheetExport : MorskaBaseExporter
|
||||
public class GoogleSheetExport : BaseDataExporter
|
||||
{
|
||||
public override string ExporterType => "GoogleSheet";
|
||||
private readonly GoogleDriveHelper _googleDriveHelper;
|
||||
|
||||
@@ -1,6 +1,7 @@
|
||||
using System.Globalization;
|
||||
using DiunaBI.Domain.Entities;
|
||||
using DiunaBI.Infrastructure.Data;
|
||||
using DiunaBI.Infrastructure.Plugins;
|
||||
using Google.Apis.Sheets.v4;
|
||||
using Microsoft.Extensions.Logging;
|
||||
using Microsoft.EntityFrameworkCore;
|
||||
@@ -8,7 +9,7 @@ using Microsoft.EntityFrameworkCore;
|
||||
|
||||
namespace DiunaBI.Plugins.Morska.Importers;
|
||||
|
||||
public class MorskaD1Importer : MorskaBaseImporter
|
||||
public class MorskaD1Importer : BaseDataImporter
|
||||
{
|
||||
public override string ImporterType => "Morska.Import.D1";
|
||||
|
||||
|
||||
@@ -3,6 +3,7 @@ using System.Text;
|
||||
using System.Text.Json;
|
||||
using DiunaBI.Domain.Entities;
|
||||
using DiunaBI.Infrastructure.Data;
|
||||
using DiunaBI.Infrastructure.Plugins;
|
||||
using DiunaBI.Infrastructure.Services;
|
||||
using Microsoft.Extensions.Logging;
|
||||
using Microsoft.EntityFrameworkCore;
|
||||
@@ -11,7 +12,7 @@ using Google.Apis.Sheets.v4.Data;
|
||||
|
||||
namespace DiunaBI.Plugins.Morska.Importers;
|
||||
|
||||
public class MorskaD3Importer : MorskaBaseImporter
|
||||
public class MorskaD3Importer : BaseDataImporter
|
||||
{
|
||||
public override string ImporterType => "Morska.Import.D3";
|
||||
|
||||
|
||||
@@ -1,13 +1,14 @@
|
||||
using System.Globalization;
|
||||
using DiunaBI.Domain.Entities;
|
||||
using DiunaBI.Infrastructure.Data;
|
||||
using DiunaBI.Infrastructure.Plugins;
|
||||
using Google.Apis.Sheets.v4;
|
||||
using Microsoft.Extensions.Logging;
|
||||
using Microsoft.EntityFrameworkCore;
|
||||
|
||||
namespace DiunaBI.Plugins.Morska.Importers;
|
||||
|
||||
public class MorskaFk2Importer : MorskaBaseImporter
|
||||
public class MorskaFk2Importer : BaseDataImporter
|
||||
{
|
||||
public override string ImporterType => "Morska.Import.FK2";
|
||||
|
||||
|
||||
@@ -1,6 +1,7 @@
|
||||
using System.Globalization;
|
||||
using DiunaBI.Domain.Entities;
|
||||
using DiunaBI.Infrastructure.Data;
|
||||
using DiunaBI.Infrastructure.Plugins;
|
||||
using Google.Apis.Sheets.v4;
|
||||
using Microsoft.Extensions.Logging;
|
||||
using Microsoft.EntityFrameworkCore;
|
||||
@@ -8,7 +9,7 @@ using Microsoft.EntityFrameworkCore;
|
||||
|
||||
namespace DiunaBI.Plugins.Morska.Importers;
|
||||
|
||||
public class MorskaStandardImporter : MorskaBaseImporter
|
||||
public class MorskaStandardImporter : BaseDataImporter
|
||||
{
|
||||
public override string ImporterType => "Morska.Import.Standard";
|
||||
|
||||
|
||||
@@ -2,6 +2,7 @@
|
||||
using System.Text.RegularExpressions;
|
||||
using DiunaBI.Domain.Entities;
|
||||
using DiunaBI.Infrastructure.Data;
|
||||
using DiunaBI.Infrastructure.Plugins;
|
||||
using DiunaBI.Infrastructure.Services.Calculations;
|
||||
using Google.Apis.Sheets.v4;
|
||||
using Google.Apis.Sheets.v4.Data;
|
||||
@@ -10,7 +11,7 @@ using Microsoft.Extensions.Logging;
|
||||
|
||||
namespace DiunaBI.Plugins.Morska.Processors;
|
||||
|
||||
public class MorskaD6Processor : MorskaBaseProcessor
|
||||
public class MorskaD6Processor : BaseDataProcessor
|
||||
{
|
||||
public override string ProcessorType => "Morska.Process.D6";
|
||||
|
||||
|
||||
@@ -1,6 +1,7 @@
|
||||
using System.Globalization;
|
||||
using DiunaBI.Domain.Entities;
|
||||
using DiunaBI.Infrastructure.Data;
|
||||
using DiunaBI.Infrastructure.Plugins;
|
||||
using DiunaBI.Infrastructure.Services;
|
||||
using DiunaBI.Infrastructure.Services.Calculations;
|
||||
using Google.Apis.Sheets.v4;
|
||||
@@ -10,7 +11,7 @@ using Microsoft.Extensions.Logging;
|
||||
|
||||
namespace DiunaBI.Plugins.Morska.Processors;
|
||||
|
||||
public class MorskaT1R1Processor : MorskaBaseProcessor
|
||||
public class MorskaT1R1Processor : BaseDataProcessor
|
||||
{
|
||||
public override string ProcessorType => "Morska.Process.T1.R1";
|
||||
|
||||
|
||||
@@ -2,6 +2,7 @@
|
||||
using System.Text.RegularExpressions;
|
||||
using DiunaBI.Domain.Entities;
|
||||
using DiunaBI.Infrastructure.Data;
|
||||
using DiunaBI.Infrastructure.Plugins;
|
||||
using DiunaBI.Infrastructure.Services;
|
||||
using Google.Apis.Sheets.v4;
|
||||
using Google.Apis.Sheets.v4.Data;
|
||||
@@ -10,7 +11,7 @@ using Microsoft.Extensions.Logging;
|
||||
|
||||
namespace DiunaBI.Plugins.Morska.Processors;
|
||||
|
||||
public class MorskaT1R3Processor : MorskaBaseProcessor
|
||||
public class MorskaT1R3Processor : BaseDataProcessor
|
||||
{
|
||||
public override string ProcessorType => "Morska.Process.T1.R3";
|
||||
|
||||
|
||||
@@ -1,12 +1,13 @@
|
||||
using DiunaBI.Domain.Entities;
|
||||
using DiunaBI.Infrastructure.Data;
|
||||
using DiunaBI.Infrastructure.Plugins;
|
||||
using DiunaBI.Infrastructure.Services;
|
||||
using Microsoft.EntityFrameworkCore;
|
||||
using Microsoft.Extensions.Logging;
|
||||
|
||||
namespace DiunaBI.Plugins.Morska.Processors;
|
||||
|
||||
public class MorskaT3MultiSourceCopySelectedCodesProcessor : MorskaBaseProcessor
|
||||
public class MorskaT3MultiSourceCopySelectedCodesProcessor : BaseDataProcessor
|
||||
{
|
||||
public override string ProcessorType => "T3.MultiSourceCopySelectedCodes";
|
||||
|
||||
|
||||
@@ -1,12 +1,13 @@
|
||||
using DiunaBI.Domain.Entities;
|
||||
using DiunaBI.Infrastructure.Data;
|
||||
using DiunaBI.Infrastructure.Plugins;
|
||||
using DiunaBI.Infrastructure.Services;
|
||||
using Microsoft.EntityFrameworkCore;
|
||||
using Microsoft.Extensions.Logging;
|
||||
|
||||
namespace DiunaBI.Plugins.Morska.Processors;
|
||||
|
||||
public class MorskaT3MultiSourceCopySelectedCodesYearSummaryProcessor : MorskaBaseProcessor
|
||||
public class MorskaT3MultiSourceCopySelectedCodesYearSummaryProcessor : BaseDataProcessor
|
||||
{
|
||||
public override string ProcessorType => "T3.MultiSourceCopySelectedCodesYearSummary";
|
||||
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
using DiunaBI.Domain.Entities;
|
||||
using DiunaBI.Infrastructure.Data;
|
||||
using DiunaBI.Infrastructure.Plugins;
|
||||
using DiunaBI.Infrastructure.Services;
|
||||
using DiunaBI.Infrastructure.Services.Calculations;
|
||||
using Microsoft.EntityFrameworkCore;
|
||||
@@ -7,7 +8,7 @@ using Microsoft.Extensions.Logging;
|
||||
|
||||
namespace DiunaBI.Plugins.Morska.Processors;
|
||||
|
||||
public class MorskaT3MultiSourceSummaryProcessor : MorskaBaseProcessor
|
||||
public class MorskaT3MultiSourceSummaryProcessor : BaseDataProcessor
|
||||
{
|
||||
public override string ProcessorType => "Morska.Process.T3.MultiSourceSummary";
|
||||
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
using DiunaBI.Domain.Entities;
|
||||
using DiunaBI.Infrastructure.Data;
|
||||
using DiunaBI.Infrastructure.Plugins;
|
||||
using DiunaBI.Infrastructure.Services;
|
||||
using DiunaBI.Infrastructure.Services.Calculations;
|
||||
using Microsoft.EntityFrameworkCore;
|
||||
@@ -7,7 +8,7 @@ using Microsoft.Extensions.Logging;
|
||||
|
||||
namespace DiunaBI.Plugins.Morska.Processors;
|
||||
|
||||
public class MorskaT3MultiSourceYearSummaryProcessor : MorskaBaseProcessor
|
||||
public class MorskaT3MultiSourceYearSummaryProcessor : BaseDataProcessor
|
||||
{
|
||||
public override string ProcessorType => "Morska.Process.T3.MultiSourceYearSummary";
|
||||
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
using DiunaBI.Domain.Entities;
|
||||
using DiunaBI.Infrastructure.Data;
|
||||
using DiunaBI.Infrastructure.Plugins;
|
||||
using DiunaBI.Infrastructure.Services;
|
||||
using Microsoft.EntityFrameworkCore;
|
||||
using Microsoft.Extensions.Logging;
|
||||
@@ -7,7 +8,7 @@ using Google.Apis.Sheets.v4;
|
||||
|
||||
namespace DiunaBI.Plugins.Morska.Processors;
|
||||
|
||||
public class MorskaT3SingleSourceProcessor : MorskaBaseProcessor
|
||||
public class MorskaT3SingleSourceProcessor : BaseDataProcessor
|
||||
{
|
||||
public override string ProcessorType => "Morska.Process.T3.SingleSource";
|
||||
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
using DiunaBI.Domain.Entities;
|
||||
using DiunaBI.Infrastructure.Data;
|
||||
using DiunaBI.Infrastructure.Plugins;
|
||||
using DiunaBI.Infrastructure.Services;
|
||||
using Microsoft.EntityFrameworkCore;
|
||||
using Microsoft.Extensions.Logging;
|
||||
@@ -7,7 +8,7 @@ using Google.Apis.Sheets.v4;
|
||||
|
||||
namespace DiunaBI.Plugins.Morska.Processors;
|
||||
|
||||
public class MorskaT3SourceYearSummaryProcessor : MorskaBaseProcessor
|
||||
public class MorskaT3SourceYearSummaryProcessor : BaseDataProcessor
|
||||
{
|
||||
public override string ProcessorType => "Morska.Process.T3.SourceYearSummary";
|
||||
|
||||
|
||||
@@ -1,6 +1,7 @@
 using System.Globalization;
 using DiunaBI.Domain.Entities;
 using DiunaBI.Infrastructure.Data;
+using DiunaBI.Infrastructure.Plugins;
 using DiunaBI.Infrastructure.Services;
 using Google.Apis.Sheets.v4;
 using Google.Apis.Sheets.v4.Data;
@@ -9,7 +10,7 @@ using Microsoft.Extensions.Logging;
 
 namespace DiunaBI.Plugins.Morska.Processors;
 
-public class MorskaT4R2Processor : MorskaBaseProcessor
+public class MorskaT4R2Processor : BaseDataProcessor
 {
     public override string ProcessorType => "Morska.Process.T4.R2";
@@ -1,12 +1,13 @@
 using DiunaBI.Domain.Entities;
 using DiunaBI.Infrastructure.Data;
+using DiunaBI.Infrastructure.Plugins;
 using Microsoft.EntityFrameworkCore;
 using Microsoft.Extensions.Logging;
 using Google.Apis.Sheets.v4;
 
 namespace DiunaBI.Plugins.Morska.Processors;
 
-public class MorskaT4SingleSourceProcessor : MorskaBaseProcessor
+public class MorskaT4SingleSourceProcessor : BaseDataProcessor
 {
     public override string ProcessorType => "Morska.Process.T4.SingleSource";
@@ -1,12 +1,13 @@
 using DiunaBI.Domain.Entities;
 using DiunaBI.Infrastructure.Data;
+using DiunaBI.Infrastructure.Plugins;
 using DiunaBI.Infrastructure.Services;
 using Microsoft.EntityFrameworkCore;
 using Microsoft.Extensions.Logging;
 
 namespace DiunaBI.Plugins.Morska.Processors;
 
-public class MorskaT5LastValuesProcessor : MorskaBaseProcessor
+public class MorskaT5LastValuesProcessor : BaseDataProcessor
 {
     public override string ProcessorType => "Morska.Process.T5.LastValues";
17  DiunaBI.Plugins.PedrolloPL/DiunaBI.Plugins.PedrolloPL.csproj  Normal file
@@ -0,0 +1,17 @@
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net10.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.Extensions.Logging" Version="10.0.0" />
    <PackageReference Include="Google.Apis.Sheets.v4" Version="1.68.0.3525" />
  </ItemGroup>

  <ItemGroup>
    <ProjectReference Include="..\DiunaBI.Domain\DiunaBI.Domain.csproj" />
    <ProjectReference Include="..\DiunaBI.Infrastructure\DiunaBI.Infrastructure.csproj" />
  </ItemGroup>
</Project>
392  DiunaBI.Plugins.PedrolloPL/Importers/PedrolloPLImportB3.cs  Normal file
@@ -0,0 +1,392 @@
using System.Text;
using System.Text.Json;
using DiunaBI.Domain.Entities;
using DiunaBI.Infrastructure.Data;
using DiunaBI.Infrastructure.Plugins;
using Microsoft.Extensions.Logging;

namespace DiunaBI.Plugins.PedrolloPL.Importers;

public class PedrolloPLImportB3 : BaseDataImporter
{
    public override string ImporterType => "PedrolloPL.Import.B3";

    private readonly AppDbContext _db;
    private readonly ILogger<PedrolloPLImportB3> _logger;

    // Configuration properties
    private string? DataInboxName { get; set; }
    private string? DataInboxSource { get; set; }
    private string? StartDate { get; set; }
    private string? EndDate { get; set; }
    private string? ImportYear { get; set; }
    private bool IsEnabled { get; set; }

    // Cached deserialized data
    private List<List<object>>? _cachedRawData;
    private DataInbox? _cachedDataInbox;
    private Dictionary<string, string>? _regionCodeMap;

    public PedrolloPLImportB3(
        AppDbContext db,
        ILogger<PedrolloPLImportB3> logger)
    {
        _db = db;
        _logger = logger;
    }

    public override void Import(Layer importWorker)
    {
        try
        {
            _logger.LogInformation("{ImporterType}: Starting import for {ImportWorkerName} ({ImportWorkerId})",
                ImporterType, importWorker.Name, importWorker.Id);

            // Clear cache at start
            _cachedRawData = null;
            _cachedDataInbox = null;
            _regionCodeMap = null;

            LoadConfiguration(importWorker);
            ValidateConfiguration();

            if (!IsEnabled)
            {
                _logger.LogInformation("{ImporterType}: Import disabled for {ImportWorkerName}",
                    ImporterType, importWorker.Name);
                return;
            }

            // Find and deserialize DataInbox data
            FindAndDeserializeDataInbox();

            // Load region code mapping from dictionary layer
            LoadRegionCodeMapping();

            // Map data from DataInbox to Layer records
            var mappedRecords = MapDataToRecords();

            // Create new Import layer
            var importLayer = CreateImportLayer(importWorker);

            // Save records to database
            SaveRecordsToLayer(importLayer, mappedRecords);

            _logger.LogInformation("{ImporterType}: Successfully completed import for {ImportWorkerName} - Created {RecordCount} records",
                ImporterType, importWorker.Name, mappedRecords.Count);
        }
        catch (Exception e)
        {
            _logger.LogError(e, "{ImporterType}: Failed to import {ImportWorkerName} ({ImportWorkerId})",
                ImporterType, importWorker.Name, importWorker.Id);
            throw;
        }
        finally
        {
            // Clear cache after import
            _cachedRawData = null;
            _cachedDataInbox = null;
            _regionCodeMap = null;
        }
    }

    private void LoadConfiguration(Layer importWorker)
    {
        if (importWorker.Records == null) return;

        DataInboxName = GetRecordValue(importWorker.Records, "DataInboxName");
        DataInboxSource = GetRecordValue(importWorker.Records, "DataInboxSource");
        StartDate = GetRecordValue(importWorker.Records, "StartDate");
        EndDate = GetRecordValue(importWorker.Records, "EndDate");
        ImportYear = GetRecordValue(importWorker.Records, "ImportYear");
        IsEnabled = GetRecordValue(importWorker.Records, "IsEnabled") == "True";

        _logger.LogDebug(
            "{ImporterType}: Configuration loaded - DataInboxName: {DataInboxName}, Source: {Source}, Year: {Year}, Period: {StartDate} to {EndDate}, Enabled: {IsEnabled}",
            ImporterType, DataInboxName, DataInboxSource, ImportYear, StartDate, EndDate, IsEnabled);
    }

    private void ValidateConfiguration()
    {
        var errors = new List<string>();

        if (string.IsNullOrEmpty(DataInboxName)) errors.Add("DataInboxName is required");
        if (string.IsNullOrEmpty(DataInboxSource)) errors.Add("DataInboxSource is required");
        if (string.IsNullOrEmpty(StartDate)) errors.Add("StartDate is required");
        if (string.IsNullOrEmpty(EndDate)) errors.Add("EndDate is required");

        if (errors.Any())
        {
            throw new InvalidOperationException($"Configuration validation failed: {string.Join(", ", errors)}");
        }

        _logger.LogDebug("{ImporterType}: Configuration validated successfully", ImporterType);
    }

    private void FindAndDeserializeDataInbox()
    {
        _logger.LogDebug("{ImporterType}: Searching for DataInbox with Name='{DataInboxName}' and Source='{DataInboxSource}'",
            ImporterType, DataInboxName, DataInboxSource);

        // Find DataInbox by Name and Source, order by CreatedAt descending to get the latest
        var dataInbox = _db.DataInbox
            .Where(x => x.Name == DataInboxName && x.Source == DataInboxSource)
            .OrderByDescending(x => x.CreatedAt)
            .FirstOrDefault();

        if (dataInbox == null)
        {
            throw new InvalidOperationException(
                $"DataInbox not found with Name='{DataInboxName}' and Source='{DataInboxSource}'");
        }

        _logger.LogInformation("{ImporterType}: Found DataInbox - Id: {DataInboxId}, Name: {Name}, Source: {Source}, CreatedAt: {CreatedAt}",
            ImporterType, dataInbox.Id, dataInbox.Name, dataInbox.Source, dataInbox.CreatedAt);

        // Deserialize the data
        try
        {
            var data = Convert.FromBase64String(dataInbox.Data);
            var jsonString = Encoding.UTF8.GetString(data);

            _logger.LogDebug("{ImporterType}: Decoded {DataSize} bytes from base64",
                ImporterType, data.Length);

            // Deserialize as array of arrays: [["<nieznany>", 1183.15, ...], ["DOLNOŚLĄSKIE", ...]]
            var rawData = JsonSerializer.Deserialize<List<List<object>>>(jsonString);
            if (rawData == null || rawData.Count == 0)
            {
                throw new InvalidOperationException($"DataInbox.Data is empty for: {dataInbox.Name}");
            }

            _logger.LogInformation("{ImporterType}: Successfully deserialized {RowCount} rows from DataInbox",
                ImporterType, rawData.Count);

            // Log first few rows for debugging
            if (rawData.Count > 0)
            {
                var sampleSize = Math.Min(3, rawData.Count);
                _logger.LogDebug("{ImporterType}: Sample rows (first {SampleSize}):", ImporterType, sampleSize);
                for (int i = 0; i < sampleSize; i++)
                {
                    var row = rawData[i];
                    if (row.Count > 0)
                    {
                        var regionName = row[0]?.ToString() ?? "null";
                        var valueCount = row.Count - 1;
                        _logger.LogDebug("  [{Index}] Region: {Region}, Values: {ValueCount}",
                            i, regionName, valueCount);
                    }
                }
            }

            // Cache the deserialized data
            _cachedRawData = rawData;
            _cachedDataInbox = dataInbox;
        }
        catch (FormatException e)
        {
            _logger.LogError(e, "{ImporterType}: Invalid base64 data in DataInbox {DataInboxId}",
                ImporterType, dataInbox.Id);
            throw new InvalidOperationException($"Invalid base64 data in DataInbox: {dataInbox.Name}", e);
        }
        catch (JsonException e)
        {
            _logger.LogError(e, "{ImporterType}: Invalid JSON data in DataInbox {DataInboxId}",
                ImporterType, dataInbox.Id);
            throw new InvalidOperationException($"Invalid JSON data in DataInbox: {dataInbox.Name}", e);
        }
    }

    private void LoadRegionCodeMapping()
    {
        const string dictionaryLayerName = "L1-D-P2-CODES";

        _logger.LogDebug("{ImporterType}: Loading region code mapping from dictionary layer '{DictionaryLayerName}'",
            ImporterType, dictionaryLayerName);

        var dictionaryLayer = _db.Layers
            .Where(x => x.Name == dictionaryLayerName && x.Type == LayerType.Dictionary)
            .FirstOrDefault();

        if (dictionaryLayer == null)
        {
            throw new InvalidOperationException($"Dictionary layer '{dictionaryLayerName}' not found");
        }

        // Load records for the dictionary layer
        var records = _db.Records
            .Where(x => x.LayerId == dictionaryLayer.Id)
            .ToList();

        // Build mapping: Desc1 (region name) -> Code
        _regionCodeMap = records.ToDictionary(
            r => r.Desc1 ?? string.Empty,
            r => r.Code ?? string.Empty,
            StringComparer.OrdinalIgnoreCase);

        _logger.LogInformation("{ImporterType}: Loaded {MappingCount} region code mappings",
            ImporterType, _regionCodeMap.Count);
    }

    private List<Record> MapDataToRecords()
    {
        if (_cachedRawData == null)
        {
            throw new InvalidOperationException("Raw data not loaded. Call FindAndDeserializeDataInbox first.");
        }

        if (_regionCodeMap == null)
        {
            throw new InvalidOperationException("Region code mapping not loaded. Call LoadRegionCodeMapping first.");
        }

        var records = new List<Record>();
        var now = DateTime.UtcNow;

        _logger.LogDebug("{ImporterType}: Starting data mapping for {RowCount} rows",
            ImporterType, _cachedRawData.Count);

        foreach (var row in _cachedRawData)
        {
            if (row.Count < 13)
            {
                _logger.LogWarning("{ImporterType}: Skipping row with insufficient data - expected 13 elements, got {Count}",
                    ImporterType, row.Count);
                continue;
            }

            // First element is region name
            var regionName = row[0]?.ToString();
            if (string.IsNullOrEmpty(regionName))
            {
                _logger.LogWarning("{ImporterType}: Skipping row with empty region name", ImporterType);
                continue;
            }

            // Find region code from dictionary
            if (!_regionCodeMap.TryGetValue(regionName, out var regionCode))
            {
                _logger.LogWarning("{ImporterType}: Region code not found for '{RegionName}' - skipping",
                    ImporterType, regionName);
                continue;
            }

            // Create 12 records (one per month)
            for (int month = 1; month <= 12; month++)
            {
                var valueIndex = month; // row[1] = January, row[2] = February, etc.
                var valueObj = row[valueIndex];

                // Convert value to double
                double? value = null;
                if (valueObj != null)
                {
                    // Handle JsonElement if deserialized from JSON
                    if (valueObj is JsonElement jsonElement)
                    {
                        if (jsonElement.ValueKind == JsonValueKind.Number)
                        {
                            value = jsonElement.GetDouble();
                        }
                    }
                    else if (valueObj is double d)
                    {
                        value = d;
                    }
                    else if (double.TryParse(valueObj.ToString(), out var parsed))
                    {
                        value = parsed;
                    }
                }

                // Create code: {regionCode}{month:00}
                var code = $"{regionCode}{month:00}";

                var record = new Record
                {
                    Id = Guid.NewGuid(),
                    Code = code,
                    Value1 = value,
                    CreatedAt = now,
                    ModifiedAt = now
                };

                records.Add(record);
            }

            _logger.LogDebug("{ImporterType}: Mapped region '{RegionName}' (code: {RegionCode}) to 12 records",
                ImporterType, regionName, regionCode);
        }

        _logger.LogInformation("{ImporterType}: Successfully mapped {RecordCount} records from {RowCount} rows",
            ImporterType, records.Count, _cachedRawData.Count);

        return records;
    }

    private Layer CreateImportLayer(Layer importWorker)
    {
        var now = DateTime.UtcNow;

        var importLayer = new Layer
        {
            Id = Guid.NewGuid(),
            Number = _db.Layers.Count() + 1,
            ParentId = importWorker.Id,
            Type = LayerType.Import,
            IsCancelled = false,
            CreatedAt = now,
            ModifiedAt = now,
            CreatedById = Guid.Parse("f392209e-123e-4651-a5a4-0b1d6cf9ff9d"), // System user
            ModifiedById = Guid.Parse("f392209e-123e-4651-a5a4-0b1d6cf9ff9d") // System user
        };

        importLayer.Name = $"L{importLayer.Number}-I-B3-{ImportYear}-{now:yyyyMMddHHmm}";

        _logger.LogDebug("{ImporterType}: Creating import layer '{LayerName}' (Number: {Number})",
            ImporterType, importLayer.Name, importLayer.Number);

        _db.Layers.Add(importLayer);
        _db.SaveChanges();

        _logger.LogInformation("{ImporterType}: Created import layer '{LayerName}' with Id: {LayerId}",
            ImporterType, importLayer.Name, importLayer.Id);

        return importLayer;
    }

    private void SaveRecordsToLayer(Layer importLayer, List<Record> records)
    {
        _logger.LogDebug("{ImporterType}: Saving {RecordCount} records to layer {LayerId}",
            ImporterType, records.Count, importLayer.Id);

        // Delete any existing records for this layer (shouldn't be any, but just in case)
        var toDelete = _db.Records.Where(x => x.LayerId == importLayer.Id).ToList();
        if (toDelete.Count > 0)
        {
            _logger.LogWarning("{ImporterType}: Found {ExistingCount} existing records for layer {LayerId}, removing them",
                ImporterType, toDelete.Count, importLayer.Id);
            _db.Records.RemoveRange(toDelete);
        }

        // Set all required properties for each record
        foreach (var record in records)
        {
            record.LayerId = importLayer.Id;
            record.CreatedById = Guid.Parse("f392209e-123e-4651-a5a4-0b1d6cf9ff9d"); // System user
            record.ModifiedById = Guid.Parse("f392209e-123e-4651-a5a4-0b1d6cf9ff9d"); // System user
            _db.Records.Add(record);
        }

        _db.SaveChanges();

        _logger.LogInformation("{ImporterType}: Successfully saved {RecordCount} records to layer '{LayerName}'",
            ImporterType, records.Count, importLayer.Name);
    }

    private string? GetRecordValue(ICollection<Record> records, string code)
    {
        return records.FirstOrDefault(x => x.Code == code)?.Desc1;
    }
}
542  DiunaBI.Plugins.PedrolloPL/Processors/PedrolloPLProcessP2.cs  Normal file
@@ -0,0 +1,542 @@
using System.Text;
using DiunaBI.Domain.Entities;
using DiunaBI.Infrastructure.Data;
using DiunaBI.Infrastructure.Plugins;
using Google.Apis.Sheets.v4;
using Google.Apis.Sheets.v4.Data;
using Microsoft.Extensions.Logging;

namespace DiunaBI.Plugins.PedrolloPL.Processors;

public class PedrolloPLProcessP2 : BaseDataProcessor
{
    public override string ProcessorType => "PedrolloPL.Process.P2";

    private readonly AppDbContext _db;
    private readonly ILogger<PedrolloPLProcessP2> _logger;
    private readonly SpreadsheetsResource.ValuesResource _googleSheetValues;

    // Configuration properties
    private string? Year { get; set; }
    private bool IsEnabled { get; set; }
    private string? GoogleSheetId { get; set; }
    private string? GoogleSheetTab { get; set; }
    private string? GoogleSheetRange { get; set; }

    // Cached data
    private Layer? _sourceLayer;
    private Layer? _processedLayer;
    private Dictionary<string, string>? _codeToRegionMap;

    public PedrolloPLProcessP2(
        AppDbContext db,
        ILogger<PedrolloPLProcessP2> logger,
        SpreadsheetsResource.ValuesResource googleSheetValues)
    {
        _db = db;
        _logger = logger;
        _googleSheetValues = googleSheetValues;
    }

    public override void Process(Layer processWorker)
    {
        try
        {
            _logger.LogInformation("{ProcessorType}: Starting process for {ProcessWorkerName} ({ProcessWorkerId})",
                ProcessorType, processWorker.Name, processWorker.Id);

            // Clear cache at start
            _sourceLayer = null;
            _processedLayer = null;
            _codeToRegionMap = null;

            LoadConfiguration(processWorker);
            ValidateConfiguration();

            if (!IsEnabled)
            {
                _logger.LogInformation("{ProcessorType}: Process disabled for {ProcessWorkerName}",
                    ProcessorType, processWorker.Name);
                return;
            }

            // Find latest B3 import layer for the configured year
            FindSourceLayer();

            // Find or create processed layer
            FindOrCreateProcessedLayer(processWorker);

            // Transform data from source to processed layer
            var transformedRecords = TransformData();

            // Save records to processed layer
            SaveRecordsToLayer(_processedLayer!, transformedRecords);

            // Export to Google Sheets if configured
            if (!string.IsNullOrEmpty(GoogleSheetId) && !string.IsNullOrEmpty(GoogleSheetTab) && !string.IsNullOrEmpty(GoogleSheetRange))
            {
                ExportToGoogleSheet();
            }
            else
            {
                _logger.LogInformation("{ProcessorType}: Google Sheet export skipped - configuration not provided",
                    ProcessorType);
            }

            _logger.LogInformation("{ProcessorType}: Successfully completed process for {ProcessWorkerName} - Processed {RecordCount} records",
                ProcessorType, processWorker.Name, transformedRecords.Count);
        }
        catch (Exception e)
        {
            _logger.LogError(e, "{ProcessorType}: Failed to process {ProcessWorkerName} ({ProcessWorkerId})",
                ProcessorType, processWorker.Name, processWorker.Id);
            throw;
        }
        finally
        {
            // Clear cache after process
            _sourceLayer = null;
            _processedLayer = null;
            _codeToRegionMap = null;
        }
    }

    private void LoadConfiguration(Layer processWorker)
    {
        if (processWorker.Records == null) return;

        Year = GetRecordValue(processWorker.Records, "Year");
        IsEnabled = GetRecordValue(processWorker.Records, "IsEnabled") == "True";
        GoogleSheetId = GetRecordValue(processWorker.Records, "GoogleSheetId");
        GoogleSheetTab = GetRecordValue(processWorker.Records, "GoogleSheetTab");
        GoogleSheetRange = GetRecordValue(processWorker.Records, "GoogleSheetRange");

        _logger.LogDebug(
            "{ProcessorType}: Configuration loaded - Year: {Year}, Enabled: {IsEnabled}, SheetId: {SheetId}, Tab: {Tab}, Range: {Range}",
            ProcessorType, Year, IsEnabled, GoogleSheetId, GoogleSheetTab, GoogleSheetRange);
    }

    private void ValidateConfiguration()
    {
        var errors = new List<string>();

        if (string.IsNullOrEmpty(Year)) errors.Add("Year is required");

        if (errors.Any())
        {
            throw new InvalidOperationException($"Configuration validation failed: {string.Join(", ", errors)}");
        }

        _logger.LogDebug("{ProcessorType}: Configuration validated successfully", ProcessorType);
    }

    private void FindSourceLayer()
    {
        _logger.LogDebug("{ProcessorType}: Searching for latest B3 import layer for year {Year}",
            ProcessorType, Year);

        // Find latest B3 import layer matching pattern: L*-I-B3-{Year}-*
        var layerNamePattern = $"-I-B3-{Year}-";

        _sourceLayer = _db.Layers
            .Where(x => x.Name != null && x.Name.Contains(layerNamePattern) && x.Type == LayerType.Import)
            .OrderByDescending(x => x.CreatedAt)
            .FirstOrDefault();

        if (_sourceLayer == null)
        {
            throw new InvalidOperationException(
                $"Source B3 import layer not found for year {Year} (pattern: *{layerNamePattern}*)");
        }

        _logger.LogInformation("{ProcessorType}: Found source layer - Id: {LayerId}, Name: {LayerName}, CreatedAt: {CreatedAt}",
            ProcessorType, _sourceLayer.Id, _sourceLayer.Name, _sourceLayer.CreatedAt);
    }

    private void FindOrCreateProcessedLayer(Layer processWorker)
    {
        _logger.LogDebug("{ProcessorType}: Looking for existing processed layer with ParentId={ParentId}",
            ProcessorType, processWorker.Id);

        // Check if processed layer already exists with ParentId = ProcessWorker.Id
        _processedLayer = _db.Layers
            .Where(x => x.ParentId == processWorker.Id && x.Type == LayerType.Processed)
            .FirstOrDefault();

        if (_processedLayer != null)
        {
            _logger.LogInformation("{ProcessorType}: Found existing processed layer - Id: {LayerId}, Name: {LayerName}",
                ProcessorType, _processedLayer.Id, _processedLayer.Name);
        }
        else
        {
            _logger.LogInformation("{ProcessorType}: No existing processed layer found, creating new one",
                ProcessorType);

            _processedLayer = CreateProcessedLayer(processWorker);
        }
    }

    private Layer CreateProcessedLayer(Layer processWorker)
    {
        var now = DateTime.UtcNow;

        var processedLayer = new Layer
        {
            Id = Guid.NewGuid(),
            Number = _db.Layers.Count() + 1,
            ParentId = processWorker.Id,
            Type = LayerType.Processed,
            IsCancelled = false,
            CreatedAt = now,
            ModifiedAt = now,
            CreatedById = Guid.Parse("f392209e-123e-4651-a5a4-0b1d6cf9ff9d"), // System user
            ModifiedById = Guid.Parse("f392209e-123e-4651-a5a4-0b1d6cf9ff9d") // System user
        };

        processedLayer.Name = $"L{processedLayer.Number}-P-P2-{Year}-{now:yyyyMMddHHmm}";

        _logger.LogDebug("{ProcessorType}: Creating processed layer '{LayerName}' (Number: {Number})",
            ProcessorType, processedLayer.Name, processedLayer.Number);

        _db.Layers.Add(processedLayer);
        _db.SaveChanges();

        _logger.LogInformation("{ProcessorType}: Created processed layer '{LayerName}' with Id: {LayerId}",
            ProcessorType, processedLayer.Name, processedLayer.Id);

        return processedLayer;
    }

    private List<Record> TransformData()
    {
        if (_sourceLayer == null)
        {
            throw new InvalidOperationException("Source layer not loaded. Call FindSourceLayer first.");
        }

        _logger.LogDebug("{ProcessorType}: Loading records from source layer {LayerId}",
            ProcessorType, _sourceLayer.Id);

        // Load all records from source layer
        var sourceRecords = _db.Records
            .Where(x => x.LayerId == _sourceLayer.Id && !x.IsDeleted)
            .ToList();

        _logger.LogInformation("{ProcessorType}: Loaded {RecordCount} records from source layer",
            ProcessorType, sourceRecords.Count);

        // Group records by first 2 digits of Code (region code)
        var groupedByRegion = sourceRecords
            .Where(x => !string.IsNullOrEmpty(x.Code) && x.Code.Length >= 2)
            .GroupBy(x => x.Code!.Substring(0, 2))
            .ToList();

        _logger.LogDebug("{ProcessorType}: Grouped into {GroupCount} regions",
            ProcessorType, groupedByRegion.Count);

        var transformedRecords = new List<Record>();
        var now = DateTime.UtcNow;

        foreach (var regionGroup in groupedByRegion)
        {
            var regionCode = regionGroup.Key;

            // Create array for 12 months (initialize with 0)
            var monthValues = new double?[12];
            for (int i = 0; i < 12; i++)
            {
                monthValues[i] = 0;
            }

            // Fill in values for each month
            foreach (var sourceRecord in regionGroup)
            {
                if (sourceRecord.Code!.Length >= 4)
                {
                    // Extract month from last 2 digits of code (e.g., "0105" -> month 5)
                    var monthStr = sourceRecord.Code.Substring(2, 2);
                    if (int.TryParse(monthStr, out var month) && month >= 1 && month <= 12)
                    {
                        var monthIndex = month - 1; // Convert to 0-based index
                        monthValues[monthIndex] = sourceRecord.Value1 ?? 0;

                        _logger.LogDebug("{ProcessorType}: Region {RegionCode}, Month {Month}: Value = {Value}",
                            ProcessorType, regionCode, month, sourceRecord.Value1);
                    }
                }
            }

            // Create transformed record with Code = region code and Value1-12 = monthly values
            var record = new Record
            {
                Id = Guid.NewGuid(),
                Code = regionCode,
                Value1 = monthValues[0],
                Value2 = monthValues[1],
                Value3 = monthValues[2],
                Value4 = monthValues[3],
                Value5 = monthValues[4],
                Value6 = monthValues[5],
                Value7 = monthValues[6],
                Value8 = monthValues[7],
                Value9 = monthValues[8],
                Value10 = monthValues[9],
                Value11 = monthValues[10],
                Value12 = monthValues[11],
                CreatedAt = now,
                ModifiedAt = now
            };

            transformedRecords.Add(record);

            _logger.LogDebug("{ProcessorType}: Transformed region '{RegionCode}' - Values: [{Values}]",
                ProcessorType, regionCode,
                string.Join(", ", monthValues.Select(v => v?.ToString() ?? "0")));
        }

        _logger.LogInformation("{ProcessorType}: Successfully transformed {RecordCount} records from {SourceCount} source records",
            ProcessorType, transformedRecords.Count, sourceRecords.Count);

        return transformedRecords;
    }

    private void SaveRecordsToLayer(Layer processedLayer, List<Record> records)
    {
        _logger.LogDebug("{ProcessorType}: Saving {RecordCount} records to layer {LayerId}",
            ProcessorType, records.Count, processedLayer.Id);

        // Delete any existing records for this layer
        var toDelete = _db.Records.Where(x => x.LayerId == processedLayer.Id).ToList();
        if (toDelete.Count > 0)
        {
            _logger.LogInformation("{ProcessorType}: Found {ExistingCount} existing records for layer {LayerId}, removing them",
                ProcessorType, toDelete.Count, processedLayer.Id);
            _db.Records.RemoveRange(toDelete);
        }

        // Set all required properties for each record
        foreach (var record in records)
        {
            record.LayerId = processedLayer.Id;
            record.CreatedById = Guid.Parse("f392209e-123e-4651-a5a4-0b1d6cf9ff9d"); // System user
            record.ModifiedById = Guid.Parse("f392209e-123e-4651-a5a4-0b1d6cf9ff9d"); // System user
            _db.Records.Add(record);
        }

        _db.SaveChanges();

        _logger.LogInformation("{ProcessorType}: Successfully saved {RecordCount} records to layer '{LayerName}'",
            ProcessorType, records.Count, processedLayer.Name);
    }

    private void ExportToGoogleSheet()
    {
        try
        {
            _logger.LogInformation("{ProcessorType}: Starting Google Sheet export to {SheetId}, Tab: {Tab}, Range: {Range}",
                ProcessorType, GoogleSheetId, GoogleSheetTab, GoogleSheetRange);

            // Load dictionary for code to region name translation
            LoadCodeToRegionDictionary();

            // Download current sheet data
            var sheetData = DownloadSheetData();

            // Update sheet data with processed layer values
            var updatedData = UpdateSheetDataWithProcessedValues(sheetData);

            // Upload updated data back to sheet
            UploadSheetData(updatedData);

            _logger.LogInformation("{ProcessorType}: Successfully exported data to Google Sheet",
                ProcessorType);
        }
        catch (Exception e)
        {
            _logger.LogError(e, "{ProcessorType}: Failed to export to Google Sheet",
                ProcessorType);
            throw;
        }
    }

    private void LoadCodeToRegionDictionary()
    {
        const string dictionaryLayerName = "L1-D-P2-CODES";

        _logger.LogDebug("{ProcessorType}: Loading code to region mapping from dictionary layer '{DictionaryLayerName}'",
            ProcessorType, dictionaryLayerName);

        var dictionaryLayer = _db.Layers
            .Where(x => x.Name == dictionaryLayerName && x.Type == LayerType.Dictionary)
            .FirstOrDefault();

        if (dictionaryLayer == null)
        {
            throw new InvalidOperationException($"Dictionary layer '{dictionaryLayerName}' not found");
        }

        // Load records for the dictionary layer
        var records = _db.Records
            .Where(x => x.LayerId == dictionaryLayer.Id)
            .ToList();

        // Build mapping: Code -> Desc1 (region name)
        _codeToRegionMap = records.ToDictionary(
            r => r.Code ?? string.Empty,
            r => r.Desc1 ?? string.Empty,
            StringComparer.OrdinalIgnoreCase);

        _logger.LogInformation("{ProcessorType}: Loaded {MappingCount} code to region mappings",
            ProcessorType, _codeToRegionMap.Count);
    }

    private IList<IList<object>> DownloadSheetData()
    {
        _logger.LogDebug("{ProcessorType}: Downloading sheet data from range {Range}",
            ProcessorType, $"{GoogleSheetTab}!{GoogleSheetRange}");

        var range = $"{GoogleSheetTab}!{GoogleSheetRange}";
        ValueRange? response;

        try
        {
            response = _googleSheetValues.Get(GoogleSheetId, range).Execute();
        }
        catch (Exception e)
        {
            _logger.LogError(e, "{ProcessorType}: Failed to download sheet data from {Range}",
                ProcessorType, range);
            throw new InvalidOperationException($"Failed to download sheet data from {range}", e);
        }

        if (response?.Values == null || response.Values.Count == 0)
        {
            throw new InvalidOperationException($"No data found in sheet range {range}");
        }

        _logger.LogInformation("{ProcessorType}: Downloaded {RowCount} rows from Google Sheet",
            ProcessorType, response.Values.Count);

        return response.Values;
    }

    private IList<IList<object>> UpdateSheetDataWithProcessedValues(IList<IList<object>> sheetData)
    {
        if (_processedLayer == null)
        {
            throw new InvalidOperationException("Processed layer not loaded");
        }

        if (_codeToRegionMap == null)
        {
            throw new InvalidOperationException("Code to region mapping not loaded");
        }

        _logger.LogDebug("{ProcessorType}: Updating sheet data with processed values from layer {LayerId}",
            ProcessorType, _processedLayer.Id);

        // Load all records from processed layer
        var processedRecords = _db.Records
            .Where(x => x.LayerId == _processedLayer.Id && !x.IsDeleted)
            .ToList();

        _logger.LogDebug("{ProcessorType}: Loaded {RecordCount} records from processed layer",
            ProcessorType, processedRecords.Count);

        var updatedRowCount = 0;

        // Iterate through sheet data and update matching rows
        foreach (var row in sheetData)
        {
            if (row.Count == 0) continue;

            // First column (index 0) contains the region name (Kontrola column)
            var regionName = row[0]?.ToString()?.Trim();
            if (string.IsNullOrEmpty(regionName)) continue;

            // Find the code for this region name
|
||||
var regionCode = _codeToRegionMap
|
||||
.FirstOrDefault(x => x.Value.Equals(regionName, StringComparison.OrdinalIgnoreCase))
|
||||
.Key;
|
||||
|
||||
if (string.IsNullOrEmpty(regionCode))
|
||||
{
|
||||
_logger.LogWarning("{ProcessorType}: No code found for region '{RegionName}' in dictionary - skipping",
|
||||
ProcessorType, regionName);
|
||||
continue;
|
||||
}
|
||||
|
||||
// Find the processed record for this code
|
||||
var processedRecord = processedRecords.FirstOrDefault(x => x.Code == regionCode);
|
||||
if (processedRecord == null)
|
||||
{
|
||||
_logger.LogWarning("{ProcessorType}: No processed record found for code '{RegionCode}' (region: '{RegionName}') - skipping",
|
||||
ProcessorType, regionCode, regionName);
|
||||
continue;
|
||||
}
|
||||
|
||||
// Update columns 1-12 (monthly values) in the row
|
||||
// Column 0 is Kontrola (region name), columns 1-12 are monthly values
|
||||
// Ensure row has enough columns (13 total: 1 for region + 12 for months)
|
||||
while (row.Count < 13)
|
||||
{
|
||||
row.Add("");
|
||||
}
|
||||
|
||||
// Update monthly values (Value1 through Value12)
|
||||
row[1] = processedRecord.Value1 ?? 0;
|
||||
row[2] = processedRecord.Value2 ?? 0;
|
||||
row[3] = processedRecord.Value3 ?? 0;
|
||||
row[4] = processedRecord.Value4 ?? 0;
|
||||
row[5] = processedRecord.Value5 ?? 0;
|
||||
row[6] = processedRecord.Value6 ?? 0;
|
||||
row[7] = processedRecord.Value7 ?? 0;
|
||||
row[8] = processedRecord.Value8 ?? 0;
|
||||
row[9] = processedRecord.Value9 ?? 0;
|
||||
row[10] = processedRecord.Value10 ?? 0;
|
||||
row[11] = processedRecord.Value11 ?? 0;
|
||||
row[12] = processedRecord.Value12 ?? 0;
|
||||
|
||||
updatedRowCount++;
|
||||
|
||||
_logger.LogDebug("{ProcessorType}: Updated row for region '{RegionName}' (code: {RegionCode}) with 12 monthly values",
|
||||
ProcessorType, regionName, regionCode);
|
||||
}
|
||||
|
||||
_logger.LogInformation("{ProcessorType}: Updated {UpdatedRowCount} rows with processed data",
|
||||
ProcessorType, updatedRowCount);
|
||||
|
||||
return sheetData;
|
||||
}
|
||||
|
||||
private void UploadSheetData(IList<IList<object>> data)
|
||||
{
|
||||
_logger.LogDebug("{ProcessorType}: Uploading {RowCount} rows to Google Sheet range {Range}",
|
||||
ProcessorType, data.Count, $"{GoogleSheetTab}!{GoogleSheetRange}");
|
||||
|
||||
var range = $"{GoogleSheetTab}!{GoogleSheetRange}";
|
||||
var valueRange = new ValueRange { Values = data };
|
||||
|
||||
try
|
||||
{
|
||||
var updateRequest = _googleSheetValues.Update(valueRange, GoogleSheetId, range);
|
||||
updateRequest.ValueInputOption = SpreadsheetsResource.ValuesResource.UpdateRequest.ValueInputOptionEnum.USERENTERED;
|
||||
var response = updateRequest.Execute();
|
||||
|
||||
_logger.LogInformation("{ProcessorType}: Successfully uploaded data to Google Sheet - Updated {UpdatedCells} cells",
|
||||
ProcessorType, response.UpdatedCells);
|
||||
}
|
||||
catch (Exception e)
|
||||
{
|
||||
_logger.LogError(e, "{ProcessorType}: Failed to upload data to Google Sheet range {Range}",
|
||||
ProcessorType, range);
|
||||
throw new InvalidOperationException($"Failed to upload data to Google Sheet range {range}", e);
|
||||
}
|
||||
}
|
||||
|
||||
private string? GetRecordValue(ICollection<Record> records, string code)
|
||||
{
|
||||
return records.FirstOrDefault(x => x.Code == code)?.Desc1;
|
||||
}
|
||||
}
|
||||
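One readability note on `UpdateSheetDataWithProcessedValues` above: the region-to-code resolution scans `_codeToRegionMap` with `FirstOrDefault` for every sheet row, which is O(rows × codes). A minimal sketch of inverting the map once up front instead, assuming region names are unique; the sample entries below are illustrative, not real dictionary data:

```csharp
using System;
using System.Collections.Generic;

// Code -> region map, shaped like the one built by LoadCodeToRegionDictionary (sample data only).
var codeToRegion = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
{
    ["PL-MAZ"] = "Mazowieckie",
    ["PL-POM"] = "Pomorskie",
};

// Invert once: region name -> code, case-insensitive like the original per-row lookup.
var regionToCode = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
foreach (var (code, region) in codeToRegion)
{
    regionToCode[region] = code;
}

// Each sheet row now resolves in O(1) instead of scanning the whole dictionary.
Console.WriteLine(regionToCode["mazowieckie"]); // PL-MAZ
```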
@@ -37,15 +37,36 @@
            @_errorMessage
        </MudAlert>
    }

    @if (_sessionExpired)
    {
        <MudAlert Severity="Severity.Warning" Class="mt-4" Dense="true">
            Your session has expired. Please sign in again.
        </MudAlert>
    }
    </MudCardContent>
</MudCard>

@code {
    private bool _isLoading = false;
    private string _errorMessage = string.Empty;
    private bool _sessionExpired = false;
    private static LoginCard? _instance;
    private bool _isInitialized = false;

    protected override void OnInitialized()
    {
        // Check if sessionExpired query parameter is present
        var uri = new Uri(NavigationManager.Uri);
        var query = System.Web.HttpUtility.ParseQueryString(uri.Query);
        _sessionExpired = query["sessionExpired"] == "true";

        if (_sessionExpired)
        {
            Console.WriteLine("⚠️ Session expired - user redirected to login");
        }
    }

    protected override async Task OnAfterRenderAsync(bool firstRender)
    {
        if (firstRender)
@@ -1,43 +1,44 @@
@using MudBlazor
@using DiunaBI.UI.Shared.Services
@inject AppConfig AppConfig
@inject EntityChangeHubService HubService
@inject AuthService AuthService
@inherits LayoutComponentBase
@implements IDisposable

<AuthGuard>
    <MudThemeProvider Theme="_theme"/>
    <MudPopoverProvider/>
    <MudDialogProvider/>
    <MudSnackbarProvider/>
    <MudThemeProvider Theme="_theme" />
    <MudPopoverProvider />
    <MudDialogProvider />
    <MudSnackbarProvider />

    <MudLayout>
        <MudBreakpointProvider OnBreakpointChanged="OnBreakpointChanged"></MudBreakpointProvider>
        <MudAppBar Elevation="0">
            <MudIconButton
                Icon="@Icons.Material.Filled.Menu"
                Color="Color.Inherit"
                Edge="Edge.Start"
                OnClick="ToggleDrawer"
                Class="mud-hidden-md-up"/>
            <MudSpacer/>
            <MudIconButton Icon="@Icons.Material.Filled.Menu" Color="Color.Inherit" Edge="Edge.Start"
                           OnClick="ToggleDrawer" Class="mud-hidden-md-up" />
            <MudSpacer />
            <MudText Typo="Typo.h6">@AppConfig.AppName</MudText>
        </MudAppBar>

        <MudDrawer @bind-Open="_drawerOpen"
                   Anchor="Anchor.Start"
                   Variant="@_drawerVariant"
                   Elevation="1"
                   ClipMode="DrawerClipMode.Always"
                   Class="mud-width-250">
        <MudDrawer @bind-Open="_drawerOpen" Anchor="Anchor.Start" Variant="@_drawerVariant" Elevation="1"
                   ClipMode="DrawerClipMode.Always" Class="mud-width-250">
            <div class="nav-logo" style="text-align: center; padding: 20px;">
                <a href="https://www.diunabi.com" target="_blank">
                    <img src="_content/DiunaBI.UI.Shared/images/logo.png" alt="DiunaBI" style="max-width: 180px; height: auto;" />
                    <img src="_content/DiunaBI.UI.Shared/images/logo.png" alt="DiunaBI"
                         style="max-width: 180px; height: auto;" />
                </a>
            </div>
            <MudNavMenu>
                <MudNavLink Href="/dashboard" Icon="@Icons.Material.Filled.Dashboard">Dashboard</MudNavLink>
                <MudNavLink Href="/layers" Icon="@Icons.Material.Filled.Inventory">Layers</MudNavLink>
                <MudNavLink Href="/datainbox" Icon="@Icons.Material.Filled.Inbox">Data Inbox</MudNavLink>
                <MudNavLink Href="/jobs" Icon="@Icons.Material.Filled.WorkHistory">Jobs</MudNavLink>
            </MudNavMenu>
            <div class="nav-logo" style="text-align: center; padding: 20px;">
                <img src="_content/DiunaBI.UI.Shared/images/clients/@AppConfig.ClientLogo" alt="DiunaBI"
                     style="max-width: 180px; height: auto;" />
            </div>
        </MudDrawer>

        <MudMainContent>
@@ -53,6 +54,32 @@
    private bool _drawerOpen = true;
    private DrawerVariant _drawerVariant = DrawerVariant.Persistent;

    protected override void OnInitialized()
    {
        // Subscribe to authentication state changes
        AuthService.AuthenticationStateChanged += OnAuthenticationStateChanged;

        // If already authenticated (e.g., from restored session), initialize SignalR
        if (AuthService.IsAuthenticated)
        {
            _ = HubService.InitializeAsync();
        }
    }

    private async void OnAuthenticationStateChanged(bool isAuthenticated)
    {
        if (isAuthenticated)
        {
            Console.WriteLine("🔐 MainLayout: User authenticated, initializing SignalR...");
            await HubService.InitializeAsync();
        }
    }

    public void Dispose()
    {
        AuthService.AuthenticationStateChanged -= OnAuthenticationStateChanged;
    }

    private MudTheme _theme = new MudTheme()
    {
        PaletteLight = new PaletteLight()
@@ -17,6 +17,7 @@
    <PackageReference Include="Microsoft.AspNetCore.WebUtilities" Version="10.0.0" />
    <PackageReference Include="Microsoft.Extensions.Http" Version="10.0.0" />
    <PackageReference Include="Microsoft.Extensions.Configuration.Abstractions" Version="10.0.0" />
    <PackageReference Include="Microsoft.AspNetCore.SignalR.Client" Version="10.0.0" />
  </ItemGroup>

  <ItemGroup>
@@ -1,4 +1,6 @@
using Microsoft.AspNetCore.Components;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using DiunaBI.UI.Shared.Services;
using DiunaBI.UI.Shared.Handlers;

@@ -15,14 +17,16 @@ public static class ServiceCollectionExtensions
        Console.WriteLine($"🔧 Configuring HttpClient with BaseAddress: {baseUri}");

        services.AddTransient<HttpLoggingHandler>();
        services.AddTransient<UnauthorizedResponseHandler>();

        // Configure named HttpClient with logging handler
        // Configure named HttpClient with logging and 401 handling
        // Note: Authentication is handled by AuthService setting DefaultRequestHeaders.Authorization
        services.AddHttpClient("DiunaBI", client =>
        {
            client.BaseAddress = new Uri(baseUri);
            Console.WriteLine($"✅ HttpClient BaseAddress set to: {client.BaseAddress}");
        })
        .AddHttpMessageHandler<UnauthorizedResponseHandler>()
        .AddHttpMessageHandler<HttpLoggingHandler>();

        // Register a scoped HttpClient factory that services will use
@@ -35,14 +39,25 @@ public static class ServiceCollectionExtensions
        });

        // Services
        services.AddScoped<TokenProvider>();
        services.AddScoped<AuthService>();
        services.AddScoped<LayerService>();
        services.AddScoped<DataInboxService>();
        services.AddScoped<JobService>();
        services.AddScoped<DateTimeHelper>();

        // Filter state services (scoped to maintain state during user session)
        services.AddScoped<LayerFilterStateService>();
        services.AddScoped<DataInboxFilterStateService>();

        // SignalR Hub Service (scoped per user session for authenticated connections)
        services.AddScoped(sp =>
        {
            var logger = sp.GetRequiredService<ILogger<EntityChangeHubService>>();
            var tokenProvider = sp.GetRequiredService<TokenProvider>();
            return new EntityChangeHubService(apiBaseUrl, sp, logger, tokenProvider);
        });

        return services;
    }
}
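A note on the `.AddHttpMessageHandler<UnauthorizedResponseHandler>()` / `.AddHttpMessageHandler<HttpLoggingHandler>()` calls above: `IHttpClientFactory` runs delegating handlers in registration order, outermost first, so `UnauthorizedResponseHandler` wraps `HttpLoggingHandler` and observes each response after logging has run. A minimal self-contained analogy using plain delegates (the `auth`/`log` tags are stand-ins, not types from this repo):

```csharp
using System;
using System.Collections.Generic;

var order = new List<string>();

// Innermost "handler": stands in for the actual HTTP transport.
Func<string, string> terminal = req => "200 OK";

// Wrap an inner handler with a tagged before/after step, like a DelegatingHandler.
Func<Func<string, string>, string, Func<string, string>> wrap = (inner, tag) => req =>
{
    order.Add($"{tag}:before");
    var resp = inner(req);
    order.Add($"{tag}:after");
    return resp;
};

// Registration order: auth first, then log  =>  auth is outermost.
var pipeline = wrap(wrap(terminal, "log"), "auth");
pipeline("GET /");

Console.WriteLine(string.Join(" -> ", order));
// auth:before -> log:before -> log:after -> auth:after
```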
41
DiunaBI.UI.Shared/Handlers/UnauthorizedResponseHandler.cs
Normal file
@@ -0,0 +1,41 @@
using Microsoft.AspNetCore.Components;
using Microsoft.Extensions.DependencyInjection;
using DiunaBI.UI.Shared.Services;

namespace DiunaBI.UI.Shared.Handlers;

public class UnauthorizedResponseHandler : DelegatingHandler
{
    private readonly IServiceProvider _serviceProvider;

    public UnauthorizedResponseHandler(IServiceProvider serviceProvider)
    {
        _serviceProvider = serviceProvider;
    }

    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request,
        CancellationToken cancellationToken)
    {
        var response = await base.SendAsync(request, cancellationToken);

        // Check if response is 401 Unauthorized
        if (response.StatusCode == System.Net.HttpStatusCode.Unauthorized)
        {
            Console.WriteLine("⚠️ 401 Unauthorized response detected - clearing credentials and redirecting to login");

            // Create a scope to get scoped services
            using var scope = _serviceProvider.CreateScope();
            var authService = scope.ServiceProvider.GetRequiredService<AuthService>();
            var navigationManager = scope.ServiceProvider.GetRequiredService<NavigationManager>();

            // Clear authentication
            await authService.ClearAuthenticationAsync();

            // Navigate to login page with session expired message
            navigationManager.NavigateTo("/login?sessionExpired=true", forceLoad: true);
        }

        return response;
    }
}
@@ -2,8 +2,6 @@
@using DiunaBI.UI.Shared.Services
@using DiunaBI.Application.DTOModels
@using MudBlazor
@inject DataInboxService DataInboxService
@inject NavigationManager NavigationManager

<MudCard>
    <MudCardHeader>
@@ -1,15 +1,22 @@
using DiunaBI.Application.DTOModels;
using DiunaBI.UI.Shared.Services;
using Microsoft.AspNetCore.Components;
using MudBlazor;
using System.Text;

namespace DiunaBI.UI.Shared.Pages;
namespace DiunaBI.UI.Shared.Pages.DataInbox;

public partial class DataInboxDetailPage : ComponentBase
public partial class Details : ComponentBase
{
    [Parameter]
    public Guid Id { get; set; }

    [Inject]
    private DataInboxService DataInboxService { get; set; } = null!;

    [Inject]
    private NavigationManager NavigationManager { get; set; } = null!;

    [Inject]
    private ISnackbar Snackbar { get; set; } = null!;

@@ -1,4 +1,11 @@
@page "/datainbox"
@using MudBlazor.Internal
@using DiunaBI.Application.DTOModels
@implements IDisposable

<PageTitle>Data Inbox</PageTitle>

<MudContainer MaxWidth="MaxWidth.ExtraExtraLarge">
    <MudExpansionPanels Class="mb-4">
        <MudExpansionPanel Icon="@Icons.Material.Filled.FilterList"
                           Text="Filters"
@@ -46,7 +53,7 @@
        <RowTemplate Context="row">
            <MudTd DataLabel="Name"><div @oncontextmenu="@(async (e) => await OnRowRightClick(e, row))" @oncontextmenu:preventDefault="true">@row.Name</div></MudTd>
            <MudTd DataLabel="Source"><div @oncontextmenu="@(async (e) => await OnRowRightClick(e, row))" @oncontextmenu:preventDefault="true">@row.Source</div></MudTd>
            <MudTd DataLabel="Created At"><div @oncontextmenu="@(async (e) => await OnRowRightClick(e, row))" @oncontextmenu:preventDefault="true">@row.CreatedAt.ToString("yyyy-MM-dd HH:mm:ss")</div></MudTd>
            <MudTd DataLabel="Created At"><div @oncontextmenu="@(async (e) => await OnRowRightClick(e, row))" @oncontextmenu:preventDefault="true">@DateTimeHelper.FormatDateTime(row.CreatedAt)</div></MudTd>
        </RowTemplate>
        <NoRecordsContent>
            <MudText>No data inbox items to display</MudText>
@@ -76,3 +83,4 @@
        </MudItem>
    </MudGrid>
}
</MudContainer>
@@ -6,15 +6,17 @@ using DiunaBI.Application.DTOModels.Common;
using MudBlazor;
using Microsoft.JSInterop;

namespace DiunaBI.UI.Shared.Components;
namespace DiunaBI.UI.Shared.Pages.DataInbox;

public partial class DataInboxListComponent : ComponentBase
public partial class Index : ComponentBase, IDisposable
{
    [Inject] private DataInboxService DataInboxService { get; set; } = default!;
    [Inject] private EntityChangeHubService HubService { get; set; } = default!;
    [Inject] private ISnackbar Snackbar { get; set; } = default!;
    [Inject] private NavigationManager NavigationManager { get; set; } = default!;
    [Inject] private DataInboxFilterStateService FilterStateService { get; set; } = default!;
    [Inject] private IJSRuntime JSRuntime { get; set; } = default!;
    [Inject] private DateTimeHelper DateTimeHelper { get; set; } = default!;

    private PagedResult<DataInboxDto> dataInbox = new();
@@ -23,8 +25,25 @@ public partial class DataInboxListComponent : ComponentBase

    protected override async Task OnInitializedAsync()
    {
        await DateTimeHelper.InitializeAsync();
        filterRequest = FilterStateService.FilterRequest;
        await LoadDataInbox();

        // Subscribe to SignalR entity changes
        HubService.EntityChanged += OnEntityChanged;
    }

    private async void OnEntityChanged(string module, string id, string operation)
    {
        // Only react if it's a DataInbox change
        if (module.Equals("DataInbox", StringComparison.OrdinalIgnoreCase))
        {
            await InvokeAsync(async () =>
            {
                await LoadDataInbox();
                StateHasChanged();
            });
        }
    }

    private async Task LoadDataInbox()
@@ -75,4 +94,9 @@ public partial class DataInboxListComponent : ComponentBase
        var url = NavigationManager.ToAbsoluteUri($"/datainbox/{dataInboxItem.Id}").ToString();
        await JSRuntime.InvokeVoidAsync("open", url, "_blank");
    }

    public void Dispose()
    {
        HubService.EntityChanged -= OnEntityChanged;
    }
}
@@ -1,8 +0,0 @@
@page "/datainbox"
@using DiunaBI.UI.Shared.Components

<PageTitle>Data Inbox</PageTitle>

<MudContainer MaxWidth="MaxWidth.ExtraExtraLarge">
    <DataInboxListComponent />
</MudContainer>
272
DiunaBI.UI.Shared/Pages/Jobs/Details.razor
Normal file
@@ -0,0 +1,272 @@
@page "/jobs/{id:guid}"
@using DiunaBI.UI.Shared.Services
@using DiunaBI.Domain.Entities
@using MudBlazor
@inject JobService JobService
@inject EntityChangeHubService HubService
@inject NavigationManager NavigationManager
@inject ISnackbar Snackbar
@inject DateTimeHelper DateTimeHelper
@implements IDisposable

<MudCard>
    <MudCardHeader>
        <CardHeaderContent>
            <MudText Typo="Typo.h5">Job Details</MudText>
        </CardHeaderContent>
        <CardHeaderActions>
            @if (job != null && job.Status == JobStatus.Failed)
            {
                <MudButton Variant="Variant.Filled"
                           Color="Color.Warning"
                           OnClick="RetryJob"
                           StartIcon="@Icons.Material.Filled.Refresh">
                    Retry
                </MudButton>
            }
            @if (job != null && (job.Status == JobStatus.Pending || job.Status == JobStatus.Retrying))
            {
                <MudButton Variant="Variant.Filled"
                           Color="Color.Error"
                           OnClick="CancelJob"
                           StartIcon="@Icons.Material.Filled.Cancel">
                    Cancel
                </MudButton>
            }
            <MudButton Variant="Variant.Text"
                       OnClick="GoBack"
                       StartIcon="@Icons.Material.Filled.ArrowBack">
                Back to List
            </MudButton>
        </CardHeaderActions>
    </MudCardHeader>
    <MudCardContent>
        @if (isLoading)
        {
            <MudProgressLinear Color="Color.Primary" Indeterminate="true" />
        }
        else if (job == null)
        {
            <MudAlert Severity="Severity.Error">Job not found</MudAlert>
        }
        else
        {
            <MudGrid>
                <MudItem xs="12" md="6">
                    <MudTextField Value="@job.LayerName"
                                  Label="Layer Name"
                                  Variant="Variant.Outlined"
                                  ReadOnly="true"
                                  FullWidth="true"/>
                </MudItem>
                <MudItem xs="12" md="6">
                    <MudTextField Value="@job.PluginName"
                                  Label="Plugin Name"
                                  Variant="Variant.Outlined"
                                  ReadOnly="true"
                                  FullWidth="true"/>
                </MudItem>

                <MudItem xs="12" md="4">
                    <MudTextField Value="@job.JobType.ToString()"
                                  Label="Job Type"
                                  Variant="Variant.Outlined"
                                  ReadOnly="true"
                                  FullWidth="true"/>
                </MudItem>
                <MudItem xs="12" md="4">
                    <MudTextField Value="@job.Status.ToString()"
                                  Label="Status"
                                  Variant="Variant.Outlined"
                                  ReadOnly="true"
                                  FullWidth="true"
                                  Adornment="Adornment.Start"
                                  AdornmentIcon="@GetStatusIcon(job.Status)"
                                  AdornmentColor="@GetStatusColor(job.Status)"/>
                </MudItem>
                <MudItem xs="12" md="4">
                    <MudTextField Value="@job.Priority.ToString()"
                                  Label="Priority"
                                  Variant="Variant.Outlined"
                                  ReadOnly="true"
                                  FullWidth="true"/>
                </MudItem>

                <MudItem xs="12" md="6">
                    <MudTextField Value="@DateTimeHelper.FormatDateTime(job.CreatedAt)"
                                  Label="Created At"
                                  Variant="Variant.Outlined"
                                  ReadOnly="true"
                                  FullWidth="true"/>
                </MudItem>
                <MudItem xs="12" md="6">
                    <MudTextField Value="@DateTimeHelper.FormatDateTime(job.LastAttemptAt)"
                                  Label="Last Attempt At"
                                  Variant="Variant.Outlined"
                                  ReadOnly="true"
                                  FullWidth="true"/>
                </MudItem>

                <MudItem xs="12" md="6">
                    <MudTextField Value="@DateTimeHelper.FormatDateTime(job.CompletedAt)"
                                  Label="Completed At"
                                  Variant="Variant.Outlined"
                                  ReadOnly="true"
                                  FullWidth="true"/>
                </MudItem>
                <MudItem xs="12" md="6">
                    <MudTextField Value="@($"{job.RetryCount} / {job.MaxRetries}")"
                                  Label="Retry Count / Max Retries"
                                  Variant="Variant.Outlined"
                                  ReadOnly="true"
                                  FullWidth="true"/>
                </MudItem>

                @if (!string.IsNullOrEmpty(job.LastError))
                {
                    <MudItem xs="12">
                        <MudTextField Value="@job.LastError"
                                      Label="Last Error"
                                      Variant="Variant.Outlined"
                                      ReadOnly="true"
                                      FullWidth="true"
                                      Lines="5"
                                      AdornmentIcon="@Icons.Material.Filled.Error"
                                      AdornmentColor="Color.Error"/>
                    </MudItem>
                }

                <MudItem xs="12">
                    <MudDivider Class="my-4"/>
                </MudItem>

                <MudItem xs="12">
                    <MudButton Variant="Variant.Outlined"
                               Color="Color.Primary"
                               OnClick="@(() => NavigationManager.NavigateTo($"/layers/{job.LayerId}"))"
                               StartIcon="@Icons.Material.Filled.Layers">
                        View Layer Details
                    </MudButton>
                </MudItem>
            </MudGrid>
        }
    </MudCardContent>
</MudCard>

@code {
    [Parameter]
    public Guid Id { get; set; }

    private QueueJob? job;
    private bool isLoading = true;

    protected override async Task OnInitializedAsync()
    {
        await DateTimeHelper.InitializeAsync();
        await LoadJob();

        // Subscribe to SignalR entity changes
        HubService.EntityChanged += OnEntityChanged;
    }

    private async void OnEntityChanged(string module, string id, string operation)
    {
        // Only react if it's a QueueJobs change for this specific job
        if (module.Equals("QueueJobs", StringComparison.OrdinalIgnoreCase) &&
            Guid.TryParse(id, out var jobId) && jobId == Id)
        {
            Console.WriteLine($"📨 Job {jobId} changed, refreshing detail page");
            await InvokeAsync(async () =>
            {
                await LoadJob();
                StateHasChanged();
            });
        }
    }

    private async Task LoadJob()
    {
        isLoading = true;
        try
        {
            job = await JobService.GetJobByIdAsync(Id);
        }
        catch (Exception ex)
        {
            Console.WriteLine($"Loading job failed: {ex.Message}");
            Snackbar.Add("Failed to load job", Severity.Error);
        }
        finally
        {
            isLoading = false;
        }
    }

    private async Task RetryJob()
    {
        if (job == null) return;

        var success = await JobService.RetryJobAsync(job.Id);
        if (success)
        {
            Snackbar.Add("Job reset to Pending status", Severity.Success);
            await LoadJob();
        }
        else
        {
            Snackbar.Add("Failed to retry job", Severity.Error);
        }
    }

    private async Task CancelJob()
    {
        if (job == null) return;

        var success = await JobService.CancelJobAsync(job.Id);
        if (success)
        {
            Snackbar.Add("Job cancelled", Severity.Success);
            await LoadJob();
        }
        else
        {
            Snackbar.Add("Failed to cancel job", Severity.Error);
        }
    }

    private void GoBack()
    {
        NavigationManager.NavigateTo("/jobs");
    }

    private Color GetStatusColor(JobStatus status)
    {
        return status switch
        {
            JobStatus.Pending => Color.Default,
            JobStatus.Running => Color.Info,
            JobStatus.Completed => Color.Success,
            JobStatus.Failed => Color.Error,
            JobStatus.Retrying => Color.Warning,
            _ => Color.Default
        };
    }

    private string GetStatusIcon(JobStatus status)
    {
        return status switch
        {
            JobStatus.Pending => Icons.Material.Filled.HourglassEmpty,
            JobStatus.Running => Icons.Material.Filled.PlayArrow,
            JobStatus.Completed => Icons.Material.Filled.CheckCircle,
            JobStatus.Failed => Icons.Material.Filled.Error,
            JobStatus.Retrying => Icons.Material.Filled.Refresh,
            _ => Icons.Material.Filled.Help
        };
    }

    public void Dispose()
    {
        HubService.EntityChanged -= OnEntityChanged;
    }
}
172
DiunaBI.UI.Shared/Pages/Jobs/Index.razor
Normal file
@@ -0,0 +1,172 @@
|
||||
@page "/jobs"
|
||||
@using MudBlazor.Internal
|
||||
@using DiunaBI.Domain.Entities
|
||||
@implements IDisposable
|
||||
|
||||
<PageTitle>Jobs</PageTitle>
|
||||
|
||||
<MudContainer MaxWidth="MaxWidth.ExtraExtraLarge">
|
||||
<MudExpansionPanels Class="mb-4">
|
||||
<MudExpansionPanel Icon="@Icons.Material.Filled.FilterList"
|
||||
Text="Filters"
|
||||
Expanded="true">
|
||||
<MudGrid AlignItems="Center">
|
||||
<MudItem xs="12" sm="6" md="3">
|
||||
<MudSelect T="JobStatus"
|
||||
SelectedValues="selectedStatuses"
|
||||
Label="Status"
|
||||
Placeholder="All statuses"
|
||||
MultiSelection="true"
|
||||
Clearable="true"
|
||||
SelectedValuesChanged="OnStatusFilterChanged"
|
||||
OnClearButtonClick="OnStatusClear">
|
||||
@foreach (JobStatus status in Enum.GetValues(typeof(JobStatus)))
|
||||
{
|
||||
<MudSelectItem T="JobStatus" Value="@status">@status.ToString()</MudSelectItem>
|
||||
}
|
||||
</MudSelect>
|
||||
</MudItem>
|
||||
|
||||
<MudItem xs="12" sm="6" md="3">
|
||||
<MudSelect T="JobType?"
|
||||
Value="selectedJobType"
|
||||
Label="Job Type"
|
||||
Placeholder="All types"
|
||||
Clearable="true"
|
||||
ValueChanged="OnJobTypeFilterChanged"
|
||||
OnClearButtonClick="OnJobTypeClear">
|
||||
@foreach (JobType type in Enum.GetValues(typeof(JobType)))
|
||||
{
|
||||
<MudSelectItem T="JobType?" Value="@type">@type.ToString()</MudSelectItem>
|
||||
}
|
||||
</MudSelect>
|
||||
</MudItem>
|
||||
|
||||
<MudItem xs="12" sm="12" md="6" Class="d-flex justify-end align-center gap-2">
|
||||
<MudMenu Icon="@Icons.Material.Filled.PlayArrow"
|
||||
Label="Schedule Jobs"
|
||||
Variant="Variant.Filled"
|
||||
Color="Color.Success"
|
||||
Size="Size.Medium"
|
||||
EndIcon="@Icons.Material.Filled.KeyboardArrowDown">
|
||||
<MudMenuItem OnClick="@(() => ScheduleJobs("all"))">
|
||||
<div class="d-flex align-center">
|
||||
<MudIcon Icon="@Icons.Material.Filled.PlayCircle" Class="mr-2" />
|
||||
<span>Run All Jobs</span>
|
||||
</div>
|
||||
</MudMenuItem>
|
||||
<MudMenuItem OnClick="@(() => ScheduleJobs("imports"))">
|
||||
<div class="d-flex align-center">
|
||||
<MudIcon Icon="@Icons.Material.Filled.FileDownload" Class="mr-2" />
|
||||
<span>Run All Imports</span>
|
||||
</div>
|
||||
</MudMenuItem>
|
||||
<MudMenuItem OnClick="@(() => ScheduleJobs("processes"))">
|
||||
<div class="d-flex align-center">
|
||||
<MudIcon Icon="@Icons.Material.Filled.Settings" Class="mr-2" />
|
||||
<span>Run All Processes</span>
|
||||
</div>
|
||||
</MudMenuItem>
|
||||
</MudMenu>
|
||||
|
||||
                <MudIconButton Icon="@Icons.Material.Filled.Clear"
                               OnClick="ClearFilters"
                               Color="Color.Default"
                               Size="Size.Medium"
                               Title="Clear filters"/>
            </MudItem>
        </MudGrid>
    </MudExpansionPanel>
</MudExpansionPanels>

<MudDivider Class="my-4"></MudDivider>

<MudTable Items="jobs.Items"
          Dense="true"
          Hover="true"
          Loading="isLoading"
          LoadingProgressColor="Color.Primary"
          OnRowClick="@((TableRowClickEventArgs<QueueJob> args) => OnRowClick(args.Item))"
          T="QueueJob"
          Style="cursor: pointer;">
    <HeaderContent>
        <MudTh>Layer Name</MudTh>
        <MudTh>Plugin</MudTh>
        <MudTh>Type</MudTh>
        <MudTh>Status</MudTh>
        <MudTh>Priority</MudTh>
        <MudTh>Retry</MudTh>
        <MudTh>Created</MudTh>
        <MudTh>Last Attempt</MudTh>
    </HeaderContent>
    <RowTemplate Context="row">
        <MudTd DataLabel="Layer Name">
            <div @oncontextmenu="@(async (e) => await OnRowRightClick(e, row))" @oncontextmenu:preventDefault="true">
                @row.LayerName
            </div>
        </MudTd>
        <MudTd DataLabel="Plugin">
            <div @oncontextmenu="@(async (e) => await OnRowRightClick(e, row))" @oncontextmenu:preventDefault="true">
                @row.PluginName
            </div>
        </MudTd>
        <MudTd DataLabel="Type">
            <div @oncontextmenu="@(async (e) => await OnRowRightClick(e, row))" @oncontextmenu:preventDefault="true">
                <MudChip T="string" Size="Size.Small" Color="@GetJobTypeColor(row.JobType)">@row.JobType</MudChip>
            </div>
        </MudTd>
        <MudTd DataLabel="Status">
            <div @oncontextmenu="@(async (e) => await OnRowRightClick(e, row))" @oncontextmenu:preventDefault="true">
                <MudChip T="string" Size="Size.Small" Color="@GetStatusColor(row.Status)">@row.Status</MudChip>
            </div>
        </MudTd>
        <MudTd DataLabel="Priority">
            <div @oncontextmenu="@(async (e) => await OnRowRightClick(e, row))" @oncontextmenu:preventDefault="true">
                @row.Priority
            </div>
        </MudTd>
        <MudTd DataLabel="Retry">
            <div @oncontextmenu="@(async (e) => await OnRowRightClick(e, row))" @oncontextmenu:preventDefault="true">
                @row.RetryCount / @row.MaxRetries
            </div>
        </MudTd>
        <MudTd DataLabel="Created">
            <div @oncontextmenu="@(async (e) => await OnRowRightClick(e, row))" @oncontextmenu:preventDefault="true">
                @DateTimeHelper.FormatDateTime(row.CreatedAt, "yyyy-MM-dd HH:mm")
            </div>
        </MudTd>
        <MudTd DataLabel="Last Attempt">
            <div @oncontextmenu="@(async (e) => await OnRowRightClick(e, row))" @oncontextmenu:preventDefault="true">
                @DateTimeHelper.FormatDateTime(row.LastAttemptAt, "yyyy-MM-dd HH:mm")
            </div>
        </MudTd>
    </RowTemplate>
    <NoRecordsContent>
        <MudText>No jobs to display</MudText>
    </NoRecordsContent>
    <LoadingContent>
        Loading...
    </LoadingContent>
</MudTable>

@if (jobs.TotalCount > 0)
{
    <MudGrid Class="mt-4" AlignItems="Center.Center">
        <MudItem xs="12" sm="6">
            <MudText Typo="Typo.body2">
                Results @((jobs.Page - 1) * jobs.PageSize + 1) - @Math.Min(jobs.Page * jobs.PageSize, jobs.TotalCount)
                of @jobs.TotalCount
            </MudText>
        </MudItem>
        <MudItem xs="12" sm="6" Class="d-flex justify-end">
            <MudPagination Count="jobs.TotalPages"
                           Selected="jobs.Page"
                           SelectedChanged="OnPageChanged"
                           ShowFirstButton="true"
                           ShowLastButton="true"
                           Variant="Variant.Outlined"/>
        </MudItem>
    </MudGrid>
}
</MudContainer>
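The "Results X - Y of Z" footer above computes a 1-based result range from the current page, page size, and total count. A minimal sketch of that arithmetic in isolation (the `ResultRange` helper is illustrative, not part of the codebase):

```csharp
using System;

static class PaginationMath
{
    // Mirrors the Razor expressions: first = (Page - 1) * PageSize + 1,
    // last = Min(Page * PageSize, TotalCount). Page is 1-based; the Min
    // clamps the upper bound on the final, possibly partial, page.
    public static (int First, int Last) ResultRange(int page, int pageSize, int totalCount)
    {
        var first = (page - 1) * pageSize + 1;
        var last = Math.Min(page * pageSize, totalCount);
        return (first, last);
    }
}
```

For example, page 3 with a page size of 50 and 120 total rows yields the range (101, 120).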
194  DiunaBI.UI.Shared/Pages/Jobs/Index.razor.cs  Normal file
@@ -0,0 +1,194 @@
using DiunaBI.UI.Shared.Services;
using Microsoft.AspNetCore.Components;
using Microsoft.AspNetCore.Components.Web;
using DiunaBI.Application.DTOModels.Common;
using DiunaBI.Domain.Entities;
using MudBlazor;
using Microsoft.JSInterop;

namespace DiunaBI.UI.Shared.Pages.Jobs;

public partial class Index : ComponentBase, IDisposable
{
    [Inject] private JobService JobService { get; set; } = default!;
    [Inject] private EntityChangeHubService HubService { get; set; } = default!;
    [Inject] private ISnackbar Snackbar { get; set; } = default!;
    [Inject] private NavigationManager NavigationManager { get; set; } = default!;
    [Inject] private IJSRuntime JSRuntime { get; set; } = default!;
    [Inject] private DateTimeHelper DateTimeHelper { get; set; } = default!;

    private PagedResult<QueueJob> jobs = new();
    private bool isLoading = false;
    private int currentPage = 1;
    private int pageSize = 50;
    private IEnumerable<JobStatus> selectedStatuses = new HashSet<JobStatus>();
    private JobType? selectedJobType = null;

    protected override async Task OnInitializedAsync()
    {
        await DateTimeHelper.InitializeAsync();
        await LoadJobs();

        // Subscribe to SignalR entity changes
        HubService.EntityChanged += OnEntityChanged;
    }

    private async void OnEntityChanged(string module, string id, string operation)
    {
        Console.WriteLine($"🔔 JobListComponent.OnEntityChanged called: module={module}, id={id}, operation={operation}");

        // Only react if it's a QueueJobs change
        if (module.Equals("QueueJobs", StringComparison.OrdinalIgnoreCase))
        {
            Console.WriteLine($"📨 Job {id} changed, refreshing job list");
            await InvokeAsync(async () =>
            {
                Console.WriteLine($"🔄 LoadJobs starting...");
                await LoadJobs();
                Console.WriteLine($"🔄 StateHasChanged calling...");
                StateHasChanged();
                Console.WriteLine($"✅ Job list refresh complete");
            });
        }
        else
        {
            Console.WriteLine($"⏭️ Skipping - module '{module}' is not QueueJobs");
        }
    }

    private async Task LoadJobs()
    {
        isLoading = true;

        try
        {
            var statusList = selectedStatuses?.ToList();
            jobs = await JobService.GetJobsAsync(currentPage, pageSize, statusList, selectedJobType);
        }
        catch (Exception ex)
        {
            Console.WriteLine($"Loading jobs failed: {ex.Message}");
            Snackbar.Add("Failed to load jobs", Severity.Error);
        }
        finally
        {
            isLoading = false;
        }
    }

    private async Task OnPageChanged(int page)
    {
        currentPage = page;
        await LoadJobs();
    }

    private async Task ClearFilters()
    {
        selectedStatuses = new HashSet<JobStatus>();
        selectedJobType = null;
        currentPage = 1;
        await LoadJobs();
    }

    private async Task OnStatusFilterChanged(IEnumerable<JobStatus> values)
    {
        selectedStatuses = values;
        currentPage = 1;
        await LoadJobs();
    }

    private async Task OnJobTypeFilterChanged(JobType? value)
    {
        selectedJobType = value;
        currentPage = 1;
        await LoadJobs();
    }

    private async Task OnStatusClear()
    {
        selectedStatuses = new HashSet<JobStatus>();
        currentPage = 1;
        await LoadJobs();
    }

    private async Task OnJobTypeClear()
    {
        selectedJobType = null;
        currentPage = 1;
        await LoadJobs();
    }

    private void OnRowClick(QueueJob job)
    {
        NavigationManager.NavigateTo($"/jobs/{job.Id}");
    }

    private async Task OnRowRightClick(MouseEventArgs e, QueueJob job)
    {
        var url = NavigationManager.ToAbsoluteUri($"/jobs/{job.Id}").ToString();
        await JSRuntime.InvokeVoidAsync("open", url, "_blank");
    }

    private async Task ScheduleJobs(string type)
    {
        isLoading = true;

        try
        {
            (bool success, int jobsCreated, string message) result = type switch
            {
                "all" => await JobService.ScheduleAllJobsAsync(),
                "imports" => await JobService.ScheduleImportJobsAsync(),
                "processes" => await JobService.ScheduleProcessJobsAsync(),
                _ => (false, 0, "Unknown job type")
            };

            if (result.success)
            {
                Snackbar.Add($"{result.message} ({result.jobsCreated} jobs created)", Severity.Success);
                await LoadJobs();
            }
            else
            {
                Snackbar.Add(result.message, Severity.Error);
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine($"Scheduling jobs failed: {ex.Message}");
            Snackbar.Add($"Failed to schedule jobs: {ex.Message}", Severity.Error);
        }
        finally
        {
            isLoading = false;
        }
    }

    private Color GetStatusColor(JobStatus status)
    {
        return status switch
        {
            JobStatus.Pending => Color.Default,
            JobStatus.Running => Color.Info,
            JobStatus.Completed => Color.Success,
            JobStatus.Failed => Color.Error,
            JobStatus.Retrying => Color.Warning,
            _ => Color.Default
        };
    }

    private Color GetJobTypeColor(JobType jobType)
    {
        return jobType switch
        {
            JobType.Import => Color.Primary,
            JobType.Process => Color.Secondary,
            _ => Color.Default
        };
    }

    public void Dispose()
    {
        HubService.EntityChanged -= OnEntityChanged;
    }
}
@@ -1,8 +0,0 @@
@page "/layers"
@using DiunaBI.UI.Shared.Components

<PageTitle>Layers</PageTitle>

<MudContainer MaxWidth="MaxWidth.ExtraExtraLarge">
    <LayerListComponent />
</MudContainer>
@@ -2,8 +2,7 @@
@using DiunaBI.UI.Shared.Services
@using DiunaBI.Application.DTOModels
@using MudBlazor
@inject LayerService LayerService
@inject NavigationManager NavigationManager
@implements IDisposable

<MudCard>
    <MudCardHeader>
@@ -11,18 +10,24 @@
        <MudText Typo="Typo.h5">Layer Details</MudText>
    </CardHeaderContent>
    <CardHeaderActions>
        <!--
        <MudButton Variant="Variant.Text" OnClick="Export">Export</MudButton>
        @if (layer != null && layer.Type == LayerType.Administration)
        @if (layer != null && layer.Type == LayerType.Administration && IsWorkerLayer())
        {
            <MudButton Variant="Variant.Text" Href="@($"/layers/edit/{layer.Id}/duplicate")">Duplicate</MudButton>
            <MudButton Variant="Variant.Text" Href="@($"/layers/edit/{layer.Id}")">Edit</MudButton>
        }
        @if (layer != null && layer.Type == LayerType.Processed)
        <MudButton Variant="Variant.Filled"
                   Color="Color.Primary"
                   OnClick="RunNow"
                   Disabled="isRunningJob"
                   StartIcon="@Icons.Material.Filled.PlayArrow">
            @if (isRunningJob)
            {
                <MudButton Variant="Variant.Text" OnClick="ProcessLayer">Process Layer</MudButton>
                <MudProgressCircular Size="Size.Small" Indeterminate="true"/>
                <span style="margin-left: 8px;">Creating Job...</span>
            }
            else
            {
                <span>Run Now</span>
            }
        </MudButton>
        }
        -->
        <MudButton Variant="Variant.Text" OnClick="GoBack" StartIcon="@Icons.Material.Filled.ArrowBack">Back to List</MudButton>
    </CardHeaderActions>
</MudCardHeader>
@@ -54,7 +59,7 @@
    }
    </MudItem>
    <MudItem xs="12" md="6">
        <MudTextField Value="@layer.CreatedAt.ToString("g")"
        <MudTextField Value="@DateTimeHelper.FormatDateTime(layer.CreatedAt, "yyyy-MM-dd HH:mm")"
                      Label="Created"
                      Variant="Variant.Outlined"
                      ReadOnly="true"
@@ -63,7 +68,7 @@
                      AdornmentText="@(layer.CreatedBy?.Username ?? "")"/>
    </MudItem>
    <MudItem xs="12" md="6">
        <MudTextField Value="@layer.ModifiedAt.ToString("g")"
        <MudTextField Value="@DateTimeHelper.FormatDateTime(layer.ModifiedAt, "yyyy-MM-dd HH:mm")"
                      Label="Modified"
                      Variant="Variant.Outlined"
                      ReadOnly="true"
@@ -158,12 +163,14 @@
    }
    </RowTemplate>
    <FooterContent>
        <MudTd><b>Value1 sum</b></MudTd>
        @if (showSummary)
        {
            <MudTd><b>@totalSum.ToString("N2")</b></MudTd>
            @foreach (var column in displayedColumns)
            {
                @if (column == "Value1")
                @if (column.StartsWith("Value") && columnSums.ContainsKey(column))
                {
                    <MudTd><b>@valueSum.ToString("N2")</b></MudTd>
                    <MudTd><b>@columnSums[column].ToString("N2")</b></MudTd>
                }
                else
                {
@@ -174,6 +181,7 @@
                {
                    <MudTd></MudTd>
                }
        }
    </FooterContent>
</MudTable>

@@ -226,6 +234,8 @@
    }
    </MudTabPanel>

    @if (showHistoryTab)
    {
        <MudTabPanel Text="History" Icon="@Icons.Material.Filled.History">
            @if (isLoadingHistory)
            {
@@ -307,7 +317,7 @@
            <RowTemplate>
                <MudTd DataLabel="Code">@context.Code</MudTd>
                <MudTd DataLabel="Description">@context.Desc1</MudTd>
                <MudTd DataLabel="Modified">@context.ModifiedAt.ToString("g")</MudTd>
                <MudTd DataLabel="Modified">@DateTimeHelper.FormatDateTime(context.ModifiedAt, "yyyy-MM-dd HH:mm")</MudTd>
                <MudTd DataLabel="Modified By">@GetModifiedByUsername(context.ModifiedById)</MudTd>
            </RowTemplate>
        </MudTable>
@@ -350,6 +360,7 @@
            }
        }
        </MudTabPanel>
    }
    </MudTabs>
}
</MudCardContent>
@@ -1,31 +1,51 @@
using DiunaBI.Application.DTOModels;
using DiunaBI.UI.Shared.Services;
using Microsoft.AspNetCore.Components;
using MudBlazor;
using System.Reflection;

namespace DiunaBI.UI.Shared.Pages;
namespace DiunaBI.UI.Shared.Pages.Layers;

public partial class LayerDetailPage : ComponentBase
public partial class Details : ComponentBase, IDisposable
{
    [Parameter]
    public Guid Id { get; set; }

    [Inject]
    private IDialogService DialogService { get; set; } = null!;

    [Inject]
    private LayerService LayerService { get; set; } = null!;

    [Inject]
    private JobService JobService { get; set; } = null!;

    [Inject]
    private EntityChangeHubService HubService { get; set; } = null!;

    [Inject]
    private NavigationManager NavigationManager { get; set; } = null!;

    [Inject]
    private ISnackbar Snackbar { get; set; } = null!;

    [Inject]
    private IDialogService DialogService { get; set; } = null!;
    private DateTimeHelper DateTimeHelper { get; set; } = null!;

    private LayerDto? layer;
    private List<RecordDto> records = new();
    private List<string> displayedColumns = new();
    private double valueSum = 0;
    private Dictionary<string, double> columnSums = new();
    private double totalSum = 0;
    private bool isLoading = false;
    private Guid? editingRecordId = null;
    private RecordDto? editingRecord = null;
    private bool isAddingNew = false;
    private RecordDto newRecord = new();
    private bool isEditable => layer?.Type == LayerType.Dictionary || layer?.Type == LayerType.Administration;
    private bool showHistoryTab => layer?.Type == LayerType.Administration || layer?.Type == LayerType.Dictionary;
    private bool showSummary => layer?.Type == LayerType.Import || layer?.Type == LayerType.Processed;

    // History tab state
    private bool isLoadingHistory = false;
@@ -38,7 +58,41 @@ public partial class LayerDetailPage : ComponentBase

    protected override async Task OnInitializedAsync()
    {
        await DateTimeHelper.InitializeAsync();
        await LoadLayer();

        // Subscribe to SignalR entity changes
        HubService.EntityChanged += OnEntityChanged;
    }

    private async void OnEntityChanged(string module, string id, string operation)
    {
        // React to Layers or Records changes for this layer
        if (module.Equals("Layers", StringComparison.OrdinalIgnoreCase) ||
            module.Equals("Records", StringComparison.OrdinalIgnoreCase))
        {
            // Check if it's this layer or its records that changed
            if (Guid.TryParse(id, out var changedId))
            {
                if (module.Equals("Layers", StringComparison.OrdinalIgnoreCase) && changedId == Id)
                {
                    await InvokeAsync(async () =>
                    {
                        await LoadLayer();
                        StateHasChanged();
                    });
                }
                else if (module.Equals("Records", StringComparison.OrdinalIgnoreCase))
                {
                    // For records, we reload to get the latest data
                    await InvokeAsync(async () =>
                    {
                        await LoadLayer();
                        StateHasChanged();
                    });
                }
            }
        }
    }

    protected override async Task OnParametersSetAsync()
@@ -67,9 +121,9 @@ public partial class LayerDetailPage : ComponentBase

        if (layer != null && layer.Records != null)
        {
            records = layer.Records;
            records = layer.Records.OrderBy(r => r.Code).ToList();
            CalculateDisplayedColumns();
            CalculateValueSum();
            CalculateColumnSums();
            BuildUserCache();
        }
    }
@@ -108,11 +162,25 @@ public partial class LayerDetailPage : ComponentBase
        }
    }

    private void CalculateValueSum()
    private void CalculateColumnSums()
    {
        valueSum = records
            .Where(r => r.Value1.HasValue)
            .Sum(r => r.Value1!.Value);
        columnSums.Clear();
        totalSum = 0;

        // Calculate sum for each displayed value column
        foreach (var columnName in displayedColumns.Where(c => c.StartsWith("Value")))
        {
            var sum = records
                .Select(r => GetRecordValueByName(r, columnName))
                .Where(v => v.HasValue)
                .Sum(v => v!.Value);

            columnSums[columnName] = sum;
            totalSum += sum;
        }

        // Keep valueSum for backward compatibility (Value1 sum)
        valueSum = columnSums.ContainsKey("Value1") ? columnSums["Value1"] : 0;
    }

    private string GetRecordValue(RecordDto record, string columnName)
@@ -236,7 +304,7 @@ public partial class LayerDetailPage : ComponentBase
        {
            records.Remove(record);
            CalculateDisplayedColumns();
            CalculateValueSum();
            CalculateColumnSums();
            Snackbar.Add("Record deleted successfully", Severity.Success);
        }
        else
@@ -287,7 +355,7 @@ public partial class LayerDetailPage : ComponentBase
        {
            records.Add(created);
            CalculateDisplayedColumns();
            CalculateValueSum();
            CalculateColumnSums();
            isAddingNew = false;
            newRecord = new();
            Snackbar.Add("Record added successfully", Severity.Success);
@@ -413,4 +481,59 @@ public partial class LayerDetailPage : ComponentBase
    {
        return userCache.TryGetValue(userId, out var username) ? username : string.Empty;
    }

    // Run Now button methods
    private bool isRunningJob = false;

    private bool IsWorkerLayer()
    {
        if (layer?.Records == null) return false;

        var typeRecord = layer.Records.FirstOrDefault(x => x.Code == "Type");
        return typeRecord?.Desc1 == "ImportWorker" || typeRecord?.Desc1 == "ProcessWorker";
    }

    private async Task RunNow()
    {
        if (layer == null) return;

        isRunningJob = true;
        try
        {
            var result = await JobService.CreateJobForLayerAsync(layer.Id);

            if (result != null && result.Success)
            {
                if (result.Existing)
                {
                    Snackbar.Add($"Job already exists: {result.Message}", Severity.Info);
                }
                else
                {
                    Snackbar.Add("Job created successfully! Watch real-time status updates.", Severity.Success);
                }

                // Navigate to job detail page to see real-time updates
                NavigationManager.NavigateTo($"/jobs/{result.JobId}");
            }
            else
            {
                Snackbar.Add("Failed to create job", Severity.Error);
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine($"Error creating job: {ex.Message}");
            Snackbar.Add($"Error creating job: {ex.Message}", Severity.Error);
        }
        finally
        {
            isRunningJob = false;
        }
    }

    public void Dispose()
    {
        HubService.EntityChanged -= OnEntityChanged;
    }
}
@@ -1,5 +1,11 @@
@page "/layers"
@using MudBlazor.Internal
@using DiunaBI.Application.DTOModels
@implements IDisposable

<PageTitle>Layers</PageTitle>

<MudContainer MaxWidth="MaxWidth.ExtraExtraLarge">
    <MudExpansionPanels Class="mb-4">
        <MudExpansionPanel Icon="@Icons.Material.Filled.FilterList"
                           Text="Filters"
@@ -87,3 +93,4 @@
        </MudItem>
    </MudGrid>
}
</MudContainer>
@@ -6,11 +6,12 @@ using DiunaBI.Application.DTOModels.Common;
using MudBlazor;
using Microsoft.JSInterop;

namespace DiunaBI.UI.Shared.Components;
namespace DiunaBI.UI.Shared.Pages.Layers;

public partial class LayerListComponent : ComponentBase
public partial class Index : ComponentBase, IDisposable
{
    [Inject] private LayerService LayerService { get; set; } = default!;
    [Inject] private EntityChangeHubService HubService { get; set; } = default!;
    [Inject] private ISnackbar Snackbar { get; set; } = default!;
    [Inject] private NavigationManager NavigationManager { get; set; } = default!;
    [Inject] private LayerFilterStateService FilterStateService { get; set; } = default!;
@@ -25,6 +26,22 @@ public partial class LayerListComponent : ComponentBase
    {
        filterRequest = FilterStateService.FilterRequest;
        await LoadLayers();

        // Subscribe to SignalR entity changes
        HubService.EntityChanged += OnEntityChanged;
    }

    private async void OnEntityChanged(string module, string id, string operation)
    {
        // Only react if it's a Layers change
        if (module.Equals("Layers", StringComparison.OrdinalIgnoreCase))
        {
            await InvokeAsync(async () =>
            {
                await LoadLayers();
                StateHasChanged();
            });
        }
    }

    private async Task LoadLayers()
@@ -89,4 +106,9 @@ public partial class LayerListComponent : ComponentBase
        var url = NavigationManager.ToAbsoluteUri($"/layers/{layer.Id}").ToString();
        await JSRuntime.InvokeVoidAsync("open", url, "_blank");
    }

    public void Dispose()
    {
        HubService.EntityChanged -= OnEntityChanged;
    }
}
@@ -3,4 +3,5 @@ namespace DiunaBI.UI.Shared.Services;
public class AppConfig
{
    public string AppName { get; set; } = "DiunaBI";
    public string ClientLogo { get; set; } = "pedrollopl.png";
}

@@ -15,16 +15,18 @@ public class AuthService
{
    private readonly HttpClient _httpClient;
    private readonly IJSRuntime _jsRuntime;
    private readonly TokenProvider _tokenProvider;
    private bool? _isAuthenticated;
    private UserInfo? _userInfo = null;
    private string? _apiToken;

    public event Action<bool>? AuthenticationStateChanged;

    public AuthService(HttpClient httpClient, IJSRuntime jsRuntime)
    public AuthService(HttpClient httpClient, IJSRuntime jsRuntime, TokenProvider tokenProvider)
    {
        _httpClient = httpClient;
        _jsRuntime = jsRuntime;
        _tokenProvider = tokenProvider;
    }

    public bool IsAuthenticated => _isAuthenticated ?? false;
@@ -44,6 +46,7 @@ public class AuthService
        if (result != null)
        {
            _apiToken = result.Token;
            _tokenProvider.Token = result.Token; // Set token for SignalR
            _userInfo = new UserInfo
            {
                Id = result.Id,
@@ -104,6 +107,7 @@ public class AuthService
        if (_isAuthenticated.Value && !string.IsNullOrEmpty(userInfoJson))
        {
            _apiToken = token;
            _tokenProvider.Token = token; // Set token for SignalR
            _userInfo = JsonSerializer.Deserialize<UserInfo>(userInfoJson);

            // Restore header
@@ -111,6 +115,9 @@ public class AuthService
                new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", _apiToken);

            Console.WriteLine($"✅ Session restored: {_userInfo?.Email}");

            // Notify that authentication state changed (for SignalR initialization)
            AuthenticationStateChanged?.Invoke(true);
        }
        else
        {
@@ -139,6 +146,7 @@ public class AuthService
        await _jsRuntime.InvokeVoidAsync("localStorage.removeItem", "user_info");

        _apiToken = null;
        _tokenProvider.Token = null; // Clear token for SignalR
        _isAuthenticated = false;
        _userInfo = null;

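The hunks above thread a `TokenProvider` through `AuthService` so the SignalR layer can read the current JWT, but the class itself is not part of this diff. A minimal sketch of the shape the diff appears to assume (hypothetical; the real class may carry more state):

```csharp
// Hypothetical minimal TokenProvider: a shared, mutable holder that
// AuthService writes on login/logout and EntityChangeHubService reads
// when building the SignalR connection. Registered as a singleton so
// both services see the same instance.
public class TokenProvider
{
    // Null means "not authenticated"; SignalR then connects without a token.
    public string? Token { get; set; }
}
```

Registered, presumably, once in DI (e.g. `services.AddSingleton<TokenProvider>()`) so the token set at login is visible wherever the connection is built.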
80  DiunaBI.UI.Shared/Services/DateTimeHelper.cs  Normal file
@@ -0,0 +1,80 @@
using Microsoft.JSInterop;

namespace DiunaBI.UI.Shared.Services;

public class DateTimeHelper
{
    private readonly IJSRuntime _jsRuntime;
    private TimeZoneInfo? _userTimeZone;
    private bool _initialized = false;

    public DateTimeHelper(IJSRuntime jsRuntime)
    {
        _jsRuntime = jsRuntime;
    }

    public async Task InitializeAsync()
    {
        if (_initialized) return;

        try
        {
            // Get the user's timezone from JavaScript
            var timeZoneId = await _jsRuntime.InvokeAsync<string>("eval", "Intl.DateTimeFormat().resolvedOptions().timeZone");

            // Try to find the TimeZoneInfo
            try
            {
                _userTimeZone = TimeZoneInfo.FindSystemTimeZoneById(timeZoneId);
            }
            catch
            {
                // Fallback to local timezone if the IANA timezone ID is not found
                _userTimeZone = TimeZoneInfo.Local;
            }
        }
        catch
        {
            // Fallback to local timezone if JavaScript interop fails
            _userTimeZone = TimeZoneInfo.Local;
        }

        _initialized = true;
    }

    public string FormatDateTime(DateTime? dateTime, string format = "yyyy-MM-dd HH:mm:ss")
    {
        if (!dateTime.HasValue)
            return "-";

        if (!_initialized)
        {
            // If not initialized yet, just format as-is (will be UTC)
            return dateTime.Value.ToString(format);
        }

        // Convert UTC to user's timezone
        var localDateTime = TimeZoneInfo.ConvertTimeFromUtc(dateTime.Value, _userTimeZone ?? TimeZoneInfo.Local);
        return localDateTime.ToString(format);
    }

    public string FormatDate(DateTime? dateTime, string format = "yyyy-MM-dd")
    {
        return FormatDateTime(dateTime, format);
    }

    public string FormatTime(DateTime? dateTime, string format = "HH:mm:ss")
    {
        return FormatDateTime(dateTime, format);
    }

    public string GetTimeZoneAbbreviation()
    {
        if (!_initialized || _userTimeZone == null)
            return "UTC";

        return _userTimeZone.IsDaylightSavingTime(DateTime.Now)
            ? _userTimeZone.DaylightName
            : _userTimeZone.StandardName;
    }
}
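The core of `FormatDateTime` is a plain `TimeZoneInfo.ConvertTimeFromUtc` call over the browser-reported zone. A minimal sketch of that conversion in isolation, using a fixed-offset custom zone so the result is deterministic regardless of where the code runs:

```csharp
using System;

// Stand-in for the zone DateTimeHelper resolves from
// Intl.DateTimeFormat().resolvedOptions().timeZone; a fixed UTC+2 offset
// keeps this example independent of the host machine's tz database.
var zone = TimeZoneInfo.CreateCustomTimeZone("UTC+2", TimeSpan.FromHours(2), "UTC+2", "UTC+2");

// Timestamps are assumed to be stored in UTC, as the helper's comments state.
var storedUtc = new DateTime(2024, 1, 15, 10, 30, 0, DateTimeKind.Utc);

var local = TimeZoneInfo.ConvertTimeFromUtc(storedUtc, zone);
Console.WriteLine(local.ToString("yyyy-MM-dd HH:mm")); // 2024-01-15 12:30
```

Note that `FindSystemTimeZoneById` with an IANA id (e.g. `Europe/Warsaw`) works on Linux and on .NET 6+ with ICU on Windows, which is why the helper keeps the `TimeZoneInfo.Local` fallback.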
208  DiunaBI.UI.Shared/Services/EntityChangeHubService.cs  Normal file
@@ -0,0 +1,208 @@
using Microsoft.AspNetCore.SignalR.Client;
using Microsoft.Extensions.Logging;

namespace DiunaBI.UI.Shared.Services;

public class EntityChangeHubService : IAsyncDisposable
{
    private readonly string _hubUrl;
    private readonly ILogger<EntityChangeHubService> _logger;
    private readonly TokenProvider _tokenProvider;
    private HubConnection? _hubConnection;
    private bool _isInitialized;
    private readonly SemaphoreSlim _initializationLock = new SemaphoreSlim(1, 1);
    private static int _instanceCounter = 0;
    private readonly int _instanceId;

    // Events that components can subscribe to
    public event Action<string, string, string>? EntityChanged;

    public EntityChangeHubService(
        string apiBaseUrl,
        IServiceProvider serviceProvider,
        ILogger<EntityChangeHubService> logger,
        TokenProvider tokenProvider)
    {
        _instanceId = Interlocked.Increment(ref _instanceCounter);

        // Convert HTTP URL to SignalR hub URL
        var baseUrl = apiBaseUrl.TrimEnd('/');
        _hubUrl = baseUrl + "/hubs/entitychanges";

        _logger = logger;
        _tokenProvider = tokenProvider;
        _logger.LogInformation("🏗️ EntityChangeHubService instance #{InstanceId} created. Hub URL: {HubUrl}", _instanceId, _hubUrl);
        Console.WriteLine($"🏗️ EntityChangeHubService instance #{_instanceId} created. Hub URL: {_hubUrl}, _isInitialized = {_isInitialized}");
    }

    public async Task InitializeAsync()
    {
        _logger.LogInformation("🔍 Instance #{InstanceId} InitializeAsync called. _isInitialized = {IsInitialized}, _hubConnection null? {IsNull}", _instanceId, _isInitialized, _hubConnection == null);
        Console.WriteLine($"🔍 Instance #{_instanceId} InitializeAsync called. _isInitialized = {_isInitialized}, _hubConnection null? {_hubConnection == null}");

        if (_isInitialized)
        {
            _logger.LogInformation("⏭️ Instance #{InstanceId} SignalR already initialized, skipping", _instanceId);
            Console.WriteLine($"⏭️ Instance #{_instanceId} SignalR already initialized, skipping");
            return;
        }

        await _initializationLock.WaitAsync();
        try
        {
            // Double-check after acquiring lock
            if (_isInitialized)
            {
                Console.WriteLine($"⏭️ SignalR already initialized (after lock), skipping");
                return;
            }

            _logger.LogInformation("🔌 Initializing SignalR connection to {HubUrl}", _hubUrl);
            Console.WriteLine($"🔌 Initializing SignalR connection to {_hubUrl}");

            _hubConnection = new HubConnectionBuilder()
                .WithUrl(_hubUrl, options =>
                {
                    // Add JWT token to SignalR connection
                    if (!string.IsNullOrEmpty(_tokenProvider.Token))
                    {
                        options.AccessTokenProvider = () => Task.FromResult<string?>(_tokenProvider.Token);
                        _logger.LogInformation("✅ JWT token added to SignalR connection");
                        Console.WriteLine($"✅ JWT token added to SignalR connection");
                    }
                    else
                    {
                        _logger.LogWarning("⚠️ No JWT token available for SignalR connection");
                        Console.WriteLine($"⚠️ No JWT token available for SignalR connection");
                    }
                })
                .WithAutomaticReconnect()
                .Build();

            // Subscribe to EntityChanged messages
            _hubConnection.On<object>("EntityChanged", (data) =>
            {
                Console.WriteLine($"🔔 RAW SignalR message received at {DateTime.Now:HH:mm:ss.fff}");
                Console.WriteLine($"🔔 Data type: {data?.GetType().FullName}");

                try
                {
                    // Parse the anonymous object
                    var json = System.Text.Json.JsonSerializer.Serialize(data);
                    Console.WriteLine($"📨 Received SignalR message: {json}");

                    // Use case-insensitive deserialization (backend sends camelCase: module, id, operation)
                    var options = new System.Text.Json.JsonSerializerOptions
                    {
                        PropertyNameCaseInsensitive = true
                    };
                    var change = System.Text.Json.JsonSerializer.Deserialize<EntityChangeMessage>(json, options);

                    if (change != null)
                    {
                        _logger.LogInformation("📨 Received entity change: {Module} {Id} {Operation}",
                            change.Module, change.Id, change.Operation);
                        Console.WriteLine($"📨 Entity change: {change.Module} {change.Id} {change.Operation}");

                        // Notify all subscribers
                        Console.WriteLine($"🔔 Invoking EntityChanged event, subscribers: {EntityChanged?.GetInvocationList().Length ?? 0}");
                        EntityChanged?.Invoke(change.Module, change.Id, change.Operation);
                        Console.WriteLine($"🔔 EntityChanged event invoked successfully");
                    }
                    else
                    {
                        Console.WriteLine($"⚠️ Deserialized change is null");
                    }
                }
                catch (Exception ex)
                {
                    _logger.LogError(ex, "❌ Error processing entity change message");
                    Console.WriteLine($"❌ Error processing message: {ex.Message}");
                    Console.WriteLine($"❌ Stack trace: {ex.StackTrace}");
                }
            });

            _hubConnection.Reconnecting += (error) =>
            {
                _logger.LogWarning("SignalR reconnecting: {Error}", error?.Message);
                Console.WriteLine($"⚠️ SignalR reconnecting: {error?.Message}");
                return Task.CompletedTask;
            };

            _hubConnection.Reconnected += (connectionId) =>
            {
                _logger.LogInformation("✅ SignalR reconnected: {ConnectionId}", connectionId);
                Console.WriteLine($"✅ SignalR reconnected: {connectionId}");
                return Task.CompletedTask;
            };

            _hubConnection.Closed += (error) =>
            {
                _logger.LogError(error, "❌ SignalR connection closed");
                Console.WriteLine($"❌ SignalR connection closed: {error?.Message}");
                return Task.CompletedTask;
            };

            await StartConnectionAsync();
            _isInitialized = true;
            _logger.LogInformation("✅ Instance #{InstanceId} _isInitialized set to true", _instanceId);
            Console.WriteLine($"✅ Instance #{_instanceId} _isInitialized set to true");
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "❌ Instance #{InstanceId} Failed to initialize SignalR connection", _instanceId);
            Console.WriteLine($"❌ Instance #{_instanceId} Failed to initialize SignalR: {ex.Message}");
        }
        finally
        {
            _initializationLock.Release();
        }
    }

private async Task StartConnectionAsync()
|
||||
{
|
||||
if (_hubConnection == null)
|
||||
{
|
||||
_logger.LogWarning("Hub connection is null, cannot start");
|
||||
return;
|
||||
}
|
||||
|
||||
try
|
||||
{
|
||||
Console.WriteLine($"🔌 Starting SignalR connection...");
|
||||
await _hubConnection.StartAsync();
|
||||
_logger.LogInformation("✅ SignalR connected successfully");
|
||||
Console.WriteLine($"✅ SignalR connected successfully to {_hubUrl}");
|
||||
}
|
||||
catch (Exception ex)
|
||||
{
|
||||
_logger.LogError(ex, "❌ Failed to start SignalR connection");
|
||||
Console.WriteLine($"❌ Failed to start SignalR: {ex.Message}\n{ex.StackTrace}");
|
||||
}
|
||||
}
|
||||
|
||||
public async ValueTask DisposeAsync()
|
||||
{
|
||||
if (_hubConnection != null)
|
||||
{
|
||||
try
|
||||
{
|
||||
await _hubConnection.StopAsync();
|
||||
await _hubConnection.DisposeAsync();
|
||||
}
|
||||
catch (Exception ex)
|
||||
{
|
||||
_logger.LogError(ex, "Error disposing SignalR connection");
|
||||
}
|
||||
}
|
||||
|
||||
_initializationLock?.Dispose();
|
||||
}
|
||||
}
|
||||
|
||||
public class EntityChangeMessage
|
||||
{
|
||||
public string Module { get; set; } = string.Empty;
|
||||
public string Id { get; set; } = string.Empty;
|
||||
public string Operation { get; set; } = string.Empty;
|
||||
}
|
||||
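For reference, a consumer of this notification service would typically subscribe to `EntityChanged` when the component initializes and unsubscribe on dispose so handlers do not leak across circuits. The sketch below is illustrative only: the component name, the injected service name (`EntityChangeNotificationService`), and the `"Jobs"` module filter are assumptions, not part of this diff.

```csharp
// Hypothetical Blazor component sketch (names assumed, not from this diff).
public partial class JobsPage : IDisposable
{
    [Inject] private EntityChangeNotificationService Notifications { get; set; } = default!;

    protected override void OnInitialized()
    {
        // Subscribe once; the service raises (module, id, operation) on each change.
        Notifications.EntityChanged += OnEntityChanged;
    }

    private void OnEntityChanged(string module, string id, string operation)
    {
        if (module == "Jobs")
        {
            // Marshal back to the renderer's sync context before re-rendering.
            InvokeAsync(StateHasChanged);
        }
    }

    public void Dispose()
    {
        // Unsubscribe to avoid leaking this component through the event.
        Notifications.EntityChanged -= OnEntityChanged;
    }
}
```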
192  DiunaBI.UI.Shared/Services/JobService.cs  Normal file
@@ -0,0 +1,192 @@
using System.Net.Http.Json;
using System.Text.Json;
using DiunaBI.Application.DTOModels.Common;
using DiunaBI.Domain.Entities;

namespace DiunaBI.UI.Shared.Services;

public class JobService
{
    private readonly HttpClient _httpClient;

    public JobService(HttpClient httpClient)
    {
        _httpClient = httpClient;
    }

    private readonly JsonSerializerOptions _jsonOptions = new()
    {
        PropertyNameCaseInsensitive = true
    };

    public async Task<PagedResult<QueueJob>> GetJobsAsync(int page = 1, int pageSize = 50, List<JobStatus>? statuses = null, JobType? jobType = null, Guid? layerId = null)
    {
        var start = (page - 1) * pageSize;
        var query = $"Jobs?start={start}&limit={pageSize}";

        if (statuses != null && statuses.Count > 0)
        {
            foreach (var status in statuses)
            {
                query += $"&statuses={(int)status}";
            }
        }

        if (jobType.HasValue)
            query += $"&jobType={(int)jobType.Value}";

        if (layerId.HasValue)
            query += $"&layerId={layerId.Value}";

        var response = await _httpClient.GetAsync(query);
        response.EnsureSuccessStatusCode();

        var json = await response.Content.ReadAsStringAsync();
        var result = JsonSerializer.Deserialize<PagedResult<QueueJob>>(json, _jsonOptions);

        return result ?? new PagedResult<QueueJob>();
    }

    public async Task<QueueJob?> GetJobByIdAsync(Guid id)
    {
        var response = await _httpClient.GetAsync($"Jobs/{id}");

        if (!response.IsSuccessStatusCode)
            return null;

        return await response.Content.ReadFromJsonAsync<QueueJob>();
    }

    public async Task<bool> RetryJobAsync(Guid id)
    {
        var response = await _httpClient.PostAsync($"Jobs/{id}/retry", null);
        return response.IsSuccessStatusCode;
    }

    public async Task<bool> CancelJobAsync(Guid id)
    {
        var response = await _httpClient.DeleteAsync($"Jobs/{id}");
        return response.IsSuccessStatusCode;
    }

    public async Task<JobStats?> GetStatsAsync()
    {
        var response = await _httpClient.GetAsync("Jobs/stats");

        if (!response.IsSuccessStatusCode)
            return null;

        return await response.Content.ReadFromJsonAsync<JobStats>();
    }

    public async Task<CreateJobResult?> CreateJobForLayerAsync(Guid layerId)
    {
        var response = await _httpClient.PostAsync($"Jobs/create-for-layer/{layerId}", null);

        if (!response.IsSuccessStatusCode)
            return null;

        return await response.Content.ReadFromJsonAsync<CreateJobResult>();
    }

    public async Task<(bool success, int jobsCreated, string message)> ScheduleAllJobsAsync(string? nameFilter = null)
    {
        try
        {
            var query = string.IsNullOrEmpty(nameFilter) ? "" : $"?nameFilter={Uri.EscapeDataString(nameFilter)}";
            var response = await _httpClient.PostAsync($"Jobs/ui/schedule{query}", null);

            if (!response.IsSuccessStatusCode)
            {
                var error = await response.Content.ReadAsStringAsync();
                return (false, 0, $"Failed to schedule jobs: {error}");
            }

            var json = await response.Content.ReadAsStringAsync();
            var result = JsonSerializer.Deserialize<JsonElement>(json, _jsonOptions);

            var jobsCreated = result.GetProperty("jobsCreated").GetInt32();
            var message = result.GetProperty("message").GetString() ?? "Jobs scheduled";

            return (true, jobsCreated, message);
        }
        catch (Exception ex)
        {
            Console.WriteLine($"Scheduling jobs failed: {ex.Message}");
            return (false, 0, $"Error: {ex.Message}");
        }
    }

    public async Task<(bool success, int jobsCreated, string message)> ScheduleImportJobsAsync(string? nameFilter = null)
    {
        try
        {
            var query = string.IsNullOrEmpty(nameFilter) ? "" : $"?nameFilter={Uri.EscapeDataString(nameFilter)}";
            var response = await _httpClient.PostAsync($"Jobs/ui/schedule/imports{query}", null);

            if (!response.IsSuccessStatusCode)
            {
                var error = await response.Content.ReadAsStringAsync();
                return (false, 0, $"Failed to schedule import jobs: {error}");
            }

            var json = await response.Content.ReadAsStringAsync();
            var result = JsonSerializer.Deserialize<JsonElement>(json, _jsonOptions);

            var jobsCreated = result.GetProperty("jobsCreated").GetInt32();
            var message = result.GetProperty("message").GetString() ?? "Import jobs scheduled";

            return (true, jobsCreated, message);
        }
        catch (Exception ex)
        {
            Console.WriteLine($"Scheduling import jobs failed: {ex.Message}");
            return (false, 0, $"Error: {ex.Message}");
        }
    }

    public async Task<(bool success, int jobsCreated, string message)> ScheduleProcessJobsAsync()
    {
        try
        {
            var response = await _httpClient.PostAsync("Jobs/ui/schedule/processes", null);

            if (!response.IsSuccessStatusCode)
            {
                var error = await response.Content.ReadAsStringAsync();
                return (false, 0, $"Failed to schedule process jobs: {error}");
            }

            var json = await response.Content.ReadAsStringAsync();
            var result = JsonSerializer.Deserialize<JsonElement>(json, _jsonOptions);

            var jobsCreated = result.GetProperty("jobsCreated").GetInt32();
            var message = result.GetProperty("message").GetString() ?? "Process jobs scheduled";

            return (true, jobsCreated, message);
        }
        catch (Exception ex)
        {
            Console.WriteLine($"Scheduling process jobs failed: {ex.Message}");
            return (false, 0, $"Error: {ex.Message}");
        }
    }
}

public class JobStats
{
    public int Pending { get; set; }
    public int Running { get; set; }
    public int Completed { get; set; }
    public int Failed { get; set; }
    public int Retrying { get; set; }
    public int Total { get; set; }
}

public class CreateJobResult
{
    public bool Success { get; set; }
    public Guid JobId { get; set; }
    public string? Message { get; set; }
    public bool Existing { get; set; }
}
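A minimal call-site sketch for the service above, for illustration only: the `jobService` instance is assumed to come from DI, and the `JobStatus.Failed` enum member and the `"Sales"` filter are assumptions, since neither the `JobStatus` enum values nor real layer names appear in this diff.

```csharp
// Hypothetical usage sketch (jobService assumed to be injected via DI).
var failedJobs = await jobService.GetJobsAsync(
    page: 1,
    pageSize: 25,
    statuses: new List<JobStatus> { JobStatus.Failed }); // enum member assumed

// Bulk scheduling returns a tuple rather than throwing on failure.
var (success, jobsCreated, message) = await jobService.ScheduleAllJobsAsync(nameFilter: "Sales");
Console.WriteLine(success
    ? $"{jobsCreated} job(s) scheduled: {message}"
    : message);
```

Returning `(bool, int, string)` tuples keeps error handling at the call site simple: callers branch on `success` instead of wrapping every scheduling call in try/catch.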
@@ -8,5 +8,7 @@
@using Microsoft.JSInterop
@using DiunaBI.UI.Shared
@using DiunaBI.UI.Shared.Components
@using DiunaBI.UI.Shared.Components.Layout
@using DiunaBI.UI.Shared.Components.Auth
@using DiunaBI.Application.DTOModels
@using MudBlazor
BIN  DiunaBI.UI.Shared/wwwroot/images/clients/morska.png  Normal file
Binary file not shown. After Width: | Height: | Size: 15 KiB
BIN  DiunaBI.UI.Shared/wwwroot/images/clients/pedrollopl.png  Normal file
Binary file not shown. After Width: | Height: | Size: 40 KiB
@@ -17,7 +17,7 @@
</head>

<body>
    <DiunaBI.UI.Shared.Components.Routes @rendermode="InteractiveServer" />
    <DiunaBI.UI.Shared.Components.Layout.Routes @rendermode="InteractiveServer" />

    <div id="blazor-error-ui">
@@ -31,9 +31,24 @@
        <a class="dismiss">🗙</a>
    </div>

    <div id="components-reconnect-modal" data-nosnippet>
        <div class="reconnect-content">
            <div class="reconnect-spinner"></div>
            <h5>Connection Lost</h5>
            <div class="reconnect-message">
                Attempting to reconnect to the server...
            </div>
            <div class="reconnect-timer">
                <span id="reconnect-elapsed-time">0s</span>
            </div>
            <button onclick="location.reload()">Reload Page</button>
        </div>
    </div>

    <script src="_framework/blazor.web.js"></script>
    <script src="_content/MudBlazor/MudBlazor.min.js"></script>
    <script src="_content/DiunaBI.UI.Shared/js/auth.js"></script>
    <script src="js/reconnect.js"></script>
</body>

</html>
@@ -1,4 +1,5 @@
using DiunaBI.UI.Shared;
using DiunaBI.UI.Shared.Components.Layout;
using DiunaBI.UI.Shared.Extensions;
using DiunaBI.UI.Shared.Services;
using DiunaBI.UI.Web.Components;
@@ -16,9 +17,6 @@ builder.Services.AddSharedServices(apiBaseUrl);

// Configure App settings
var appConfig = builder.Configuration.GetSection("App").Get<AppConfig>() ?? new AppConfig();
Console.WriteLine($"[DEBUG] AppConfig.AppName from config: {appConfig.AppName}");
Console.WriteLine($"[DEBUG] App:AppName from Configuration: {builder.Configuration["App:AppName"]}");
Console.WriteLine($"[DEBUG] App__AppName env var: {Environment.GetEnvironmentVariable("App__AppName")}");
builder.Services.AddSingleton(appConfig);

builder.Services.AddScoped<IGoogleAuthService, WebGoogleAuthService>();
@@ -58,3 +58,93 @@ h1:focus {
.mud-pagination li::marker {
    display: none;
}

/* Blazor Server Reconnection UI Customization */
#components-reconnect-modal {
    position: fixed;
    top: 0;
    left: 0;
    right: 0;
    bottom: 0;
    background: rgba(0, 0, 0, 0.5);
    backdrop-filter: blur(4px);
    z-index: 9999;
    font-family: 'Roboto', 'Helvetica Neue', Helvetica, Arial, sans-serif;
    display: none !important;
    align-items: center;
    justify-content: center;
}

/* Show modal when Blazor applies these classes */
#components-reconnect-modal.components-reconnect-show,
#components-reconnect-modal.components-reconnect-failed,
#components-reconnect-modal.components-reconnect-rejected {
    display: flex !important;
}

#components-reconnect-modal .reconnect-content {
    background: white;
    border-radius: 8px;
    padding: 32px;
    box-shadow: 0 8px 32px rgba(0, 0, 0, 0.2);
    max-width: 400px;
    text-align: center;
}

#components-reconnect-modal h5 {
    margin: 0 0 16px 0;
    color: #424242;
    font-size: 20px;
    font-weight: 500;
}

#components-reconnect-modal .reconnect-message {
    color: #666;
    margin-bottom: 24px;
    font-size: 14px;
    line-height: 1.5;
}

#components-reconnect-modal .reconnect-spinner {
    width: 48px;
    height: 48px;
    border: 4px solid #f3f3f3;
    border-top: 4px solid #e7163d;
    border-radius: 50%;
    animation: spin 1s linear infinite;
    margin: 0 auto 16px;
}

@keyframes spin {
    0% { transform: rotate(0deg); }
    100% { transform: rotate(360deg); }
}

#components-reconnect-modal .reconnect-timer {
    color: #e7163d;
    font-size: 16px;
    font-weight: 500;
    margin-bottom: 16px;
}

#components-reconnect-modal button {
    background-color: #e7163d;
    color: white;
    border: none;
    border-radius: 4px;
    padding: 10px 24px;
    font-size: 14px;
    font-weight: 500;
    cursor: pointer;
    transition: background-color 0.2s;
    text-transform: uppercase;
    letter-spacing: 0.5px;
}

#components-reconnect-modal button:hover {
    background-color: #c01234;
}

#components-reconnect-modal button:active {
    background-color: #a01028;
}
82  DiunaBI.UI.Web/wwwroot/js/reconnect.js  Normal file
@@ -0,0 +1,82 @@
// Blazor Server Reconnection Timer
(function() {
    let reconnectTimer = null;
    let startTime = null;

    function startTimer() {
        if (reconnectTimer) return; // Already running

        console.log('Blazor reconnection started, timer running...');
        startTime = Date.now();

        reconnectTimer = setInterval(() => {
            const elapsedSeconds = Math.floor((Date.now() - startTime) / 1000);
            const timerElement = document.getElementById('reconnect-elapsed-time');

            if (timerElement) {
                timerElement.textContent = `${elapsedSeconds}s`;
            }
        }, 1000);
    }

    function stopTimer() {
        if (reconnectTimer) {
            console.log('Blazor reconnection ended, stopping timer');
            clearInterval(reconnectTimer);
            reconnectTimer = null;

            // Reset timer display
            const timerElement = document.getElementById('reconnect-elapsed-time');
            if (timerElement) {
                timerElement.textContent = '0s';
            }
        }
    }

    function checkReconnectionState() {
        const modal = document.getElementById('components-reconnect-modal');

        if (!modal) return;

        // Check if modal has the "show" class (Blazor applies this when reconnecting)
        if (modal.classList.contains('components-reconnect-show')) {
            startTimer();
        } else {
            stopTimer();
        }
    }

    // MutationObserver to watch for class changes on the modal
    const observer = new MutationObserver((mutations) => {
        mutations.forEach((mutation) => {
            if (mutation.type === 'attributes' && mutation.attributeName === 'class') {
                checkReconnectionState();
            }
        });
    });

    // Start observing when DOM is ready
    function init() {
        const modal = document.getElementById('components-reconnect-modal');

        if (modal) {
            observer.observe(modal, {
                attributes: true,
                attributeFilter: ['class']
            });

            // Check initial state
            checkReconnectionState();
            console.log('Blazor reconnection timer initialized');
        } else {
            console.warn('components-reconnect-modal not found, retrying...');
            setTimeout(init, 100);
        }
    }

    if (document.readyState === 'loading') {
        document.addEventListener('DOMContentLoaded', init);
    } else {
        init();
    }
})();
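The two decisions this script makes — whether the timer should be running, and how elapsed time is rendered into `#reconnect-elapsed-time` — can be isolated as pure functions. The sketch below is for illustration only (these helper functions are not part of reconnect.js); it uses the same `components-reconnect-show` class name and the same `${elapsedSeconds}s` formatting the script relies on.

```javascript
// Sketch only: pure versions of the checks reconnect.js performs.
// 'components-reconnect-show' is the class Blazor applies while reconnecting.
function shouldRunTimer(classNames) {
    return classNames.includes('components-reconnect-show');
}

// Mirrors the `${elapsedSeconds}s` rendering of #reconnect-elapsed-time.
function formatElapsed(startMs, nowMs) {
    return `${Math.floor((nowMs - startMs) / 1000)}s`;
}

console.log(shouldRunTimer(['components-reconnect-show'])); // true
console.log(formatElapsed(0, 12500)); // 12s
```

Factoring the logic this way would make it unit-testable without a DOM; the in-page version necessarily stays coupled to `classList` and `getElementById`.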