The Nexus Engine Architecture
High-Availability Hybrid Cloud System Design
graph TD
%% Styles
classDef aws fill:#FF9900,stroke:#232F3E,stroke-width:2px,color:white;
classDef n8n fill:#EA4B71,stroke:#333,stroke-width:2px,color:white;
classDef ai fill:#10a37f,stroke:#333,stroke-width:2px,color:white;
classDef db fill:#336791,stroke:#333,stroke-width:2px,color:white;
classDef user fill:#6366f1,stroke:#333,stroke-width:2px,color:white;
classDef render fill:#f59e0b,stroke:#333,stroke-width:2px,color:white;
subgraph Client_Layer [Client Interaction]
Dashboard[User Dashboard / Next.js]:::user
UploadPortal[Secure Upload Portal]:::user
Approver[Mobile Approval Interface]:::user
end
subgraph Orchestration [Orchestration Layer]
N8N[N8N Gateway & Logic Router]:::n8n
end
subgraph AWS_Cloud [AWS Infrastructure]
S3_Raw[S3: Raw Assets]:::aws
S3_Final[S3: Rendered Video]:::aws
Lambda_Vision[Lambda: Computer Vision]:::aws
Lambda_Sync[Lambda: Beat Sync / Librosa]:::aws
SQS[Message Queue]:::aws
end
subgraph Data_Layer [Data & Memory]
Postgres[(PostgreSQL: User/Biz Data)]:::db
Pinecone[(Pinecone: Vector/Brand Memory)]:::db
end
subgraph AI_Services [External AI Intelligence]
GPT4[OpenAI GPT-4o: Scripting]:::ai
Banana[Banana.dev: Auto-Tagging]:::ai
Gemini[Nano Banana: Fallback Gen]:::ai
Runway[Runway: Img-to-Video]:::ai
end
subgraph Factory [Rendering Factory]
Shotstack[Shotstack API]:::render
end
subgraph Distribution [Distribution]
Ayrshare[Ayrshare API: Socials]:::render
GoogleBiz[Google Biz Profile: Maps]:::render
end
%% Flow 1: Ingestion
UploadPortal -- "1. Upload 4K Video (Presigned URL)" --> S3_Raw
S3_Raw -- "2. Trigger Event" --> Lambda_Vision
Lambda_Vision -- "3. Analyze Frames" --> Banana
Banana -- "4. Return Tags [Sunset, Oysters]" --> Pinecone
%% Flow 2: Trigger & Logic
Review_Webhook(Google Review Webhook) --> N8N
N8N -- "5. Fetch Context" --> Postgres
N8N -- "6. Semantic Search" --> Pinecone
Pinecone -- "7. Return Best Asset IDs" --> N8N
N8N -- "8. Generate JSON Script" --> GPT4
%% Flow 3: Fallback Logic
N8N -- "9. If Asset Missing" --> Gemini
Gemini -- "10. Gen Static Img" --> Runway
Runway -- "11. Return Video URL" --> N8N
%% Flow 4: Assembly
N8N -- "12. Send Audio + Clips" --> Lambda_Sync
Lambda_Sync -- "13. Return Beat Timestamps" --> N8N
N8N -- "14. Send Final JSON" --> Shotstack
Shotstack -- "15. Render 4K" --> S3_Final
%% Flow 5: Delivery
S3_Final -- "16. Notify User" --> Approver
Approver -- "17. Approve" --> N8N
N8N -- "18. Publish" --> Ayrshare
N8N -- "19. Update Maps" --> GoogleBiz
1. The Ingestion Pipeline
Goal: Turn dumb files into smart data.
Files are uploaded directly to AWS S3 via presigned URLs, bypassing the application server entirely. The upload event triggers a computer-vision worker (Banana.dev) that tags the footage (e.g., "Luxury," "Oysters") and stores the resulting embeddings in Pinecone.
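A minimal sketch of the tagging hand-off (step 4 in the diagram), assuming a Pinecone-style upsert record; the embedding values and the ID-derivation scheme here are illustrative, not the system's actual ones:

```python
import hashlib
import json

def build_asset_record(asset_key: str, tags: list[str], embedding: list[float]) -> dict:
    """Package a tagged S3 asset as a Pinecone-style upsert record.

    The vector ID is derived from the S3 key, so re-tagging the same
    asset overwrites its old entry instead of duplicating it.
    """
    vector_id = hashlib.sha256(asset_key.encode()).hexdigest()[:16]
    return {
        "id": vector_id,
        "values": embedding,  # vector produced by the tagging model
        "metadata": {"s3_key": asset_key, "tags": tags},
    }

record = build_asset_record(
    "raw/oyster-bar-sunset.mp4",
    ["Sunset", "Oysters"],
    [0.12, -0.03, 0.88],  # toy 3-dim embedding for illustration only
)
print(json.dumps(record, indent=2))
```

Keying the vector ID to the S3 object path keeps the vector store idempotent under re-ingestion, which matters when the same footage is re-analyzed after a model upgrade.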
2. The Nexus Brain
Goal: Context-aware scripting.
N8N acts as the router. It pulls the client's "Brand Voice" profile from the vector database and uses GPT-4o to write a script. If a required asset is missing, the Fallback Protocol (Gemini image generation followed by Runway image-to-video) generates the missing shot on the fly.
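The fallback decision (steps 7 and 9) can be sketched as a threshold check on the semantic-search results. The match shape mirrors a Pinecone query response; the 0.75 cutoff is an assumed value that would need tuning against real scores:

```python
FALLBACK_THRESHOLD = 0.75  # assumed cutoff; tune against real match scores

def pick_asset(matches: list[dict], threshold: float = FALLBACK_THRESHOLD) -> dict:
    """Return the best-matching asset ID, or a fallback directive.

    `matches` mirrors the shape of a Pinecone query response:
    [{"id": ..., "score": ...}, ...] (order does not matter).
    """
    best = max(matches, key=lambda m: m["score"], default=None)
    if best is None or best["score"] < threshold:
        # No usable footage: route to Gemini (static image) then Runway (img-to-video).
        return {"action": "generate", "route": ["gemini", "runway"]}
    return {"action": "use_asset", "asset_id": best["id"]}

print(pick_asset([{"id": "clip-42", "score": 0.91}]))  # strong match: reuse footage
print(pick_asset([{"id": "clip-07", "score": 0.40}]))  # weak match: trigger fallback
```

Encoding the decision as data (`action` + `route`) rather than calling the generators directly keeps the routing logic inside N8N, where the diagram places it.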
3. The Video Factory
Goal: Broadcast quality, rendered in the cloud.
We use a Python microservice built on Librosa to analyze the music track and locate its downbeats. Shotstack uses these timestamps to cut the video exactly on the musical rhythm, using the 4K assets stored in S3.
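Steps 13–14 can be sketched as mapping the returned beat timestamps onto an edit payload. The `timeline`/`tracks`/`clips` nesting follows Shotstack's documented edit shape, but the specific `output` values and the round-robin clip assignment are assumptions of this sketch:

```python
def beats_to_timeline(beat_times: list[float], clip_urls: list[str]) -> dict:
    """Map downbeat timestamps onto a Shotstack-style edit payload.

    Each clip occupies the interval between consecutive downbeats, so
    every cut lands exactly on the beat. Clips are assigned round-robin
    when there are more beats than source files.
    """
    clips = []
    for i, start in enumerate(beat_times[:-1]):
        clips.append({
            "asset": {"type": "video", "src": clip_urls[i % len(clip_urls)]},
            "start": start,
            "length": round(beat_times[i + 1] - start, 3),
        })
    return {
        "timeline": {"tracks": [{"clips": clips}]},
        "output": {"format": "mp4", "resolution": "hd"},  # output values assumed
    }

edit = beats_to_timeline([0.0, 0.52, 1.04, 1.55], ["s3://raw/a.mp4", "s3://raw/b.mp4"])
```

Because each clip's `length` is the gap to the next downbeat, the rendered cut rate tracks the tempo automatically: a faster track yields shorter clips without any change to the assembly logic.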
4. Distribution
Goal: Omnipresence.
Once approved via the mobile UI, the system pushes the video to Ayrshare (for Instagram/TikTok) and to the Google Business Profile API (for Search/Maps), injecting SEO metadata into the upload.
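The publish call (step 18) can be sketched as assembling a single request body in the spirit of Ayrshare's post endpoint. The field names echo Ayrshare's API, but treat the exact payload shape and the hashtag-injection scheme as assumptions of this sketch:

```python
def build_publish_payload(video_url: str, caption: str, keywords: list[str]) -> dict:
    """Assemble a social publish request, folding SEO keywords into
    the caption as hashtags so one payload serves both platforms."""
    hashtags = " ".join(f"#{k.replace(' ', '')}" for k in keywords)
    return {
        "post": f"{caption}\n\n{hashtags}",
        "platforms": ["instagram", "tiktok"],
        "mediaUrls": [video_url],
    }

payload = build_publish_payload(
    "https://example-bucket.s3.amazonaws.com/final/render.mp4",  # hypothetical URL
    "Fresh off the grill tonight.",
    ["oyster bar", "sunset dining"],
)
```

Collapsing multi-word keywords into single-token hashtags keeps the metadata usable across platforms that tokenize tags differently; the Google Business Profile update would be a separate call with its own schema.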