System Architecture Overview
QuivaWorks is built on a streaming-first architecture where everything—from agent conversations to workflow executions—is represented as ordered, persistent streams of events. This foundational design makes QuivaWorks uniquely suited for conversational AI and intelligent automation.

Why Streams for AI? Conversations are inherently streaming and temporal. By building on streams from the ground up, QuivaWorks naturally captures the conversational flow of AI interactions, maintains complete context history, and enables real-time processing without architectural complexity.
Stream Foundation: The Core Architecture
At the heart of QuivaWorks is an event streaming platform that treats all data as ordered sequences of immutable events. This isn’t just a storage layer—it’s the architectural foundation that powers every component.

Why Streams Are Perfect for AI Agents
Conversational Flow
Natural conversation modeling. AI conversations are streaming by nature—messages flow back and forth in order. QuivaWorks’ streaming architecture captures this naturally without translation layers. Each conversation turn is an event in a stream, preserving context and enabling replay.
Complete Context
Temporal state reconstruction. Streams maintain complete history. Agents can “rewind” to understand past context, replay decision points, and learn from previous interactions. State isn’t stored—it’s derived from event history through aggregation.
Real-time by Default
No polling, no delays. Stream subscriptions provide instant notifications. Agents respond in real-time as events occur, without polling databases or APIs. WebSocket connections stream responses as they’re generated.
Audit & Compliance
Immutable event log. Every agent decision, API call, and data access is an event. Complete audit trail is automatic, not bolted on. Regulatory compliance through built-in event sourcing.
Account-Based Multi-Tenancy
Think of Accounts as Applications, Not Users. Each account is an isolated messaging container for one application. This architectural choice simplifies security and enables clean subject namespaces.
- How It Works
- Simplified Security
- Use Cases

How It Works: subject namespace per account. Each account has its own isolated subject namespace. Messages published in Account A are completely invisible to Account B—no shared global subject space. No naming collisions: two accounts can use identical subject names without conflict because they operate in separate namespaces.
Core Streaming Concepts
- Subjects & Routing
- Event Sourcing
- Stream Aggregation
- Replay & Time Travel
Subjects & Routing: hierarchical message routing. Every event is published to a subject (like agents.conversation-123.message or workflows.order-flow.started). Streams listen to subject patterns and capture matching events. Subject patterns enable:
- Namespace organization: agents.>, workflows.>, data.>
- Entity-specific streams: agents.{agent_id}.> captures all events for one agent
- Event filtering: workflows.*.completed captures only completion events
- Cross-cutting concerns: *.errors.> captures all errors system-wide

Note that agents.> in Account A is completely separate from agents.> in Account B. This subject-based routing is how agents find relevant context and how workflows coordinate across steps.
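To make the wildcard semantics concrete, the sketch below implements a matcher consistent with the examples above, assuming * matches exactly one dot-separated token and > matches all remaining tokens. The function name and rules are illustrative, not part of a QuivaWorks SDK.

```python
def subject_matches(pattern: str, subject: str) -> bool:
    """Illustrative matcher for the wildcard patterns shown above.

    Assumptions (not an official QuivaWorks API):
      - '*' matches exactly one dot-separated token
      - '>' matches one or more remaining tokens and must appear last
    """
    p_tokens = pattern.split(".")
    s_tokens = subject.split(".")
    for i, token in enumerate(p_tokens):
        if token == ">":
            return len(s_tokens) > i          # '>' needs at least one more token
        if i >= len(s_tokens):
            return False                       # subject ran out of tokens
        if token != "*" and token != s_tokens[i]:
            return False                       # literal token mismatch
    return len(s_tokens) == len(p_tokens)      # no wildcard tail: lengths must match


# The patterns from the list above:
assert subject_matches("agents.>", "agents.123.conversations.abc.user-message")
assert subject_matches("workflows.*.completed", "workflows.order-flow.completed")
assert not subject_matches("workflows.*.completed", "workflows.order-flow.started")
assert subject_matches("*.errors.>", "agents.errors.timeout")
```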
How Streams Power AI Agents

Every aspect of agent operation is built on streams:

Conversation Streams
Subject pattern: agents.{agent_id}.conversations.{conversation_id}.>

Each conversation is a stream of events:
- User messages: agents.123.conversations.abc.user-message
- Agent responses: agents.123.conversations.abc.agent-response
- Tool calls: agents.123.conversations.abc.tool-call
- Context updates: agents.123.conversations.abc.context-update
- Real-time streaming responses via WebSocket subscriptions
- Complete conversation history for context
- Replay conversations for debugging or training
- Branch conversations for A/B testing responses
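As a small illustration of replaying a conversation stream to rebuild context, the pure-Python sketch below folds over a conversation’s events in order. The payload fields are assumptions inferred from the subject names above, not a documented schema.

```python
# Minimal sketch (pure Python, no SDK): rebuilding an agent's context by
# replaying one conversation stream, oldest event first. Payload fields are
# assumed for illustration.
conversation_events = [
    {"subject": "agents.123.conversations.abc.user-message",
     "data": {"role": "user", "text": "Where is my order?"}},
    {"subject": "agents.123.conversations.abc.tool-call",
     "data": {"tool": "lookup_order", "args": {"order_id": "A-42"}}},
    {"subject": "agents.123.conversations.abc.agent-response",
     "data": {"role": "assistant", "text": "Your order ships tomorrow."}},
]

def build_context(events):
    """Derive conversational context from the event history."""
    context = []
    for event in events:
        kind = event["subject"].rsplit(".", 1)[-1]   # last subject token, e.g. "tool-call"
        context.append({"type": kind, **event["data"]})
    return context

print(build_context(conversation_events))
```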
Agent Memory Streams

Subject pattern: agents.{agent_id}.memory.>

Agent memory is a stream of memory operations:
- Add memory: {type: "add", key: "user_preference", value: "dark_mode"}
- Update memory: {key: "user_preference", value: "light_mode"}
- Remove memory: {type: "unset", path: "user_preference"}
- Automatic persistence—no database writes
- Memory history and evolution tracking
- Selective forgetting through unset operations
- Memory replay for debugging or analysis
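Because memory state is derived rather than stored, the current memory is just a fold over the operation events. The sketch below is pure Python; the operation fields mirror the examples above, and the fold logic is an assumption about how such an aggregation could work.

```python
# Illustrative sketch: current agent memory derived by folding over its
# memory-operation events (no database writes). Event shapes follow the
# examples above; the exact schema is an assumption.
memory_events = [
    {"type": "add", "key": "user_preference", "value": "dark_mode"},
    {"type": "add", "key": "language", "value": "en"},
    {"key": "user_preference", "value": "light_mode"},   # update: last write wins
    {"type": "unset", "path": "language"},                # selective forgetting
]

def aggregate_memory(events):
    """Derive the current memory state from the full operation history."""
    state = {}
    for op in events:
        if op.get("type") == "unset":
            state.pop(op["path"], None)
        else:
            state[op["key"]] = op["value"]
    return state

print(aggregate_memory(memory_events))   # {'user_preference': 'light_mode'}
```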
Workflow Execution Streams

Subject pattern: workflows.{workflow_id}.executions.{execution_id}.>

Workflow execution is a stream of step events:
- Step started: {step: "call_api", status: "started"}
- Step completed: {step: "call_api", status: "completed", result: {...}}
- State updates: {variable: "order_total", value: 150.00}
- Errors: {step: "payment", error: "timeout"}
- Workflow state survives crashes—resume from stream
- Complete execution audit trail
- Debug failed workflows by replaying events
- Monitor workflows in real-time via stream subscriptions
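To illustrate how workflow state survives crashes, the sketch below reconstructs execution state by replaying the step events listed above. It is pure Python; the event shapes and the resulting state layout are assumptions, not the engine's actual internals.

```python
# Sketch: restoring workflow execution state after a crash by replaying the
# execution stream. Event shapes follow the examples above and are assumed.
execution_events = [
    {"step": "call_api", "status": "started"},
    {"step": "call_api", "status": "completed", "result": {"order_id": "A-42"}},
    {"variable": "order_total", "value": 150.00},
    {"step": "payment", "status": "started"},
    {"step": "payment", "error": "timeout"},
]

def restore_execution(events):
    """Aggregate step events into the latest known execution state."""
    state = {"steps": {}, "variables": {}, "errors": []}
    for ev in events:
        if "variable" in ev:
            state["variables"][ev["variable"]] = ev["value"]
        elif "error" in ev:
            state["errors"].append(ev)
        else:
            state["steps"][ev["step"]] = ev   # latest event per step wins
    return state

# After a restart, the engine can resume from the last completed step:
print(restore_execution(execution_events))
```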
Integration Event Streams

Subject pattern: integrations.{integration_id}.events.>

External system events flow through streams:
- Webhook received: {source: "stripe", event: "payment.succeeded"}
- API call: {endpoint: "/users", method: "GET", status: 200}
- Data sync: {table: "customers", action: "insert", record_id: "123"}
- Event-driven architecture without message brokers
- Guaranteed delivery and ordering
- Event replay for troubleshooting integrations
- Real-time integration monitoring
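The sketch below shows one plausible way an inbound webhook could be mapped onto an integration subject and event. The subject layout beyond the documented prefix, and the publish() call, are assumptions for illustration.

```python
# Sketch: turning an inbound webhook into an integration event. Only the
# integrations.{integration_id}.events.> prefix comes from the docs above;
# the rest of the subject layout and publish() are assumed.
def webhook_to_event(integration_id: str, source: str, payload: dict):
    subject = f"integrations.{integration_id}.events.{source}.{payload['type']}"
    event = {"source": source, "event": payload["type"], "data": payload.get("data", {})}
    return subject, event

subject, event = webhook_to_event(
    "int-42", "stripe", {"type": "payment.succeeded", "data": {"amount": 1500}}
)
print(subject)   # integrations.int-42.events.stripe.payment.succeeded
# publish(subject, event)   # hypothetical streaming-client call
```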
Application Layer
The Application Layer provides interfaces to interact with the streaming platform and build intelligent workflows.

Visual Flow Builder
Stream-aware workflow design
- Drag-and-drop workflow creation that compiles to stream operations
- Real-time execution monitoring via stream subscriptions
- Visual debugging with event timeline
- Template library powered by stream patterns
REST APIs
HTTP interface to streams
- Publish events via HTTP POST
- Query stream state via GET (aggregation behind the scenes)
- Webhook subscriptions to stream subjects
- OpenAPI specification for all endpoints
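A minimal sketch of what the HTTP interface above might look like from a client, using only the standard library. The base URL, paths, payload fields, and auth header are placeholders; the real endpoint names live in the OpenAPI specification mentioned above.

```python
# Hypothetical client-side sketch of the HTTP interface. URLs, paths, and
# payload fields are placeholders, not documented endpoints.
import json
import urllib.request

BASE = "https://api.example.com/v1"                      # assumed base URL
HEADERS = {"Content-Type": "application/json",
           "Authorization": "Bearer <token>"}             # assumed auth scheme

def publish_event(subject: str, data: dict):
    """Publish one event via HTTP POST."""
    body = json.dumps({"subject": subject, "data": data}).encode("utf-8")
    req = urllib.request.Request(f"{BASE}/events", data=body,
                                 headers=HEADERS, method="POST")
    return urllib.request.urlopen(req)

def get_stream_state(stream: str) -> dict:
    """Query aggregated stream state via GET (server folds the events)."""
    req = urllib.request.Request(f"{BASE}/streams/{stream}/state", headers=HEADERS)
    return json.load(urllib.request.urlopen(req))
```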
WebSocket Gateway
Direct stream subscriptions
- Subscribe to subjects for real-time events
- Stream agent responses as they’re generated
- Live workflow execution monitoring
- Real-time collaboration via shared stream subscriptions
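The sketch below subscribes to a subject pattern over WebSocket using the third-party websockets package. The gateway URL and the subscribe-message format are assumptions made for illustration; only the general subscribe-and-stream model comes from the text above.

```python
# Illustrative WebSocket subscription using the `websockets` package.
# The gateway URL and message format are assumed, not documented.
import asyncio
import json
import websockets

async def watch(subject_pattern: str):
    uri = "wss://gateway.example.com/streams"             # assumed URL
    async with websockets.connect(uri) as ws:
        # Assumed shape of a subscribe request:
        await ws.send(json.dumps({"op": "subscribe", "subject": subject_pattern}))
        async for raw in ws:                               # events arrive as they occur
            event = json.loads(raw)
            print(event.get("subject"), event.get("data"))

# asyncio.run(watch("workflows.order-flow.>"))             # live execution monitoring
```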
Marketplace
Stream pattern templates
- Pre-built workflow patterns (stream configurations)
- Agent templates with proven stream designs
- Community-shared integration patterns
- One-click deployment of stream architectures
Orchestration Layer
The Orchestration Layer executes workflows and manages agents, all built on stream foundations.

Smart Agents Engine
- Agent Runtime
- Validation System
- Context Management
Agent Runtime: stream-powered execution. Each agent instance subscribes to relevant stream subjects and publishes its actions as events:
- Input streams: User messages, context updates, tool results
- Output streams: Responses, tool calls, memory updates
- State management: Agent state derived from event aggregation
- Context building: Automatic from conversation and memory streams
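The loop below is a conceptual sketch of that runtime behavior: consume input events, derive context, and publish outputs back to the agent's streams. subscribe(), publish(), and llm() are injected placeholders, not real QuivaWorks APIs.

```python
# Conceptual sketch of the agent runtime loop. subscribe(), publish(), and
# llm() are placeholders for a streaming client and a model call.
def run_agent(agent_id: str, subscribe, publish, llm):
    input_pattern = f"agents.{agent_id}.conversations.>"      # input streams
    for event in subscribe(input_pattern):                     # events arrive in order
        if not event["subject"].endswith(".user-message"):
            continue
        conversation_id = event["subject"].split(".")[3]        # per the subject pattern above
        history = event.get("context", [])                      # context derived, not stored
        reply = llm(history + [event["data"]["text"]])
        publish(
            f"agents.{agent_id}.conversations.{conversation_id}.agent-response",
            {"role": "assistant", "text": reply},               # output stream
        )
```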
Workflow Engine
The workflow engine is a stream processor that coordinates multi-step processes by publishing and subscribing to stream subjects.

Event-Driven Execution
Workflows execute by reacting to stream events:
- Triggers: Subscribe to subjects like webhooks.> or schedules.> to start workflows
- Step execution: Each step publishes a completion event, triggering next steps
- Parallel branches: Multiple steps subscribe to same trigger event
- Conditional routing: Steps conditionally publish to different subjects based on data
- State transitions: Workflow state is the aggregation of step events
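The sketch below illustrates the step-chaining idea: a step runs when an upstream completion event arrives and then publishes its own completion event for downstream (or parallel) steps. The step-level subject naming and publish() are assumptions.

```python
# Sketch of event-driven step execution. The step-level subject naming and
# publish() are assumptions; only the "completion event triggers next steps"
# pattern comes from the text above.
def on_step_completed(event: dict, publish):
    # Triggered by something like:
    #   workflows.order-flow.executions.ex-1.call_api.completed
    if event["step"] == "call_api" and event["status"] == "completed":
        order = event["result"]
        total = sum(item["price"] for item in order["items"])   # this step's own work
        publish(
            "workflows.order-flow.executions.ex-1.calculate_total.completed",
            {"step": "calculate_total", "status": "completed",
             "result": {"order_total": total}},
        )
```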
Durable Execution
Workflow durability comes from streams:
- Crash recovery: Aggregate execution stream to restore state
- Exactly-once: Streams guarantee message delivery and ordering
- Checkpointing: Each completed step is an event; resume from any point
- Long-running: Workflows can pause/resume because state is in streams
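As a sketch of checkpoint-based resumption: since each completed step is already an event, restarting is just skipping the steps already found in the execution stream. Pure Python; the plan and step shapes are assumptions.

```python
# Sketch: resuming a workflow after a crash or pause. Each completed step is a
# checkpoint event, so only the unfinished steps run again. Shapes are assumed.
def resume(plan, completed_events, run_step):
    done = {ev["step"] for ev in completed_events if ev.get("status") == "completed"}
    for step in plan:                 # plan: ordered list of step names
        if step in done:
            continue                  # checkpointed: already completed, skip
        run_step(step)                # re-run only the remaining steps

# resume(["call_api", "calculate_total", "payment"],
#        [{"step": "call_api", "status": "completed"}],
#        run_step=print)
```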
Cross-Workflow Communication
Workflows communicate via streams:
- Parent-child: Parent publishes to workflows.child-123.control.start
- Data passing: Publish results to subjects the child subscribes to
- Synchronization: Multiple workflows wait on same event subject
- Fan-out/fan-in: One event triggers many workflows; collect responses via subject patterns
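A conceptual sketch of the parent-child case: the parent starts the child by publishing to its control subject (from the example above) and then waits for a completion event. publish() and subscribe() are placeholders, and the completion subject is an assumption.

```python
# Sketch of parent/child workflow coordination over subjects. Only
# workflows.child-123.control.start comes from the text; the completion
# subject, publish(), and subscribe() are assumed.
def start_child_and_wait(publish, subscribe, order_id: str):
    publish("workflows.child-123.control.start", {"order_id": order_id})
    for event in subscribe("workflows.child-123.executions.*.completed"):
        return event["result"]        # fan-in: collect the child's result
```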
MCP Server Architecture
MCP servers expose external systems to agents through stream interfaces. How it works:
- Agent publishes a tool call to the mcp.{server}.request subject
- MCP server subscribes to the request stream
- Server calls the external API and publishes the result to the mcp.{server}.response subject
- Agent subscribes to the response stream and receives the result
- All interactions are events in streams for audit and replay
- Asynchronous tool calls—agent doesn’t block
- Automatic retry via stream redelivery
- Complete tool call history in stream
- Easy to add caching, rate limiting via stream processors
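The sketch below follows that request/response flow. publish() and subscribe() are placeholders for a streaming client, and correlating request and response by an id field is an assumption rather than a documented protocol detail.

```python
# Conceptual sketch of an MCP tool call over request/response subjects.
# publish(), subscribe(), and the id-based correlation are assumptions.
import uuid

def call_tool(server: str, tool: str, args: dict, publish, subscribe):
    request_id = str(uuid.uuid4())
    publish(f"mcp.{server}.request",
            {"id": request_id, "tool": tool, "args": args})
    # The agent does not block its conversation; it resumes when the matching
    # response event arrives on the response subject.
    for event in subscribe(f"mcp.{server}.response"):
        if event.get("id") == request_id:
            return event["result"]
```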
Infrastructure Layer
The Infrastructure Layer provides the streaming platform and supporting services.

QuivaWorks Streaming Platform
- Stream Storage
- Subject Routing
- Stream Aggregation
- Stream Search
Stream Storage: persistent event logs
- File-based storage: Optimized for sequential writes and reads
- Memory option: In-memory streams for temporary data
- Retention policies: Automatic cleanup based on age, size, or message count
- Compression: Reduce storage costs while maintaining fast access
- Replication: Multi-replica streams for high availability
Compute & Functions
- Serverless Functions
- Container Orchestration
Serverless Functions: stream-triggered compute. Functions subscribe to stream subjects and execute when events arrive:
- Event handlers: Process events from streams
- Stream transformations: Convert events between formats
- Aggregation functions: Custom folding logic
- Side effects: Call external APIs, send notifications
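The handler below sketches what such a stream-triggered function could look like: it transforms an inbound integration event and returns a normalized event to publish onward. The decorator-style registration and the event and return shapes are hypothetical.

```python
# Sketch of a stream-triggered function. The registration decorator and the
# event/return shapes are hypothetical, used only to illustrate the
# "subscribe and execute per event" model.
# @on_event("integrations.*.events.stripe.>")        # hypothetical registration
def handle_stripe_event(event: dict) -> dict:
    """Transform an inbound Stripe event into a normalized payment record."""
    return {
        "subject": "data.payments.normalized",         # where to publish the result
        "data": {
            "amount": event["data"]["amount"],
            "status": event["event"].split(".")[-1],    # e.g. "succeeded"
        },
    }
```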
Security Architecture
QuivaWorks implements security at the stream level, with account isolation providing the foundation for multi-tenant security.
Account Isolation
Infrastructure-level tenant separation. This architectural isolation eliminates entire classes of multi-tenant security vulnerabilities.
- Subject namespace isolation: Each account has completely separate subject space
- Zero-configuration: No ACL rules needed for tenant separation
- Impossible to breach: Cannot publish/subscribe across accounts—enforced at infrastructure level
- User scoping: Users belong to accounts; credentials are account-specific
Subject-Based Access Control
Fine-grained permissions within accounts. Within an account, subject-based permissions control access:
- Publish permissions: Control who can publish to which subjects
- Subscribe permissions: Control who can read from which subjects
- Subject patterns: Grant access using wildcards like agents.{user_id}.>
- Dynamic ACLs: Update permissions without system restart
For example, a user can be granted agents.{their_id}.> but not other users’ agent streams.
Stream Encryption
End-to-end protection
- In-transit: TLS 1.3 for all stream communication
- At-rest: Encrypted stream storage with key rotation
- Per-account keys: Different encryption keys per account
- Key management: Integration with KMS systems
Audit Streams
Built-in compliance
- Access logging: All stream accesses recorded as events
- Change tracking: Every data modification is an event with timestamp and actor
- Immutable logs: Can’t delete or modify past events
- Compliance reporting: Query audit streams for regulatory reports
Stream-First Benefits for AI
The streaming architecture provides unique advantages for AI agent systems:

Natural Conversation Modeling
Conversations map directly to event streams. No impedance mismatch between how AI works (sequential, contextual) and how data is stored.
Automatic Context Management
Agent context is built by aggregating relevant streams. Add knowledge by publishing events—no manual context management.
Real-time Everything
Stream subscriptions enable real-time agent responses, workflow monitoring, and event reactions without polling or websocket complexity.
Infinite Scalability
Stateless agents and workflows scale horizontally. Add instances that subscribe to the same streams—load balancing is automatic.
Perfect Audit Trail
Every agent action is an immutable event. Compliance and debugging are built-in, not added later.
Time Travel Debugging
Replay event streams to understand “why did the agent do that?” Debug by aggregating history up to the problem point.
Zero-Config Multi-Tenancy
Account isolation provides secure multi-tenancy without complex ACLs or tenant ID patterns in subjects.
Resilience & Recovery
Crashes don’t lose data—streams persist. Agents and workflows resume by aggregating streams to restore state.
Deployment Models
QuivaCloud
Managed streaming platform
- Global stream infrastructure
- Automatic scaling and replication
- 99.9% uptime SLA
- Pay-per-event pricing
Private Cloud
Dedicated stream clusters
- Isolated stream infrastructure
- Custom retention and replication
- Enhanced SLAs
- Dedicated support
Hybrid/On-Premise
Self-hosted streams
- Deploy streaming platform on your infrastructure
- Full control over data locality
- Custom stream configurations
- Enterprise support included
Performance & Scaling
Stream-First Performance Characteristics
Sub-millisecond Latency
Events published to streams are delivered to subscribers in under 1ms within the same region.
Millions of Events/Second
Distributed stream architecture handles millions of events per second with linear scaling.
Horizontal Scaling
Add stream partitions and subscribers to scale throughput without limits.
Efficient Storage
Compressed, append-only logs use minimal storage while enabling fast queries.
Auto-Scaling
QuivaWorks automatically scales stream infrastructure:
- Stream partitioning: High-volume subjects automatically partition across servers
- Replica scaling: Add replicas as subscriber count increases
- Compute scaling: Add agent instances and function workers based on event backlog
- Storage scaling: Automatically provision storage as stream size grows
Monitoring & Observability
Comprehensive monitoring is built into the streaming platform.

- Real-time Metrics
- Logging & Tracing
- Business Intelligence

Real-time Metrics include:
- Stream throughput and latency per account
- Agent performance and quality metrics
- Workflow execution times and success rates
- Subject-level publish/subscribe rates