Pattern selection guide
Choose the right pattern based on how your source system delivers data:

| Source system delivers… | Integration pattern | Use when |
|---|---|---|
| Inbound API calls (webhooks) | Vendor webhooks | Payment processors, fraud systems, third-party services send notifications |
| Messages to a broker | Event streaming | Core banking, payment systems publish to Kafka, JMS, or similar |
| Files via SFTP/FTP | File processing | Batch transaction files, statements, or regulatory reports |
| Minimal notifications | Event enrichment | Systems send IDs only, requiring API calls for full data |
| Events for multiple systems | Multi-consumer | One event must reach digital channels, CRM, analytics, and more |
Patterns work together
These patterns complement each other. A single integration might use file processing for batch data, webhooks for real-time updates, and multi-consumer distribution to reach all downstream systems.
1. Vendor webhooks
Receive real-time notifications from external services through secure HTTP endpoints.
Pattern overview
Payment vendors, fraud detection systems, and third-party services often require your system to expose HTTP endpoints (webhooks) where they can push real-time notifications.
When to use
- Payment processors send status updates
- Fraud systems push alerts
- External services notify of events
- Real-time notifications required
Benefits
- Immediate notification delivery
- No polling overhead
- Real-time customer updates
- Reduced API call costs
How it works
Webhook registration
Your connector exposes a secure HTTPS endpoint where the vendor can send notifications. Register this endpoint with the vendor’s system.
Receive notification
Vendor sends HTTP POST request to your webhook endpoint with event payload (payment status, fraud alert, etc.).
Validate and process
Your connector validates the webhook signature, authenticates the request, and processes the payload.
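The validation step can be sketched with a standard HMAC signature check. This is a minimal illustration, not Grand Central's actual connector API: the shared secret, payload shape, and signature encoding are assumptions, and each vendor documents its own signing scheme.

```python
import hmac
import hashlib

def verify_webhook_signature(payload: bytes, signature: str, secret: str) -> bool:
    """Recompute the HMAC-SHA256 of the raw request body and compare it
    to the vendor-supplied signature in constant time."""
    expected = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

# Hypothetical vendor payload signed with a shared secret
body = b'{"paymentId": "p-123", "status": "completed"}'
secret = "shared-webhook-secret"
good_sig = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()

assert verify_webhook_signature(body, good_sig, secret)
assert not verify_webhook_signature(body, "tampered", secret)
```

Using `hmac.compare_digest` rather than `==` avoids timing side channels; a connector would typically reject the request with HTTP 401 before any payload processing when the check fails.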
Implementation example
Scenario: Alacriti payment processor sends payment status updates via webhook
Configuration approach:
- Build a webhook connector using Grand Central’s connector framework
- Implement signature validation for security
- Transform Alacriti’s payload to your canonical payment event format
- Configure routing to digital banking, CRM, and analytics systems
Common use cases
Typical use cases for webhook integration include:
- Payment status updates - real-time payment processing status from Alacriti/Volante. Track payment lifecycle from initiated → sent → processed → completed. Includes failed payment notifications with reason codes, settlement confirmations, and chargeback notifications. Business value: Immediate customer notification and reduced support inquiries.
- Fraud detection alerts - real-time fraud alerts from security vendors. Receive transaction flags for review, account suspension recommendations, suspicious activity pattern detection, and velocity threshold breaches. Business value: Immediate fraud prevention and reduced losses.
- Account verification - third-party KYC/AML verification results. Get identity verification completion, document validation results, risk scoring updates, and compliance status changes. Business value: Faster onboarding and automated compliance.
2. Event streaming
Consume events from message brokers like Kafka, JMS, or Azure Service Bus.
Pattern overview
Core banking systems, payment processors, and enterprise applications often publish events to message brokers like Kafka, JMS, or Azure Service Bus. Your connectors consume these events and distribute them through Sync Hub.
When to use
- Core banking publishes to Kafka
- Payment systems use JMS queues
- Vendor provides message broker access
- High-volume event streams
Benefits
- Scalable event processing
- Replay capabilities
- Decoupled architecture
- Built-in fault tolerance
How it works
Connect to broker
Your connector establishes connection to the vendor’s message broker (Kafka, JMS, Azure Service Bus, etc.).
Subscribe to topics
Subscribe to relevant topics/queues containing the events your systems need (transactions, account updates, etc.).
Consume and transform
Consume messages from the broker, validate schemas, and transform to your canonical format.
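The consume-and-transform step reduces to a pure mapping function once a message is off the broker. The field names and canonical event shape below are invented for illustration; a real connector would pair this with the actual broker client (e.g. a Kafka consumer) and the organization's agreed schema.

```python
from datetime import datetime, timezone

# Hypothetical required fields for an incoming transaction message
REQUIRED_FIELDS = {"txn_id", "account", "amount", "currency"}

def to_canonical(raw: dict) -> dict:
    """Validate a raw broker message and map it to a canonical transaction
    event; raise ValueError so the caller can route bad messages to a
    dead letter queue instead of dropping them silently."""
    missing = REQUIRED_FIELDS - raw.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return {
        "eventType": "transaction.posted",
        "transactionId": raw["txn_id"],
        "accountId": raw["account"],
        "amount": {"value": raw["amount"], "currency": raw["currency"]},
        "receivedAt": datetime.now(timezone.utc).isoformat(),
    }

event = to_canonical({"txn_id": "t-1", "account": "a-9",
                      "amount": "25.00", "currency": "USD"})
```

Keeping validation and transformation in one side-effect-free function makes the dead-letter path a simple try/except around it in the consumer loop.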
Implementation example
Scenario: Core banking publishes account transaction events to Kafka
Configuration approach:
- Build an event consumer connector using Grand Central’s framework
- Configure Kafka/JMS connection details and authentication
- Implement schema validation for incoming messages
- Transform to canonical transaction event format
- Configure dead letter handling for processing failures
Common use cases
Typical use cases for event streaming include:
- Transaction events - core banking publishes all account transactions to Kafka. Includes deposits and withdrawals, ATM transactions, point-of-sale purchases, wire transfers, and ACH. Business value: Real-time transaction visibility across all channels.
- Account lifecycle events - account management system publishes state changes. Track account opened, status changed (active, dormant, closed), account holder information updated, and service agreements modified. Business value: Consistent account data across all systems.
- Payment lifecycle events - payment processing system publishes payment state changes. Follow payment order created, payment approved/rejected, payment sent to processor, and payment completed/failed. Business value: Complete payment tracking and audit trail.
3. File processing
Process batch files from SFTP, S3, or Azure Blob Storage into event streams.
Pattern overview
Many banking systems still exchange data via files - batch transaction files, account statements, regulatory reports. Sync Hub processes these files, breaks them into individual events, and distributes to consuming systems.
When to use
- Daily transaction batch files
- Account statement generation
- Regulatory reporting files
- Legacy system integrations
Benefits
- Handles large data volumes
- Automated validation
- Error recovery mechanisms
- Audit trail for compliance
How it works
File detection
Your file processor connector detects new files using polling, file system events, or cloud notifications.
Parse and validate
Parse file format (CSV, XML, fixed-width, etc.), validate structure, and extract individual records.
Event generation
Create individual events for each record in the file, transform to canonical format.
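The parse-and-generate steps can be sketched with the stdlib csv module: each row of a (hypothetical) daily transaction file becomes one canonical event. The column names and event shape are assumptions for illustration, not a real file specification.

```python
import csv
import io

# A hypothetical two-record daily transaction file
SAMPLE_FILE = """txn_id,account,amount,currency
t-100,a-1,50.00,USD
t-101,a-2,-12.50,USD
"""

def file_to_events(contents: str) -> list[dict]:
    """Parse a CSV transaction file and emit one canonical event per record."""
    events = []
    for row in csv.DictReader(io.StringIO(contents)):
        events.append({
            "eventType": "transaction.posted",
            "transactionId": row["txn_id"],
            "accountId": row["account"],
            "amount": {"value": row["amount"], "currency": row["currency"]},
        })
    return events

events = file_to_events(SAMPLE_FILE)
assert len(events) == 2
```

A production processor would stream large files rather than load them whole, and would quarantine rows that fail validation instead of aborting the batch.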
Implementation example
Scenario: Core banking sends daily transaction file via SFTP
Configuration approach:
- Build a file processor connector using Grand Central’s framework
- Configure SFTP/FTP connection or cloud storage monitoring
- Implement file format parsing (CSV, XML, fixed-width, etc.)
- Add validation rules for file structure and content
- Configure error handling for malformed files
Common use cases
Typical use cases for file processing include:
- Daily transaction files - core banking sends overnight transaction file containing all transactions for the previous day. Supports batch processing of large volumes with validated and reconciled data, handling multiple transaction types in a single file. Business value: Bulk data synchronization with reduced API calls.
- Account statement generation - monthly account statements as structured files. Includes account balances, transaction history, interest calculations, and fee assessments. Business value: Automated statement delivery to digital channels.
- Batch payment files - batch payment instructions from corporate customers. Contains multiple payment orders in file format with validation and approval workflow, processing and status tracking, and results file generation. Business value: Efficient bulk payment processing.
4. Event enrichment
Enhance events with additional data from APIs, databases, or caches.
Pattern overview
Some systems send lightweight notifications containing only identifiers to reduce bandwidth. Your connector must make API calls to retrieve full data, enrich it with additional context, then publish complete events.
When to use
- Vendors send minimal notifications
- Full data requires API calls
- Additional context needed
- Data aggregation from multiple sources
Benefits
- Reduced notification bandwidth
- Enriched data for consumers
- Consistent data format
- Single point of enrichment
How it works
Receive notification
Vendor sends lightweight notification with just identifiers (transaction ID, customer ID, etc.).
Data aggregation
Combine data from multiple API calls, add contextual information, apply business rules.
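The aggregation step can be sketched as an orchestration function over several lookups, with caching on the slow-changing ones. The fetchers below are stand-ins invented for illustration; a real connector would make authenticated HTTP calls with timeouts and retries.

```python
from functools import lru_cache

def fetch_payment(payment_id: str) -> dict:
    """Stand-in for a payment-details API call."""
    return {"paymentId": payment_id, "amount": "99.00", "customerId": "c-7"}

@lru_cache(maxsize=1024)  # customer profiles change rarely, so cache them
def fetch_customer(customer_id: str) -> dict:
    """Stand-in for a customer-profile API call."""
    return {"customerId": customer_id, "segment": "retail"}

def enrich(notification: dict) -> dict:
    """Turn a lightweight {'paymentId': ...} notification into a
    complete event by chaining the two lookups."""
    payment = fetch_payment(notification["paymentId"])
    customer = fetch_customer(payment["customerId"])
    return {"eventType": "payment.updated", **payment, "customer": customer}

full_event = enrich({"paymentId": "p-1"})
```

The cache on `fetch_customer` is what delivers the "reduced API load" benefit listed above; the payment lookup stays uncached because its data changes with every status update.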
Implementation example
Scenario: Payment processor sends transaction ID, connector enriches with full payment details
Configuration approach:
- Build enrichment connector using Grand Central’s framework
- Configure API endpoints and authentication
- Implement orchestration logic for multiple API calls
- Add caching to reduce API load
- Handle API failures gracefully with retry logic
Common use cases
Typical use cases for enrichment include:
- Transaction notifications - fraud system sends transaction ID for flagged transaction. Enrichment retrieves transaction details from payment system, fetches customer profile and history, gets account information and limits, and adds risk scoring data. Business value: Complete context for fraud investigation.
- Customer update notifications - vendor sends customer ID for profile update. Enrichment fetches complete customer profile, retrieves account associations, gets preference settings, and adds segmentation data. Business value: Full customer view for all consuming systems.
- Payment status updates - processor sends payment reference for status change. Enrichment retrieves payment order details, fetches beneficiary information, gets routing and settlement data, and adds fee calculations. Business value: Complete payment information for notifications.
5. Multi-consumer distribution
Distribute events to multiple consuming systems with independent delivery guarantees.
Pattern overview
A single data event often needs to reach multiple consuming systems - digital banking channels, CRM, analytics, compliance, and more. Sync Hub’s multi-consumer pattern ensures each system receives the event reliably without creating point-to-point integrations.
When to use
- Multiple systems need same data
- Each consumer has different needs
- Avoid point-to-point integration complexity
- Independent consumer scaling
Benefits
- Single publish, multiple deliveries
- Reduced integration complexity
- Independent consumer evolution
- Centralized monitoring
How it works
Single event published
Producer connector publishes one event to Sync Hub (transaction, customer update, payment status, etc.).
Fan-out distribution
Sync Hub automatically distributes the event to all configured consumers simultaneously.
Consumer-specific processing
Each consumer receives the event and processes it according to their needs - different transformations, filtering, timing.
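The fan-out behavior can be sketched with a dispatcher that isolates consumer failures: one consumer being down must not block delivery to the others. Consumer names and handlers here are illustrative; Sync Hub's actual delivery machinery is not shown.

```python
import logging

def fan_out(event: dict, consumers: dict) -> dict:
    """Deliver one event to every registered consumer, recording a
    per-consumer outcome so each can retry independently."""
    results = {}
    for name, handler in consumers.items():
        try:
            handler(event)
            results[name] = "delivered"
        except Exception as exc:  # isolate failures per consumer
            logging.warning("delivery to %s failed: %s", name, exc)
            results[name] = "failed"
    return results

delivered = []

def broken_consumer(event):
    raise RuntimeError("analytics store unavailable")

consumers = {
    "digital_banking": delivered.append,
    "crm": delivered.append,
    "analytics": broken_consumer,
}
status = fan_out({"eventType": "transaction.posted"}, consumers)
```

Here `status` records that digital banking and CRM received the event while analytics failed, which is exactly the independent-delivery guarantee the pattern promises.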
Implementation example
Scenario: Single transaction event distributed to four consuming systems
Configuration approach:
- Producer publishes transaction event once to Sync Hub
- Configure four consumer connectors, each subscribed to transaction events
- Each consumer implements its own transformation logic
- Each consumer has independent retry and error handling
- Monitor each consumer’s health separately
Common use cases
Typical use cases for multi-consumer distribution include:
- Transaction distribution - ATM withdrawal event reaches multiple consumers simultaneously. Digital banking displays in transaction history, CRM updates customer interaction timeline, fraud detection analyzes for suspicious patterns, analytics adds to data warehouse for reporting, and compliance maintains audit trail for regulatory reporting. Business value: Single event drives comprehensive system updates.
- Customer profile updates - customer changes address in branch system. Digital banking updates profile display, CRM updates customer record, marketing updates segmentation data, compliance updates KYC record, and document management updates mailing address. Business value: Consistent customer data across all systems.
- Payment status changes - payment processor updates payment status. Digital banking sends customer notification, CRM updates payment history, accounting creates journal entry, reconciliation tracks settlement, and reporting captures payment analytics. Business value: Complete payment lifecycle tracking.
Pattern combinations
Real-world integrations often combine multiple patterns for comprehensive data synchronization. The following example shows how patterns work together.
Example: Complete payment processing integration
Patterns used:
- File processing: Daily settlement file with batch transactions
- Webhooks: Real-time payment status notifications
- Event streaming: Transaction lifecycle events from payment system
- Enrichment: Add customer and account context
- Multi-consumer: Distribute to all consuming systems
Choosing the right pattern
Use this decision tree to select the appropriate integration pattern:
How does the source system deliver data?
- HTTP POST (webhook) → Use vendor webhooks pattern
- Message broker → Use event streaming pattern
- File upload → Use file processing pattern
- Notification with ID → Use event enrichment pattern
Do multiple systems need this data?
- Yes → Add multi-consumer distribution
- No → Direct point-to-point may be sufficient
Is data complete in the notification?
- No, need API calls → Add event enrichment
- Yes → Proceed with base pattern
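For teams that script their integration scaffolding, the decision tree can be encoded as a small helper. The labels mirror the bullets above and are illustrative only.

```python
# Base pattern chosen by delivery mechanism (first question in the tree)
DELIVERY_TO_PATTERN = {
    "http_post": "vendor webhooks",
    "message_broker": "event streaming",
    "file_upload": "file processing",
    "id_only_notification": "event enrichment",
}

def choose_patterns(delivery: str, multiple_consumers: bool,
                    data_complete: bool) -> list[str]:
    """Apply the decision tree: pick the base pattern, then layer on
    enrichment and multi-consumer distribution as needed."""
    patterns = [DELIVERY_TO_PATTERN[delivery]]
    if not data_complete and patterns[0] != "event enrichment":
        patterns.append("event enrichment")
    if multiple_consumers:
        patterns.append("multi-consumer distribution")
    return patterns

result = choose_patterns("message_broker", multiple_consumers=True,
                         data_complete=False)
```

For a core banking system that publishes incomplete events to Kafka for several downstream systems, this yields event streaming plus enrichment plus multi-consumer distribution - the same stacking shown in the pattern-combinations example.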