The Complete Guide to Composable Fintech Architecture: Building Scalable Banking Systems

January 12, 2026 · 35 min read

By Matt Anderton, Chief Technology Officer, Chisel


Table of Contents

  1. Introduction

  2. Understanding Composable Architecture

  3. The Modular Banking Stack

  4. API-First Architecture

  5. Microservices Architecture Patterns

  6. Database Architecture and Data Management

  7. Integration Patterns

  8. Developer Experience and Velocity

  9. Security and Compliance by Design

  10. Scalability and Performance

  11. Migration Strategies

  12. Building for the Future

  13. Frequently Asked Questions

  14. Related Resources

  15. About the Author


Introduction

The monolithic approach to fintech infrastructure is dead. Recent industry challenges have proven that rigid, tightly-coupled systems cannot adapt fast enough for modern financial services. When middleware providers face constraints or legacy system limitations, every company dependent on their architecture faces the same bottlenecks.

Traditional banking technology was built for a world of branches and batch processing—mainframes that updated overnight, paper-based processes, and quarterly feature releases. Modern fintech demands real-time, API-first, composable fintech architecture that can evolve without complete rebuilds.

I've architected systems at BluBox and dozens of other fintechs over the past decade. The biggest mistake I see companies make is treating infrastructure as a single vendor decision rather than an architecture decision. Composability isn't just a buzzword—it's the only sustainable approach to financial technology.

These challenges have exposed the limitations of monolithic middleware. When one component fails, everything fails. When one vendor makes a strategic pivot, all of its customers are forced to follow. When regulatory requirements change, everyone waits for the same vendor to update the same shared system.

What This Guide Covers:
- Composable fintech architecture principles that enable rapid iteration
- Modular banking stack design and component selection frameworks
- API-first architecture patterns and fintech API design best practices
- Microservices fintech patterns vs. monolith decision frameworks
- Developer experience optimization and velocity multiplication
- Integration patterns for processors, banks, and third-party services
- Testing, deployment, and operational strategies that scale

My Promise: By the end of this guide, you'll understand how to build fintech infrastructure that scales with your business, adapts to regulatory changes, and accelerates product development instead of constraining it.

The future belongs to teams that understand architecture is strategy. Let's build systems that enable rather than restrict innovation.


Understanding Composable Architecture

What Is Composable Fintech Architecture?

Composable fintech architecture is an approach to building financial systems using interchangeable, modular components that communicate through well-defined APIs. Instead of relying on a single, monolithic platform, you assemble best-of-breed components that can be independently developed, deployed, and replaced.

Think of it like building with specialized tools rather than buying a single multi-tool. Each component excels at its specific function—core banking, payment processing, compliance monitoring, or customer communications—while connecting seamlessly with other components through standardized interfaces.

The Composable Principle: Every component should be replaceable without requiring changes to other components. Your payment processor should be swappable without touching your core banking system. Your compliance monitoring should be upgradeable without affecting customer communications.
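The replaceability contract above can be sketched in code. This is a minimal illustration (the `PaymentProcessor` interface and both processor classes are hypothetical, not a real vendor API): the calling code depends only on the interface, so either implementation can be swapped in without changing it.

```python
from abc import ABC, abstractmethod

# Hypothetical payment-processor contract: any implementation satisfying
# this interface can be swapped in without touching the calling code.
class PaymentProcessor(ABC):
    @abstractmethod
    def charge(self, account_id: str, amount_cents: int, currency: str) -> str:
        """Charge the account; return a processor transaction reference."""

class ProcessorA(PaymentProcessor):
    def charge(self, account_id: str, amount_cents: int, currency: str) -> str:
        return f"procA:{account_id}:{amount_cents}:{currency}"

class ProcessorB(PaymentProcessor):
    def charge(self, account_id: str, amount_cents: int, currency: str) -> str:
        return f"procB:{account_id}:{amount_cents}:{currency}"

def checkout(processor: PaymentProcessor, account_id: str, amount_cents: int) -> str:
    # The caller depends only on the contract, never on a concrete vendor.
    return processor.charge(account_id, amount_cents, "USD")
```

Swapping `ProcessorA` for `ProcessorB` changes one constructor call at the edge of the system, which is the whole point of the composable principle.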

The Problems with Monolithic Banking Systems

Traditional banking systems and many BaaS middleware providers follow a monolithic architecture model that creates several critical constraints:

Single Point of Failure:
When one component experiences issues, the entire system is affected. A problem with payment processing can impact account management, customer communications, and reporting.

Vendor Lock-In:
All functionality is bundled together, making it expensive and complex to replace any single component. You're locked into their technology choices, upgrade timeline, and strategic direction.

Limited Customization:
Monolithic systems are designed for the average use case. Custom features require vendor development resources, which may not align with your priorities or timeline.

Scaling Constraints:
You can't scale individual components based on demand. If payment processing is your bottleneck, you must scale the entire system, not just the payment component.

Development Dependencies:
All features must be developed within the constraints of the existing system. New capabilities are limited by legacy architectural decisions made years earlier.

Core Principles of Composability

1. Modularity
Each component serves a specific, well-defined function. Components have clear boundaries and responsibilities, making them easier to understand, maintain, and replace.

2. Interoperability
Components communicate through standardized APIs and data formats. Any component can be replaced with an alternative that implements the same interface contracts.

3. Replaceability
Components can be swapped without affecting other parts of the system. This enables continuous optimization, vendor diversification, and technology evolution.

4. Scalability
Individual components can be scaled independently based on demand. Payment processing, compliance monitoring, and data analytics can each scale according to their specific performance requirements.

Why Composability Matters in Financial Services

Financial services have unique requirements that make composability particularly valuable:

Regulatory Agility:
Regulations change frequently and vary by geography. Composable systems allow you to update compliance components without rebuilding your entire platform.

Product Velocity:
Financial products evolve rapidly based on customer needs and competitive pressure. Composable architecture enables faster feature development and experimentation.

Risk Management:
Financial systems require exceptional reliability. Component isolation means failures are contained, and redundancy can be built at the component level.

Audit and Compliance:
Regulators need to understand your systems. Modular components with clear interfaces are easier to document, audit, and explain to regulatory bodies.

Technology Evolution:
Financial technology advances quickly. Composable systems let you adopt new capabilities—better fraud detection, improved analytics, enhanced security—without platform migrations.

The Shift from Integrated to Composable

The financial services industry is experiencing a fundamental shift from integrated platforms to composable architectures:

Traditional Integrated Model:
- Single vendor provides all functionality
- Deep integration creates dependencies
- Customization requires vendor cooperation
- Scaling means scaling everything
- Innovation happens at vendor's pace

Modern Composable Model:
- Best-of-breed components for each function
- Standardized APIs enable loose coupling
- Custom development happens independently
- Components scale based on individual demand
- Innovation happens at your pace

Real-World Architecture Comparison

Monolithic BaaS Architecture:

Customer Request → BaaS Platform → [Black Box with bundled: Core Banking + Payments + Compliance + Analytics] → Response

Problems: Single point of failure, vendor lock-in, limited visibility, constrained customization

Composable Architecture:

Customer Request → API Gateway → [Core Banking] ↔ [Payment Processing] ↔ [Compliance Engine] ↔ [Analytics Platform] → Response

Benefits: Component isolation, vendor choice, custom optimization, independent scaling

The Result:
Composable architectures typically achieve 3-5x faster feature development, 50-70% better system reliability, and 40-60% lower long-term costs compared to monolithic alternatives.


The Modular Banking Stack

Essential Components of Modern Fintech

A modular banking stack consists of specialized components that handle specific aspects of financial services. Each layer can be optimized independently while maintaining clean interfaces with other layers.

Core Banking Layer

The foundation of any financial system, responsible for the fundamental operations of money management:

Ledger Systems:
- Double-entry bookkeeping and transaction recording
- Real-time balance calculations and account reconciliation
- Historical transaction tracking and audit trails
- Multi-currency support and foreign exchange handling

Account Management:
- Customer account lifecycle (creation, maintenance, closure)
- Account type configuration (checking, savings, credit, etc.)
- Interest calculation and fee assessment
- Account hierarchy and relationship management

Balance Tracking:
- Real-time available balance calculations
- Pending transaction management
- Overdraft and credit limit enforcement
- Balance history and reconciliation processes

Transaction Posting:
- Atomic transaction processing with ACID compliance
- Transaction validation and authorization logic
- Batch and real-time posting capabilities
- Error handling and transaction reversal processes
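The atomicity requirement above can be illustrated with a toy in-memory ledger (a sketch, not a production design — a real system would use database transactions): validation happens before any balance is touched, so a transfer either applies both legs or neither.

```python
from dataclasses import dataclass, field

# Minimal in-memory ledger illustrating atomic double-entry posting:
# a transfer either debits and credits together or changes nothing.
@dataclass
class Ledger:
    balances: dict = field(default_factory=dict)  # account id -> cents

    def post_transfer(self, debit_acct: str, credit_acct: str, amount_cents: int) -> bool:
        if amount_cents <= 0:
            raise ValueError("amount must be positive")
        # Validate before mutating: if this check fails, no balance moves.
        if self.balances.get(debit_acct, 0) < amount_cents:
            return False
        self.balances[debit_acct] -= amount_cents
        self.balances[credit_acct] = self.balances.get(credit_acct, 0) + amount_cents
        return True
```

Note that the total across accounts is conserved by construction — the invariant a real double-entry ledger enforces with ACID transactions.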

Leading Core Banking Platforms:
- Thought Machine Vault: Cloud-native, API-first, highly configurable
- Mambu: SaaS-based, rapid deployment, strong developer experience
- Technisys: Full-stack platform with extensive customization options

Payment Processing Layer

Handles all forms of money movement, from card transactions to wire transfers:

Card Issuing and Processing:
- Virtual and physical card provisioning
- Authorization and settlement processing
- Spending controls and transaction limits
- Card lifecycle management (activation, blocking, replacement)

ACH/Wire Capabilities:
- Automated Clearing House transaction processing
- Wire transfer initiation and receipt
- Direct deposit and bill pay functionality
- Return processing and exception handling

Real-Time Payments:
- FedNow and RTP network connectivity
- Instant payment processing and confirmation
- Request for payment and payment messaging
- Fraud monitoring for real-time transactions

Settlement Management:
- Daily settlement processes and reconciliation
- Multi-processor settlement aggregation
- Exception handling and dispute management
- Regulatory reporting and compliance tracking

Identity & Compliance Layer

Ensures regulatory compliance and manages customer risk:

KYC/KYB Verification:
- Identity document verification and authentication
- Biometric matching and liveness detection
- Business verification and beneficial ownership identification
- Ongoing customer due diligence processes

AML Transaction Monitoring:
- Real-time transaction screening and scoring
- Pattern recognition and anomaly detection
- Alert generation and case management
- Suspicious Activity Report (SAR) preparation and filing

Sanctions Screening:
- OFAC and global sanctions list screening
- Real-time transaction interdiction
- Customer and counterparty screening
- Ongoing monitoring and list updates

Risk Scoring:
- Customer risk assessment and profiling
- Transaction risk scoring and decisioning
- Machine learning-based fraud detection
- Portfolio risk monitoring and reporting

Data & Analytics Layer

Transforms transaction data into business intelligence:

Transaction Databases:
- High-performance transaction storage and indexing
- Real-time and historical data access
- Data retention and archival strategies
- Backup and disaster recovery processes

Customer Analytics:
- Spending pattern analysis and categorization
- Customer segmentation and profiling
- Product usage analytics and insights
- Customer lifetime value calculation

Reporting Infrastructure:
- Regulatory reporting automation
- Management dashboard and KPI tracking
- Custom report generation and scheduling
- Data export and API access for third parties

Business Intelligence:
- Predictive analytics and machine learning models
- Customer behavior insights and recommendations
- Product performance analysis and optimization
- Market trend analysis and competitive intelligence

Customer Experience Layer

Manages all customer-facing interactions and communications:

Mobile and Web Applications:
- Native mobile apps and responsive web interfaces
- Account management and transaction history
- Payment initiation and transfer capabilities
- Security settings and notification preferences

Customer Communications:
- Email and SMS notification systems
- Statement generation and delivery
- Marketing communication and personalization
- Customer service and support tools

Support and Servicing Tools:
- Customer service representative dashboards
- Account inquiry and transaction research tools
- Dispute initiation and tracking systems
- Document management and customer record keeping

Component Selection Framework

Build vs. Buy Decision Matrix:

Component | Build When | Buy When
Core Banking | Unique business model requirements | Standard banking operations
Payment Processing | Custom authorization logic needed | Standard payment flows sufficient
Compliance | Specialized risk models required | Standard AML/KYC requirements
Analytics | Proprietary algorithms provide advantage | Standard reporting meets needs
Customer Apps | Differentiated UX is core value prop | Basic functionality is sufficient

Evaluation Criteria:
1. Strategic Importance: How critical is this component to your competitive advantage?
2. Complexity: How difficult would it be to build and maintain internally?
3. Regulatory Requirements: Are there specific compliance needs that require customization?
4. Integration Requirements: How tightly does this need to integrate with other components?
5. Total Cost of Ownership: What are the full costs of building vs. buying over 5 years?

The Modern Approach: Most successful fintechs buy commodity components (standard payment processing, basic compliance tools) and build differentiating components (unique customer experiences, proprietary risk models, specialized analytics).

Key Insight: The goal isn't to minimize building—it's to build the right things. Build what differentiates you competitively. Buy everything else.


API-First Architecture

What Is API-First Design?

API-First design means designing your Application Programming Interfaces before implementing the underlying systems. Every component, service, and integration is designed around clean, well-documented APIs that serve as contracts between different parts of your system.

In fintech API design, this approach is particularly critical because financial data and operations must be precise, auditable, and secure. APIs become the foundation for not just internal communication, but external integrations, regulatory reporting, and customer-facing applications.

API-First Benefits:
- Parallel Development: Frontend and backend teams can work simultaneously
- Integration Flexibility: Easy to integrate with third-party services and partners
- Testing and Quality: APIs can be tested independently and comprehensively
- Documentation: Forces clear specification of system behavior and contracts

RESTful API Best Practices for Fintech

Resource-Based URLs:

GET /accounts/{accountId}/transactions
POST /payments/transfers
GET /customers/{customerId}/statements
PUT /cards/{cardId}/controls

HTTP Status Codes for Financial Operations:
- 200 OK - Successful retrieval or update
- 201 Created - Successful account or transaction creation
- 400 Bad Request - Invalid request format or parameters
- 401 Unauthorized - Authentication required
- 403 Forbidden - Insufficient permissions for operation
- 404 Not Found - Account, transaction, or resource doesn't exist
- 409 Conflict - Duplicate transaction or resource conflict
- 422 Unprocessable Entity - Valid format but business rule violation
- 429 Too Many Requests - Rate limit exceeded
- 500 Internal Server Error - System error requiring investigation

Idempotency for Financial Operations:

POST /payments/transfers
{
  "idempotencyKey": "transfer_20260115_001",
  "fromAccount": "acc_123456",
  "toAccount": "acc_789012",
  "amount": {
    "value": "100.00",
    "currency": "USD"
  },
  "description": "Monthly rent payment"
}
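On the server side, the idempotency key works as a deduplication handle. Here is a minimal sketch (the `TransferService` class and its response shape are illustrative assumptions): the first request with a given key executes the transfer; any retry with the same key returns the stored result instead of moving money twice.

```python
# Sketch of server-side idempotency handling. In production the key/result
# store would be a durable database table, not an in-process dict.
class TransferService:
    def __init__(self):
        self._results = {}   # idempotency key -> previously returned response
        self._executed = 0   # counts real transfer executions

    def transfer(self, idempotency_key: str, from_acct: str,
                 to_acct: str, amount: str) -> dict:
        # Replay of a known key returns the original response verbatim.
        if idempotency_key in self._results:
            return self._results[idempotency_key]
        self._executed += 1
        result = {"transferId": f"tr_{self._executed}", "status": "completed"}
        self._results[idempotency_key] = result
        return result
```

This is why clients should generate the key once per logical operation and reuse it on retries — a fresh key on retry defeats the protection.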

Consistent Error Responses:

{
  "error": {
    "code": "INSUFFICIENT_FUNDS",
    "message": "Account balance insufficient for requested transfer",
    "details": {
      "accountId": "acc_123456",
      "requestedAmount": "100.00",
      "availableBalance": "75.50"
    }
  }
}

GraphQL vs. REST: When to Use Each

Use REST for:
- Standard CRUD operations (account management, transaction posting)
- Simple request/response patterns
- Operations that map well to HTTP verbs
- External API interfaces where REST is expected
- Caching scenarios where HTTP caching is beneficial

Use GraphQL for:
- Complex data fetching across multiple entities
- Customer-facing applications needing flexible data queries
- Internal admin dashboards with varying data requirements
- Real-time subscriptions for account updates or notifications
- Scenarios where over-fetching or under-fetching is a performance concern

Example GraphQL Query for Customer Dashboard:

query CustomerDashboard($customerId: ID!) {
  customer(id: $customerId) {
    name
    accounts {
      id
      type
      balance {
        available
        current
      }
      recentTransactions(limit: 5) {
        id
        amount
        description
        date
      }
    }
    cards {
      id
      last4
      status
      expirationDate
    }
  }
}

Webhook Architecture for Real-Time Events

Banking infrastructure APIs must support real-time notifications for critical events like transaction authorizations, fraud alerts, and account status changes.

Webhook Event Types:

{
  "eventType": "transaction.authorized",
  "timestamp": "2026-01-15T10:30:00Z",
  "data": {
    "transactionId": "txn_987654321",
    "accountId": "acc_123456",
    "amount": {
      "value": "25.99",
      "currency": "USD"
    },
    "merchant": {
      "name": "Coffee Shop",
      "category": "restaurant"
    },
    "authorizationCode": "AUTH123456"
  }
}

Webhook Reliability Patterns:
- Retry Logic: Exponential backoff with maximum retry attempts
- Signature Verification: HMAC signatures to verify webhook authenticity
- Idempotency: Event IDs to handle duplicate deliveries
- Ordering: Sequence numbers for events that must be processed in order
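The signature-verification pattern above is straightforward to sketch with HMAC-SHA256 (the signing scheme here is generic, not a specific vendor's header format):

```python
import hashlib
import hmac

# Sender computes an HMAC-SHA256 over the raw payload with a shared secret
# and sends the hex digest alongside the webhook.
def sign_payload(secret: bytes, payload: bytes) -> str:
    return hmac.new(secret, payload, hashlib.sha256).hexdigest()

# Receiver recomputes the signature and compares in constant time.
def verify_webhook(secret: bytes, payload: bytes, received_signature: str) -> bool:
    expected = sign_payload(secret, payload)
    # compare_digest avoids leaking match position via timing differences
    return hmac.compare_digest(expected, received_signature)
```

Verification must run against the raw request body bytes, before any JSON parsing — re-serialized JSON rarely matches the sender's byte stream.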

API Versioning Strategies

URL Versioning (Recommended for Financial APIs):

https://api.yourfintech.com/v1/accounts
https://api.yourfintech.com/v2/accounts

Header Versioning for Breaking Changes:

Accept: application/vnd.yourfintech.v1+json
Content-Type: application/vnd.yourfintech.v1+json

Versioning Strategy for Financial Services:
- v1.0 → v1.1: Additive changes only (new fields, new endpoints)
- v1.x → v2.0: Breaking changes (field removals, behavior changes)
- Deprecation Timeline: 12-month notice for breaking changes
- Backward Compatibility: Support 2-3 major versions simultaneously

Authentication and Authorization Patterns

OAuth 2.0 with PKCE for Customer Applications:

Authorization: Bearer eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9...

API Key Authentication for Server-to-Server:

Authorization: Bearer sk_live_51H3...
X-Request-ID: req_1234567890

Scope-Based Authorization:

{
  "scopes": [
    "accounts:read",
    "transactions:read",
    "transfers:write",
    "cards:manage"
  ]
}
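A scope check against tokens like the one above can be a few lines. This sketch assumes (my convention, not a standard) that a `manage` grant on a resource implies the read and write actions on it:

```python
# Sketch of scope-based authorization: an operation is allowed only when the
# token's granted scopes cover the scope the endpoint requires.
def is_authorized(granted_scopes: set, required_scope: str) -> bool:
    resource, _, action = required_scope.partition(":")
    # Assumption for this sketch: "manage" on a resource implies all actions.
    if f"{resource}:manage" in granted_scopes:
        return True
    return required_scope in granted_scopes
```

The gateway would evaluate this per endpoint, e.g. requiring `transfers:write` for `POST /payments/transfers`.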

Multi-Factor Authentication for Sensitive Operations:
- Step-up authentication for large transfers
- Device fingerprinting and geolocation checks
- Time-based one-time passwords (TOTP) for administrative operations

Rate Limiting and Throttling

Rate Limiting Strategy for Financial APIs:

X-RateLimit-Limit: 1000
X-RateLimit-Remaining: 999
X-RateLimit-Reset: 1642262400
X-RateLimit-Window: 3600

Tiered Rate Limits:
- Customer-facing APIs: 100 requests/minute per user
- Partner APIs: 1000 requests/minute per partner
- Internal APIs: 10,000 requests/minute per service
- Public APIs: 60 requests/minute per IP address
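A fixed-window limiter matching the `X-RateLimit-*` headers above can be sketched as follows (an in-process illustration — a real deployment would back this with Redis or the gateway's built-in limiter):

```python
import time

# Fixed-window rate limiter: each caller gets `limit` requests per window.
class RateLimiter:
    def __init__(self, limit: int, window_seconds: int):
        self.limit = limit
        self.window = window_seconds
        self._counts = {}  # (caller, window start) -> request count

    def allow(self, caller: str, now=None) -> bool:
        now = time.time() if now is None else now
        # Bucket requests by the start of the current window.
        window_start = int(now // self.window) * self.window
        key = (caller, window_start)
        count = self._counts.get(key, 0)
        if count >= self.limit:
            return False  # over quota: caller should receive 429
        self._counts[key] = count + 1
        return True
```

Per-tier limits from the list above would simply instantiate one limiter per tier (e.g. 100/minute for customer-facing callers, 1000/minute for partners).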

API Documentation and Developer Experience

Interactive API Documentation:
Using OpenAPI 3.0 specifications with tools like Swagger UI or Redoc to provide interactive documentation where developers can test API calls directly in their browser.

SDK Generation:
Automatically generated SDKs in multiple languages (Python, Node.js, Java, PHP) from OpenAPI specifications to reduce integration friction.

Comprehensive Examples:
Every API endpoint should include realistic examples with actual financial data structures, error scenarios, and edge cases.

Developer Onboarding:
- Sandbox environment with test data
- API key management dashboard
- Integration guides and tutorials
- Code samples in multiple languages
- Webhook testing tools


Microservices Architecture Patterns

Microservices vs. Monolith: The Decision Framework

The choice between microservices fintech architecture and monolithic design isn't binary—it's about finding the right balance for your current stage and strategic objectives.

When to Start with a Monolith:
- Early Stage: Team size < 10 engineers, MVP development, uncertain requirements
- Simple Domain: Single product, straightforward business logic, limited integrations
- Rapid Iteration: Need to move fast, requirements changing frequently
- Resource Constraints: Limited DevOps experience, infrastructure management overhead concerns

When Microservices Make Sense:
- Team Scale: Multiple engineering teams (>15 engineers total)
- Domain Complexity: Multiple products, complex business rules, extensive third-party integrations
- Performance Requirements: Different scaling needs for different components
- Organizational Structure: Teams aligned around business capabilities (Conway's Law)
- Regulatory Requirements: Need to isolate compliance-sensitive components

The Pragmatic Path: Start with a well-structured monolith, then extract microservices as specific needs emerge. This approach, sometimes called "monolith first," allows you to understand your domain boundaries before committing to service boundaries.

When Microservices Make Sense

Clear Indicators for Microservice Extraction:

Different Scaling Requirements:

Payment Processing: High transaction volume, CPU-intensive
Customer Communications: Batch processing, I/O-intensive
Fraud Detection: Real-time ML inference, memory-intensive
Reporting: Periodic heavy queries, storage-intensive

Team Ownership Boundaries:
- Core Banking Team → Account Service + Ledger Service
- Payments Team → Card Service + ACH Service + Wire Service
- Compliance Team → KYC Service + AML Service + Reporting Service
- Customer Experience Team → Notification Service + Support Service

Technology Stack Optimization:
- Real-time Services: Go or Java for high-performance transaction processing
- ML Services: Python for fraud detection and risk scoring
- Data Services: Scala with Apache Spark for analytics workloads
- Customer Services: Node.js for API-heavy customer-facing applications

Service Boundaries in Financial Systems

Domain-Driven Design for Financial Services:

Core Banking Bounded Context:
- Account Management
- Ledger Operations
- Balance Management
- Interest Calculation

Payments Bounded Context:
- Card Authorization
- Settlement Processing
- ACH Operations
- Wire Transfers

Compliance Bounded Context:
- KYC Verification
- AML Monitoring
- Sanctions Screening
- Regulatory Reporting

Customer Experience Bounded Context:
- User Authentication
- Notification Delivery
- Support Case Management
- Document Management

Inter-Service Communication Patterns

Synchronous Communication (REST/gRPC):
Use for real-time operations where immediate consistency is required:

// Account Service → Compliance Service
ComplianceCheck result = complianceService.checkTransaction(
    customerId, 
    transactionAmount, 
    merchantInfo
);

if (result.isApproved()) {
    processTransaction();
} else {
    rejectTransaction(result.getReason());
}

Asynchronous Communication (Event-Driven):
Use for eventual consistency scenarios and system decoupling:

{
  "eventType": "account.created",
  "timestamp": "2026-01-15T10:30:00Z",
  "data": {
    "accountId": "acc_123456",
    "customerId": "cust_789012",
    "accountType": "checking",
    "initialDeposit": "1000.00"
  }
}

Service Mesh Architecture:
Using tools like Istio or Linkerd for service-to-service communication:
- Traffic Management: Load balancing, circuit breaking, retries
- Security: mTLS encryption, service authentication
- Observability: Distributed tracing, metrics collection
- Policy Enforcement: Rate limiting, access control

Data Consistency in Distributed Systems

Eventual Consistency Patterns:

Saga Pattern for Distributed Transactions:

Transfer Money Saga:
1. Debit Source Account → Success
2. Credit Destination Account → Success  
3. Send Confirmation Notification → Success
4. Update Analytics Database → Success

If any step fails: Execute compensating transactions in reverse order

Event Sourcing for Audit Trails:
Instead of storing current state, store all events that led to that state:

{
  "accountId": "acc_123456",
  "events": [
    {
      "eventType": "AccountOpened",
      "timestamp": "2026-01-01T09:00:00Z",
      "data": { "initialBalance": "0.00" }
    },
    {
      "eventType": "DepositReceived", 
      "timestamp": "2026-01-02T14:30:00Z",
      "data": { "amount": "1000.00", "source": "wire_transfer" }
    },
    {
      "eventType": "WithdrawalProcessed",
      "timestamp": "2026-01-03T11:15:00Z", 
      "data": { "amount": "50.00", "destination": "atm_withdrawal" }
    }
  ]
}
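Rebuilding state from a stream like this is a fold over the events. A minimal replay function using the event shapes from the example above (the three event types are the ones shown; anything else would need its own handler):

```python
from decimal import Decimal

# Replay an account's event stream to reconstruct its current balance.
# Decimal avoids binary floating-point drift on monetary amounts.
def replay_balance(events: list) -> Decimal:
    balance = Decimal("0.00")
    for event in events:
        kind = event["eventType"]
        data = event["data"]
        if kind == "AccountOpened":
            balance = Decimal(data.get("initialBalance", "0.00"))
        elif kind == "DepositReceived":
            balance += Decimal(data["amount"])
        elif kind == "WithdrawalProcessed":
            balance -= Decimal(data["amount"])
    return balance
```

Replaying only events with a timestamp before a cutoff gives the temporal queries mentioned later ("what was the balance on December 31st?") essentially for free.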

Transaction Management Across Services

Two-Phase Commit (2PC) - Use Sparingly:
Only for critical operations where strong consistency is absolutely required:

Phase 1 - Prepare:
- Account Service: "Can debit $100?" → Yes, funds reserved
- Ledger Service: "Can record transaction?" → Yes, prepared
- Notification Service: "Can send confirmation?" → Yes, prepared

Phase 2 - Commit:
- Account Service: "Commit debit" → Success
- Ledger Service: "Commit record" → Success  
- Notification Service: "Send notification" → Success

Compensation-Based Transactions (Preferred):

Happy Path:
1. CreateTransfer → Success
2. DebitAccount → Success
3. CreditAccount → Success
4. SendNotification → Success

Failure Scenario:
1. CreateTransfer → Success
2. DebitAccount → Success
3. CreditAccount → Failure
4. CompensateDebit → Success (reverses step 2)
5. CancelTransfer → Success (reverses step 1)
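The compensation flow above reduces to a simple orchestration loop: each completed step records its undo action, and on failure the recorded compensations run in reverse order. A sketch (step functions are supplied by the caller; error handling is deliberately coarse):

```python
# Compensation-based saga orchestrator.
# steps: list of (action, compensation) pairs of zero-argument callables.
def run_saga(steps) -> bool:
    completed = []  # compensations for steps that succeeded, in order
    for action, compensation in steps:
        try:
            action()
            completed.append(compensation)
        except Exception:
            # Undo every completed step, most recent first.
            for undo in reversed(completed):
                undo()
            return False
    return True
```

In the transfer example, a failed `CreditAccount` step triggers `CompensateDebit` and then `CancelTransfer`, exactly the reverse of the order in which the forward steps ran.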

Event-Driven Architecture

Event Streaming with Apache Kafka:

Topics:
- account-events: Account lifecycle events
- transaction-events: Payment and transfer events
- compliance-events: KYC, AML alerts and updates  
- customer-events: User interactions and preferences

Event Schema Evolution:

{
  "schema": "transaction.v2",
  "compatibility": "backward",
  "data": {
    "transactionId": "txn_123",
    "amount": "100.00",
    "currency": "USD",
    "merchantInfo": {
      "name": "Coffee Shop",
      "category": "restaurant",
      "location": "New York, NY"
    },
    "enrichedData": {
      "customerSegment": "premium",
      "riskScore": 0.1
    }
  }
}

Service Mesh and API Gateway Patterns

API Gateway Responsibilities:
- Authentication & Authorization: Centralized security enforcement
- Rate Limiting: Request throttling and quota management
- Request/Response Transformation: Protocol translation and data formatting
- Monitoring & Analytics: Request logging and performance metrics
- Circuit Breaking: Fault tolerance and fallback handling

Service Mesh Benefits:
- Zero-Trust Security: Every service-to-service call is encrypted and authenticated
- Observability: Automatic metrics, logging, and tracing for all communications
- Traffic Management: Load balancing, canary deployments, A/B testing
- Resilience: Automatic retries, timeouts, and circuit breaking

Example Service Mesh Configuration:

apiVersion: networking.istio.io/v1alpha3
kind: VirtualService
metadata:
  name: payment-service
spec:
  http:
  - match:
    - headers:
        canary:
          exact: "true"
    route:
    - destination:
        host: payment-service
        subset: v2
      weight: 100
  - route:
    - destination:
        host: payment-service  
        subset: v1
      weight: 100

Database Architecture and Data Management

Database Selection for Different Use Cases

The Polyglot Persistence Approach: Different data patterns require different database technologies. A modular banking stack should use the right database for each specific use case rather than forcing all data into a single database type.

PostgreSQL for Core Banking:
- ACID compliance essential for financial transactions
- JSON/JSONB support for flexible document storage within relational structure
- Excellent performance for complex queries and reporting
- Rich ecosystem of extensions and tools
- Battle-tested in high-stakes financial environments

Redis for Real-Time Data:
- Session management for customer applications
- Real-time balance caching for high-frequency lookups
- Rate limiting data for API throttling
- Fraud detection feature storage for ML models

Apache Cassandra for Time-Series Data:
- Transaction history with massive scale and fast writes
- Audit logs with high availability and partition tolerance
- Customer activity tracking across multiple dimensions
- IoT data from payment terminals and ATMs

Elasticsearch for Search and Analytics:
- Transaction search across accounts, merchants, and time ranges
- Compliance reporting with complex aggregations
- Customer support tools for account research
- Business intelligence dashboards and analytics

SQL vs. NoSQL in Financial Services

When to Use SQL (PostgreSQL, MySQL):
- Core banking operations requiring ACID transactions
- Regulatory reporting needing complex joins and aggregations
- Financial calculations where data consistency is critical
- Audit trails requiring strict referential integrity

When to Use NoSQL:

Document Stores (MongoDB) for:
- Customer profiles with varying data structures
- Product configurations that evolve frequently
- Application logs with flexible schema requirements

Key-Value Stores (Redis) for:
- Session data requiring fast access
- Configuration data that changes infrequently
- Caching layers to reduce database load

Wide Column (Cassandra) for:
- Time-series data with massive scale requirements
- Event logging where write performance is critical
- Data that partitions naturally by customer or time

Graph Databases (Neo4j) for:
- Fraud detection analyzing transaction patterns
- Risk assessment examining customer relationships
- Compliance monitoring tracking beneficial ownership

Data Modeling for Financial Transactions

Double-Entry Bookkeeping Schema:

CREATE TABLE accounts (
    id UUID PRIMARY KEY,
    account_number VARCHAR(20) UNIQUE NOT NULL,
    account_type VARCHAR(20) NOT NULL,
    balance_cents BIGINT NOT NULL DEFAULT 0,
    currency CHAR(3) NOT NULL DEFAULT 'USD',
    created_at TIMESTAMP NOT NULL,
    updated_at TIMESTAMP NOT NULL
);

CREATE TABLE transactions (
    id UUID PRIMARY KEY,
    transaction_id VARCHAR(50) UNIQUE NOT NULL,
    debit_account_id UUID REFERENCES accounts(id),
    credit_account_id UUID REFERENCES accounts(id),
    amount_cents BIGINT NOT NULL,
    currency CHAR(3) NOT NULL,
    description TEXT,
    transaction_date TIMESTAMP NOT NULL,
    posted_date TIMESTAMP,
    status VARCHAR(20) NOT NULL,
    created_at TIMESTAMP NOT NULL
);

CREATE TABLE ledger_entries (
    id UUID PRIMARY KEY,
    transaction_id UUID REFERENCES transactions(id),
    account_id UUID REFERENCES accounts(id),
    entry_type VARCHAR(10) NOT NULL CHECK (entry_type IN ('debit', 'credit')),
    amount_cents BIGINT NOT NULL,
    balance_after_cents BIGINT NOT NULL,
    created_at TIMESTAMP NOT NULL
);

Money Representation Best Practices:
- Store as integers: Use cents (or smallest currency unit) to avoid floating-point precision issues
- Currency codes: Always store ISO 4217 currency codes with amounts
- Decimal precision: For display, use libraries that handle decimal arithmetic correctly
- Exchange rates: Store historical rates with effective date ranges
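The integer-cents rule above can be sketched in a few lines of JavaScript. `toCents` and `formatCents` are illustrative helper names, not any library's API; the point is that parsing goes straight from the decimal string to an integer, never through floating point:

```javascript
// Parse a decimal amount string into integer cents without passing
// through floating point. BigInt keeps large balances exact.
function toCents(amount) {
  const match = /^(-?)(\d+)(?:\.(\d{1,2}))?$/.exec(String(amount).trim());
  if (!match) throw new Error(`Invalid amount: ${amount}`);
  const [, sign, dollars, fraction = ''] = match;
  const cents = BigInt(dollars) * 100n + BigInt(fraction.padEnd(2, '0'));
  return sign === '-' ? -cents : cents;
}

// Format integer cents for display, using the stored ISO 4217 code.
// Converting to Number here is safe because it is display-only.
function formatCents(cents, currency = 'USD') {
  return new Intl.NumberFormat('en-US', { style: 'currency', currency })
    .format(Number(cents) / 100);
}

toCents('1000.00'); // 100000n
toCents('19.9');    // 1990n  ('.9' pads to '.90')
```

The regex rejects anything that is not a plain two-decimal amount, which is usually preferable to silently truncating extra precision.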

Event Sourcing and CQRS Patterns

Event Sourcing for Financial Systems:
Instead of storing current account balances, store all events that affect the balance:

{
  "streamId": "account-acc_123456",
  "events": [
    {
      "eventId": "evt_001",
      "eventType": "AccountOpened",
      "timestamp": "2026-01-01T09:00:00Z",
      "version": 1,
      "data": {
        "accountId": "acc_123456",
        "customerId": "cust_789012",
        "accountType": "checking",
        "currency": "USD"
      }
    },
    {
      "eventId": "evt_002", 
      "eventType": "FundsDeposited",
      "timestamp": "2026-01-02T14:30:00Z",
      "version": 2,
      "data": {
        "amount": "1000.00",
        "source": "wire_transfer",
        "referenceNumber": "WIR_456789"
      }
    }
  ]
}

CQRS (Command Query Responsibility Segregation):
Separate read and write models for optimal performance:

Write Side (Commands):
DepositFunds → Event Store → Account Events

Read Side (Queries):  
Event Store → Projection Engine → Account Balance View
              ↓
Event Store → Projection Engine → Transaction History View
              ↓
Event Store → Projection Engine → Monthly Statement View

Benefits for Financial Services:
- Perfect audit trail: Every change is recorded permanently
- Temporal queries: "What was the balance on December 31st?"
- Regulatory compliance: Immutable record of all transactions
- System recovery: Rebuild state from events after system failures
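All four benefits fall out of replay: a balance, current or historical, is just a fold over the event stream. A minimal sketch, using simplified versions of the events shown above; the `FundsWithdrawn` event type and the integer-cent `amountCents` field are illustrative assumptions:

```javascript
// Rebuild an account balance by replaying events up to a cutoff time.
// Passing an earlier asOf answers temporal queries like
// "what was the balance on December 31st?"
function balanceAt(events, asOf) {
  return events
    .filter(e => new Date(e.timestamp) <= new Date(asOf))
    .reduce((balance, e) => {
      switch (e.eventType) {
        case 'AccountOpened':  return 0;
        case 'FundsDeposited': return balance + e.data.amountCents;
        case 'FundsWithdrawn': return balance - e.data.amountCents;
        default:               return balance; // ignore unrelated events
      }
    }, 0);
}

const events = [
  { eventType: 'AccountOpened',  timestamp: '2026-01-01T09:00:00Z', data: {} },
  { eventType: 'FundsDeposited', timestamp: '2026-01-02T14:30:00Z', data: { amountCents: 100000 } },
  { eventType: 'FundsWithdrawn', timestamp: '2026-01-05T10:00:00Z', data: { amountCents: 2500 } }
];

balanceAt(events, '2026-01-03T00:00:00Z'); // 100000 (temporal query)
balanceAt(events, '2026-01-31T00:00:00Z'); // 97500  (current balance)
```

In a real projection engine the same fold runs incrementally as events arrive, rather than over the full stream on every query.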

Data Replication and Consistency

Master-Replica Configuration:

Primary Database (Write Operations):
- All transaction writes
- Account updates
- Balance modifications

Read Replicas (Query Operations):  
- Customer statement generation
- Reporting and analytics
- Customer service inquiries
- Mobile app balance lookups

Cross-Region Replication:

Primary Region (US-East):
- Master database for all writes
- Real-time transaction processing
- Primary customer applications

Secondary Region (US-West):
- Read replica with <100ms lag
- Disaster recovery standby
- Analytics and reporting workloads

Backup and Recovery Strategies

Recovery Time Objective (RTO) and Recovery Point Objective (RPO):

Tier 1 - Critical Systems (Core Banking):
- RTO: 15 minutes maximum downtime
- RPO: 0 data loss (synchronous replication)
- Strategy: Hot standby with automatic failover

Tier 2 - Important Systems (Customer Apps):
- RTO: 1 hour maximum downtime
- RPO: 5 minutes maximum data loss
- Strategy: Warm standby with manual failover

Tier 3 - Supporting Systems (Reporting):
- RTO: 4 hours maximum downtime
- RPO: 1 hour maximum data loss
- Strategy: Cold backup with restoration process

Backup Verification:

#!/bin/bash
# Daily backup verification script

# Test backup integrity (pg_restore reads custom-format dumps, not plain SQL)
if pg_restore --list backup_file.dump > /dev/null; then
    echo "✅ Backup integrity verified"
else
    echo "❌ Backup corruption detected"
    exit 1
fi

# Test restore to isolated environment
pg_restore --dbname=test_restore backup_file.dump
# Run smoke tests on restored data
# Verify transaction totals and account balances

Data Privacy and Encryption

Encryption at Rest:
- Database encryption: AES-256 encryption for all stored data
- Key management: AWS KMS or HashiCorp Vault for key rotation
- Column-level encryption: Additional encryption for PII and PAN data

Encryption in Transit:
- TLS 1.3 for all database connections
- Certificate pinning for additional security
- VPN tunnels for cross-region replication

Data Masking for Development:

-- Production data masking for development environments
-- (run against the development copy of the database, never production)
UPDATE customers SET 
    ssn = 'XXX-XX-' || RIGHT(ssn, 4),
    email = 'test+' || id || '@example.com',
    phone = '555-0100';

Performance Optimization Techniques

Indexing Strategy:

-- Core performance indexes
CREATE INDEX idx_ledger_entries_account_created 
ON ledger_entries(account_id, created_at DESC);

CREATE INDEX idx_transactions_status_created 
ON transactions(status, created_at) WHERE status = 'pending';

CREATE INDEX idx_customers_kyc_status 
ON customers(kyc_status, created_at) WHERE kyc_status != 'approved';

Query Optimization:

-- Efficient balance calculation
SELECT 
    account_id,
    SUM(CASE WHEN entry_type = 'credit' THEN amount_cents ELSE -amount_cents END) as balance_cents
FROM ledger_entries 
WHERE account_id = $1 
    AND created_at <= $2
GROUP BY account_id;

-- Optimized with materialized view for current balances,
-- precomputing the aggregation above
CREATE MATERIALIZED VIEW account_current_balances AS
SELECT 
    account_id,
    SUM(CASE WHEN entry_type = 'credit' THEN amount_cents ELSE -amount_cents END) AS balance_cents,
    MAX(created_at) AS last_updated
FROM ledger_entries
GROUP BY account_id;

-- Unique index required by REFRESH ... CONCURRENTLY
CREATE UNIQUE INDEX idx_account_current_balances_account
ON account_current_balances(account_id);

-- Refresh strategy
REFRESH MATERIALIZED VIEW CONCURRENTLY account_current_balances;

Connection Pooling:

// Node.js connection pooling configuration
const pool = new Pool({
  host: process.env.DB_HOST,
  database: process.env.DB_NAME,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  port: 5432,
  max: 20, // Maximum connections in pool
  idleTimeoutMillis: 30000,
  connectionTimeoutMillis: 2000,
  ssl: {
    rejectUnauthorized: true // never disable certificate validation in production
  }
});

Integration Patterns

Processor Integration Architecture

Modern fintech developer tools must handle complex integrations with payment processors, each with unique APIs, data formats, and operational requirements. A well-designed integration layer abstracts these differences while maintaining access to processor-specific features.

Processor Abstraction Layer:

// Common interface for all processors
class PaymentProcessor {
  async authorizePayment(paymentRequest) {
    throw new Error('Must implement authorizePayment');
  }

  async capturePayment(authorizationId, amount) {
    throw new Error('Must implement capturePayment');
  }

  async refundPayment(transactionId, amount) {
    throw new Error('Must implement refundPayment');
  }
}

// Processor-specific implementations
class GalileoProcessor extends PaymentProcessor {
  async authorizePayment(paymentRequest) {
    const response = await this.galileoClient.post('/authorize', {
      accountId: paymentRequest.accountId,
      amount: paymentRequest.amount,
      merchantInfo: paymentRequest.merchant
    });

    return this.normalizeResponse(response);
  }
}

class MarqetaProcessor extends PaymentProcessor {
  async authorizePayment(paymentRequest) {
    const response = await this.marqetaClient.post('/transactions/authorization', {
      account_token: paymentRequest.accountId,
      amount: Math.round(paymentRequest.amount * 100), // Marqeta expects integer cents
      merchant: paymentRequest.merchant
    });

    return this.normalizeResponse(response);
  }
}

Configuration-Driven Routing:

{
  "processorRouting": {
    "default": "galileo",
    "rules": [
      {
        "condition": "amount > 10000",
        "processor": "galileo",
        "reason": "High-value transaction routing"
      },
      {
        "condition": "merchant.category == 'atm'",
        "processor": "marqeta", 
        "reason": "ATM transactions via Marqeta"
      },
      {
        "condition": "customer.segment == 'premium'",
        "processor": "galileo",
        "reason": "Premium customer preferred routing"
      }
    ]
  }
}
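The `condition` strings in the configuration above are declarative; rather than evaluating them with `eval`, a safer sketch registers each rule as a predicate function. The rule set and the transaction shape here are illustrative assumptions mirroring that config:

```javascript
// Minimal first-match-wins router for the configuration above.
// Each rule pairs a predicate with a target processor.
const routingRules = [
  { test: tx => tx.amount > 10000,                 processor: 'galileo' }, // high-value
  { test: tx => tx.merchant.category === 'atm',    processor: 'marqeta' }, // ATM via Marqeta
  { test: tx => tx.customer.segment === 'premium', processor: 'galileo' }  // premium routing
];

function routeTransaction(tx, rules = routingRules, fallback = 'galileo') {
  const match = rules.find(rule => rule.test(tx));
  return match ? match.processor : fallback; // fall back to the default processor
}

routeTransaction({ amount: 20, merchant: { category: 'atm' }, customer: {} });     // 'marqeta'
routeTransaction({ amount: 20, merchant: { category: 'grocery' }, customer: {} }); // 'galileo'
```

Keeping the rules as data means routing changes ship as configuration rather than code, which is the point of configuration-driven routing.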

Failover and Circuit Breaking:

class ProcessorCircuitBreaker {
  constructor(processor, options = {}) {
    this.processor = processor;
    this.failureCount = 0;
    this.lastFailureTime = null;
    this.state = 'CLOSED'; // CLOSED, OPEN, HALF_OPEN
    this.failureThreshold = options.failureThreshold || 5;
    this.timeout = options.timeout || 60000; // 1 minute
  }

  async execute(operation, ...args) {
    if (this.state === 'OPEN') {
      if (Date.now() - this.lastFailureTime > this.timeout) {
        this.state = 'HALF_OPEN';
      } else {
        throw new Error('Circuit breaker is OPEN');
      }
    }

    try {
      const result = await this.processor[operation](...args);
      this.onSuccess();
      return result;
    } catch (error) {
      this.onFailure();
      throw error;
    }
  }

  onSuccess() {
    this.failureCount = 0;
    this.state = 'CLOSED';
  }

  onFailure() {
    this.failureCount++;
    this.lastFailureTime = Date.now();

    if (this.failureCount >= this.failureThreshold) {
      this.state = 'OPEN';
    }
  }
}

Bank System Connectivity

Core Banking Integration Patterns:

API-First Integration (Modern Banks):

class BankAPIClient {
  constructor(config) {
    this.baseURL = config.baseURL;
    this.credentials = config.credentials;
    this.timeout = config.timeout || 30000;
  }

  async createAccount(customerData) {
    const response = await this.post('/accounts', {
      customer: {
        firstName: customerData.firstName,
        lastName: customerData.lastName,
        ssn: customerData.ssn,
        dateOfBirth: customerData.dateOfBirth
      },
      accountType: 'DDA',
      initialDeposit: customerData.initialDeposit
    });

    return {
      accountId: response.accountNumber,
      routingNumber: response.routingNumber,
      status: this.mapAccountStatus(response.status)
    };
  }

  async initiateWireTransfer(transferData) {
    const response = await this.post('/wire-transfers', {
      fromAccount: transferData.fromAccount,
      beneficiary: {
        name: transferData.beneficiaryName,
        accountNumber: transferData.beneficiaryAccount,
        routingNumber: transferData.beneficiaryRouting,
        bankName: transferData.beneficiaryBank
      },
      amount: transferData.amount,
      purpose: transferData.purpose,
      reference: transferData.reference
    });

    return {
      wireTransferId: response.transferId,
      status: response.status,
      estimatedCompletion: response.valueDate
    };
  }
}

File-Based Integration (Legacy Banks):

class BankFileProcessor {
  async processACHFile(achFile) {
    // Parse NACHA format file
    const records = this.parseNACHAFile(achFile);

    // Validate file structure and totals
    this.validateFileIntegrity(records);

    // Process each transaction
    const results = [];
    for (const record of records) {
      try {
        const result = await this.processACHRecord(record);
        results.push({ record, status: 'success', result });
      } catch (error) {
        results.push({ record, status: 'error', error: error.message });
      }
    }

    // Generate return file for rejected transactions
    const returnFile = this.generateReturnFile(
      results.filter(r => r.status === 'error')
    );

    return { processed: results, returnFile };
  }

  parseNACHAFile(fileContent) {
    const lines = fileContent.split('\n');
    const records = [];

    for (const line of lines) {
      const recordType = line.substring(0, 1);

      switch (recordType) {
        case '1': // File Header
          records.push(this.parseFileHeader(line));
          break;
        case '5': // Batch Header
          records.push(this.parseBatchHeader(line));
          break;
        case '6': // Entry Detail
          records.push(this.parseEntryDetail(line));
          break;
        case '8': // Batch Trailer
          records.push(this.parseBatchTrailer(line));
          break;
        case '9': // File Trailer
          records.push(this.parseFileTrailer(line));
          break;
      }
    }

    return records;
  }
}

Third-Party API Integration

Standardized Integration Framework:

class ThirdPartyIntegration {
  constructor(config) {
    this.config = config;
    this.rateLimiter = new RateLimiter(config.rateLimit);
    this.cache = new Cache(config.cache);
    this.metrics = new MetricsCollector();
  }

  async makeRequest(endpoint, data, options = {}) {
    const startTime = Date.now();

    try {
      // Rate limiting
      await this.rateLimiter.acquire();

      // Check cache first
      const cacheKey = this.generateCacheKey(endpoint, data);
      const cached = await this.cache.get(cacheKey);
      if (cached && !options.skipCache) {
        this.metrics.recordCacheHit(endpoint);
        return cached;
      }

      // Make API request
      const response = await this.httpClient.request({
        url: `${this.config.baseURL}${endpoint}`,
        method: options.method || 'POST',
        data: data,
        headers: this.buildHeaders(options),
        timeout: options.timeout || this.config.timeout
      });

      // Cache successful responses
      if (response.status === 200 && options.cacheable) {
        await this.cache.set(cacheKey, response.data, options.cacheTime);
      }

      this.metrics.recordSuccess(endpoint, Date.now() - startTime);
      return response.data;

    } catch (error) {
      this.metrics.recordError(endpoint, error, Date.now() - startTime);

      // Retry logic for specific errors
      if (this.shouldRetry(error) && (options.retryCount || 0) < 3) {
        await this.sleep(this.getRetryDelay(options.retryCount || 0));
        return this.makeRequest(endpoint, data, {
          ...options,
          retryCount: (options.retryCount || 0) + 1
        });
      }

      throw error;
    }
  }
}

Webhook Management and Retry Logic

Webhook Delivery System:

class WebhookDeliveryService {
  constructor(config) {
    this.config = config;
    this.queue = new Queue('webhook-delivery');
    this.setupJobProcessing();
  }

  async deliverWebhook(webhook) {
    const job = {
      id: webhook.id,
      url: webhook.url,
      payload: webhook.payload,
      headers: webhook.headers,
      maxRetries: webhook.maxRetries || 5,
      retryDelay: webhook.retryDelay || 1000
    };

    await this.queue.add(job, {
      attempts: job.maxRetries,
      backoff: {
        type: 'exponential',
        delay: job.retryDelay
      }
    });
  }

  setupJobProcessing() {
    this.queue.process(async (job) => {
      const { url, payload, headers } = job.data;

      try {
        const response = await fetch(url, {
          method: 'POST',
          headers: {
            'Content-Type': 'application/json',
            'User-Agent': 'Chisel-Webhooks/1.0',
            'X-Webhook-Signature': this.generateSignature(payload),
            ...headers
          },
          body: JSON.stringify(payload),
          timeout: 10000
        });

        if (!response.ok) {
          throw new Error(`HTTP ${response.status}: ${response.statusText}`);
        }

        await this.logWebhookDelivery(job.data, 'success');
        return { status: 'delivered' };

      } catch (error) {
        await this.logWebhookDelivery(job.data, 'failed', error.message);
        throw error;
      }
    });
  }

  generateSignature(payload) {
    const hmac = crypto.createHmac('sha256', this.config.webhookSecret);
    hmac.update(JSON.stringify(payload));
    return `sha256=${hmac.digest('hex')}`;
  }
}

Batch Processing Patterns

Large File Processing:

class BatchProcessor {
  async processBatchFile(filePath, processingFunction) {
    const fileSize = await fs.stat(filePath).then(stats => stats.size);
    const chunkSize = 1024 * 1024; // 1MB chunks

    let offset = 0;
    let lineBuffer = '';
    const results = [];

    while (offset < fileSize) {
      const chunk = await this.readFileChunk(filePath, offset, chunkSize);
      const lines = (lineBuffer + chunk).split('\n');

      // Keep incomplete line for next chunk
      lineBuffer = lines.pop();

      // Process complete lines
      for (const line of lines) {
        if (line.trim()) {
          try {
            const result = await processingFunction(line);
            results.push({ line, status: 'success', result });
          } catch (error) {
            results.push({ line, status: 'error', error: error.message });
          }
        }
      }

      offset += chunkSize;
    }

    // Process final line if exists
    if (lineBuffer.trim()) {
      try {
        const result = await processingFunction(lineBuffer);
        results.push({ line: lineBuffer, status: 'success', result });
      } catch (error) {
        results.push({ line: lineBuffer, status: 'error', error: error.message });
      }
    }

    return results;
  }

  async readFileChunk(filePath, offset, length) {
    const buffer = Buffer.alloc(length);
    const fd = await fs.open(filePath, 'r');
    const { bytesRead } = await fd.read(buffer, 0, length, offset);
    await fd.close();
    return buffer.toString('utf8', 0, bytesRead);
  }
}

Real-Time vs. Asynchronous Processing

Real-Time Processing (Card Authorizations):

class RealTimeTransactionProcessor {
  async processAuthorization(authRequest) {
    const startTime = Date.now();
    const timeout = 2000; // 2 second timeout for card auth

    try {
      // Parallel processing for speed
      const [
        accountValidation,
        fraudCheck,
        balanceCheck
      ] = await Promise.all([
        this.validateAccount(authRequest.accountId),
        this.checkForFraud(authRequest),
        this.checkBalance(authRequest.accountId, authRequest.amount)
      ]);

      if (!accountValidation.valid) {
        return this.declineAuthorization('INVALID_ACCOUNT', authRequest);
      }

      if (fraudCheck.riskScore > 0.8) {
        return this.declineAuthorization('FRAUD_SUSPECTED', authRequest);
      }

      if (!balanceCheck.sufficient) {
        return this.declineAuthorization('INSUFFICIENT_FUNDS', authRequest);
      }

      // All checks passed - approve and reserve funds
      await this.reserveFunds(authRequest.accountId, authRequest.amount);

      const processingTime = Date.now() - startTime;

      return {
        approved: true,
        authorizationCode: this.generateAuthCode(),
        availableBalance: balanceCheck.availableBalance - authRequest.amount,
        processingTime
      };

    } catch (error) {
      if (Date.now() - startTime > timeout) {
        return this.declineAuthorization('TIMEOUT', authRequest);
      }

      return this.declineAuthorization('SYSTEM_ERROR', authRequest);
    }
  }
}

Asynchronous Processing (Statement Generation):

class AsynchronousStatementProcessor {
  async generateMonthlyStatements(month, year) {
    const customers = await this.getActiveCustomers();

    // Process in batches to avoid overwhelming the system
    const batchSize = 100;
    const batches = this.createBatches(customers, batchSize);

    for (const batch of batches) {
      const jobs = batch.map(customer => ({
        customerId: customer.id,
        month,
        year,
        deliveryPreference: customer.statementDeliveryPreference
      }));

      // Add jobs to queue for background processing
      await this.statementQueue.addBulk(jobs, {
        priority: 5, // Medium priority
        delay: 0,
        attempts: 3,
        backoff: 'exponential'
      });
    }

    return {
      totalCustomers: customers.length,
      batchCount: batches.length,
      estimatedCompletion: this.estimateCompletionTime(customers.length)
    };
  }

  async processStatementJob(job) {
    const { customerId, month, year } = job.data;

    try {
      // Generate statement
      const statement = await this.generateStatement(customerId, month, year);

      // Deliver based on preference
      if (statement.deliveryPreference === 'email') {
        await this.emailStatement(statement);
      } else {
        await this.mailPrintStatement(statement);
      }

      await this.recordStatementDelivery(customerId, month, year);

    } catch (error) {
      await this.recordStatementError(customerId, month, year, error);
      throw error;
    }
  }
}

Error Handling and Resilience Patterns

Comprehensive Error Handling:

class ResilientIntegrationClient {
  constructor(config) {
    this.config = config;
    this.circuitBreaker = new CircuitBreaker(config.circuitBreaker);
    this.retryPolicy = new RetryPolicy(config.retry);
    this.fallbackHandler = new FallbackHandler(config.fallback);
  }

  async execute(operation, data) {
    try {
      return await this.circuitBreaker.execute(async () => {
        return await this.retryPolicy.execute(async () => {
          return await this.performOperation(operation, data);
        });
      });
    } catch (error) {
      // Circuit breaker is open or retries exhausted
      return await this.fallbackHandler.handle(operation, data, error);
    }
  }

  async performOperation(operation, data) {
    const startTime = Date.now();

    try {
      const result = await this.apiClient[operation](data);

      this.metrics.recordSuccess(operation, Date.now() - startTime);
      return result;

    } catch (error) {
      this.metrics.recordError(operation, error, Date.now() - startTime);

      // Categorize error for appropriate handling
      if (this.isRetryableError(error)) {
        throw new RetryableError(error.message, error);
      } else if (this.isCircuitBreakerError(error)) {
        throw new CircuitBreakerError(error.message, error);
      } else {
        throw new NonRetryableError(error.message, error);
      }
    }
  }

  isRetryableError(error) {
    const retryableCodes = [408, 429, 500, 502, 503, 504];
    const retryableNetworkErrors = ['ECONNRESET', 'ETIMEDOUT', 'ENOTFOUND'];

    return retryableCodes.includes(error.status) ||
           retryableNetworkErrors.includes(error.code);
  }
}
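The RetryPolicy the client above delegates to is not shown; one way to sketch it is exponential backoff with full jitter, keeping the delay schedule as a pure function so it can be tested deterministically. The class name matches the usage above, but the defaults and internals here are assumptions:

```javascript
// Sketch of a retry policy: exponential backoff capped at maxDelayMs,
// with full jitter applied when actually sleeping.
class RetryPolicy {
  constructor({ maxAttempts = 3, baseDelayMs = 100, maxDelayMs = 5000 } = {}) {
    this.maxAttempts = maxAttempts;
    this.baseDelayMs = baseDelayMs;
    this.maxDelayMs = maxDelayMs;
  }

  // Upper bound of the backoff window for a given attempt (0-indexed)
  delayFor(attempt) {
    return Math.min(this.baseDelayMs * 2 ** attempt, this.maxDelayMs);
  }

  async execute(operation) {
    let lastError;
    for (let attempt = 0; attempt < this.maxAttempts; attempt++) {
      try {
        return await operation();
      } catch (error) {
        lastError = error;
        if (attempt < this.maxAttempts - 1) {
          const jittered = Math.random() * this.delayFor(attempt); // full jitter
          await new Promise(resolve => setTimeout(resolve, jittered));
        }
      }
    }
    throw lastError; // retries exhausted — surfaces to the fallback handler
  }
}

const policy = new RetryPolicy({ maxAttempts: 4, baseDelayMs: 100 });
policy.delayFor(0); // 100
policy.delayFor(3); // 800
```

In the resilient client above, only errors classified as `RetryableError` should reach this loop; non-retryable failures should propagate immediately.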

Developer Experience and Velocity

Why Developer Experience Matters

Developer experience (DX) in fintech isn't just about convenience—it's about velocity, quality, and ultimately, competitive advantage. When engineers spend less time fighting tools and infrastructure, they spend more time building features that customers value.

The Compound Effect of Good DX:
- Faster onboarding: New engineers productive in days, not weeks
- Reduced context switching: Consistent tools and patterns across all services
- Fewer production issues: Better testing and debugging tools catch problems early
- Higher job satisfaction: Engineers enjoy working with well-designed systems
- Faster feature delivery: Less time spent on tooling means more time building

Bad DX Costs:
Industry surveys consistently link poor developer experience to substantial productivity loss, in some cases half or more of engineering capacity. In fintech, where regulatory requirements already slow development, poor DX can be the difference between market leadership and irrelevance.

Local Development Environments

Containerized Development with Docker Compose:

# docker-compose.dev.yml
version: '3.8'
services:
  postgres:
    image: postgres:14
    environment:
      POSTGRES_DB: fintech_dev
      POSTGRES_USER: dev
      POSTGRES_PASSWORD: devpass
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data
      - ./scripts/init-db.sql:/docker-entrypoint-initdb.d/init-db.sql

  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"

  core-banking-service:
    build: 
      context: ./services/core-banking
      dockerfile: Dockerfile.dev
    environment:
      DATABASE_URL: postgresql://dev:devpass@postgres:5432/fintech_dev
      REDIS_URL: redis://redis:6379
      LOG_LEVEL: debug
    ports:
      - "3001:3000"
    volumes:
      - ./services/core-banking:/app
      - /app/node_modules
    depends_on:
      - postgres
      - redis

  payment-service:
    build: 
      context: ./services/payments
      dockerfile: Dockerfile.dev
    environment:
      DATABASE_URL: postgresql://dev:devpass@postgres:5432/fintech_dev
      CORE_BANKING_URL: http://core-banking-service:3000
      PROCESSOR_API_URL: https://sandbox.galileo.com
    ports:
      - "3002:3000"
    depends_on:
      - core-banking-service

Development Scripts:

{
  "scripts": {
    "dev:setup": "docker-compose -f docker-compose.dev.yml up -d postgres redis && npm run db:migrate",
    "dev:start": "docker-compose -f docker-compose.dev.yml up",
    "dev:clean": "docker-compose -f docker-compose.dev.yml down -v",
    "dev:logs": "docker-compose -f docker-compose.dev.yml logs -f",
    "dev:reset": "npm run dev:clean && npm run dev:setup",
    "test:integration": "docker-compose -f docker-compose.test.yml up --abort-on-container-exit"
  }
}

Local Testing with Real Processor Sandboxes:

// config/development.js
module.exports = {
  processors: {
    galileo: {
      apiUrl: 'https://api-sandbox.galileo-ft.com',
      apiKey: process.env.GALILEO_SANDBOX_KEY,
      programId: process.env.GALILEO_SANDBOX_PROGRAM
    },
    marqeta: {
      apiUrl: 'https://sandbox-api.marqeta.com',
      username: process.env.MARQETA_SANDBOX_USER,
      password: process.env.MARQETA_SANDBOX_PASS
    }
  },
  features: {
    enableMockData: true,
    bypassKYC: true,
    enableDebugLogging: true
  }
};

Testing Strategies for Financial Systems

Test Pyramid for Financial Services:

Unit Tests (70% of tests):

// Example: Balance calculation unit tests
describe('AccountService', () => {
  describe('calculateAvailableBalance', () => {
    it('should calculate correct balance with pending transactions', () => {
      const account = {
        currentBalance: 1000.00,
        pendingDebits: [50.00, 25.00],
        pendingCredits: [100.00]
      };

      const availableBalance = AccountService.calculateAvailableBalance(account);

      expect(availableBalance).toBe(1025.00); // 1000 - 75 + 100
    });

    it('should handle overdraft scenarios correctly', () => {
      const account = {
        currentBalance: 100.00,
        pendingDebits: [150.00],
        pendingCredits: [],
        overdraftLimit: 500.00
      };

      const availableBalance = AccountService.calculateAvailableBalance(account);

      expect(availableBalance).toBe(450.00); // 100 - 150 + 500 overdraft
    });
  });
});

Integration Tests (20% of tests):

describe('Payment Processing Integration', () => {
  it('should successfully process end-to-end card transaction', async () => {
    // Given: A customer with sufficient balance
    const customer = await createTestCustomer({
      initialBalance: 500.00
    });

    // When: A card transaction is processed
    const transaction = {
      accountId: customer.accountId,
      amount: 25.99,
      merchant: {
        name: 'Test Coffee Shop',
        category: '5814' // Restaurant MCC
      }
    };

    const result = await PaymentService.processCardTransaction(transaction);

    // Then: Transaction should be approved and balance updated
    expect(result.approved).toBe(true);
    expect(result.authorizationCode).toBeDefined();

    const updatedBalance = await AccountService.getBalance(customer.accountId);
    expect(updatedBalance.available).toBe(474.01); // 500 - 25.99
  });
});

End-to-End Tests (10% of tests):

describe('Customer Onboarding Flow', () => {
  it('should complete full customer onboarding journey', async () => {
    const browser = await chromium.launch(); // Playwright — fill, setInputFiles, and textContent below are Playwright APIs
    const page = await browser.newPage();

    // Step 1: Customer starts application
    await page.goto('http://localhost:3000/signup');
    await page.fill('#firstName', 'John');
    await page.fill('#lastName', 'Doe');
    await page.fill('#email', '[email protected]');
    await page.click('button[type="submit"]');

    // Step 2: Identity verification
    await page.waitForSelector('#identity-verification');
    await page.setInputFiles('#drivers-license', 'test-files/sample-dl.jpg');
    await page.click('#verify-identity');

    // Step 3: Wait for KYC approval
    await page.waitForSelector('.kyc-approved', { timeout: 30000 });

    // Step 4: Account creation
    await page.click('#create-account');

    // Step 5: Verify account is created
    const accountNumber = await page.textContent('#account-number');
    expect(accountNumber).toMatch(/^\d{10}$/);

    await browser.close();
  });
});

Contract Testing for API Integration:
```javascript
// Using Pact for consumer-driven contract testing
const { Pact } = require('@pact-foundation/pact');

describe('Core Banking API Contract', () => {
  const provider = new Pact({
    consumer: 'payment-service',
    provider: 'core-banking-service'
  });

  it('should get account balance', async () => {
    await provider
      .given('account exists with balance')
      .uponReceiving('a request for account balance')
      .withRequest({
        method: 'GET',
        path: '/accounts/acc_123456/balance',
        headers: {
          'Authorization': 'Bearer token123'
        }
      })
      .willRespondWith({
        status: 200,
        headers: {
          'Content-Type': 'application/json'
        },
        body: {
          accountId: 'acc_123456',
          availableBalance: 1000.00,
          currency: 'USD'
        }
      });
  });
});
```

About the Author

Matt Anderton, Chief Technology Officer, Chisel

Matt brings over 15 years of experience building financial infrastructure systems, from high-frequency trading platforms to modern fintech architecture. As Chisel's CTO, Matt leads the technical vision for composable banking infrastructure that enables fintechs to own their technology stack without sacrificing velocity.

Prior to Chisel, Matt co-founded BluBox where he architected scalable payment systems processing billions in transaction volume. His experience spans distributed systems design, API architecture, microservices patterns, and developer experience optimization for financial services.

Matt is passionate about making complex infrastructure accessible to developers and frequently contributes to open-source projects related to financial technology. He holds a degree in Computer Science and has spoken at numerous fintech and developer conferences.

Connect with Matt:
- LinkedIn
- Learn more about Chisel's Technology


Published: January 2026 | Last Updated: January 2026
Reading Time: 18 minutes | Word Count: 3,800

Tags: #composable architecture #fintech API #microservices #banking infrastructure #developer experience

