
March 15, 2024

Introducing the VivaPay Smart Routing Engine v2.5

Our latest routing engine leverages real-time latency telemetry.

Overview

The VivaPay Smart Routing Engine v2.5 represents a fundamental shift in how we approach payment rail selection. Rather than relying on static routing tables, the engine continuously evaluates latency, success rate, and fee data across all connected rails in real time.

What's New in v2.5

ML-Powered Rail Selection

Our new gradient-boosted model processes over 40 signals per transaction — including merchant category, currency pair, time of day, and historical rail performance — to predict the optimal route before the transaction is even initiated.
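To make the flow concrete, here is a minimal sketch of feature assembly and rail ranking. All names (`build_features`, `score_rail`, `rank_rails`, the signal fields, and the toy scoring rule standing in for the real gradient-boosted model) are illustrative assumptions, not VivaPay's actual API:

```python
def build_features(txn, rail_stats):
    """Flatten transaction context and rail telemetry into one feature dict."""
    return {
        "merchant_category": txn["merchant_category"],
        "currency_pair": f'{txn["from_ccy"]}/{txn["to_ccy"]}',
        "hour_of_day": txn["hour"],
        "rail_success_rate": rail_stats["success_rate"],
        "rail_p50_latency_ms": rail_stats["p50_latency_ms"],
    }

def score_rail(features):
    """Stand-in for the gradient-boosted model: higher score = better route.

    The real model consumes 40+ signals; this toy rule just rewards high
    success rates and penalizes latency.
    """
    return features["rail_success_rate"] - 0.001 * features["rail_p50_latency_ms"]

def rank_rails(txn, telemetry):
    """Return rail names, best first."""
    scored = [
        (score_rail(build_features(txn, stats)), rail)
        for rail, stats in telemetry.items()
    ]
    return [rail for _, rail in sorted(scored, reverse=True)]

txn = {"merchant_category": "5812", "from_ccy": "USD", "to_ccy": "EUR", "hour": 14}
telemetry = {
    "rail_a": {"success_rate": 0.998, "p50_latency_ms": 120},
    "rail_b": {"success_rate": 0.991, "p50_latency_ms": 45},
}
print(rank_rails(txn, telemetry))  # → ['rail_b', 'rail_a']
```

The key idea is that scoring happens per transaction, so two transactions with different currency pairs or merchant categories can legitimately rank the same rails differently.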

Sub-12ms Routing Decisions

Through a combination of in-memory caching and predictive pre-loading, we've reduced average routing decision time from 18ms to 11.2ms — a 38% improvement.
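The caching half of that optimization can be sketched with a plain TTL cache; the class below is a hypothetical stand-in for the engine's decision cache, not its real implementation:

```python
import time

class TtlCache:
    """Minimal in-memory cache whose entries expire after a fixed TTL."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            # Entry is stale: evict it and report a miss.
            del self._store[key]
            return None
        return value

    def put(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

# Cache a ranked rail list keyed by (currency pair, merchant category).
cache = TtlCache(ttl_seconds=0.5)
cache.put(("USD/EUR", "5812"), ["rail_b", "rail_a"])
print(cache.get(("USD/EUR", "5812")))  # → ['rail_b', 'rail_a']
```

A short TTL matters here: cached decisions must expire at least as fast as the telemetry refreshes, or the cache would reintroduce the stale-data problem the engine is designed to avoid.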

Adaptive Fallback Chains

When a primary rail experiences degraded performance, the engine immediately re-routes through the next best option with zero manual intervention required.
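The fallback logic amounts to walking the ranked list until a healthy rail accepts the transaction. A minimal sketch, where `submit` and `is_degraded` are hypothetical hooks into transaction submission and live telemetry:

```python
def execute_with_fallback(ranked_rails, submit, is_degraded):
    """Try rails best-first, skipping any rail currently flagged as degraded.

    `submit(rail)` attempts the transaction and returns True on success;
    `is_degraded(rail)` consults live telemetry. Both are illustrative hooks.
    """
    for rail in ranked_rails:
        if is_degraded(rail):
            continue  # skip without attempting; telemetry says it's unhealthy
        if submit(rail):
            return rail
    raise RuntimeError("all rails exhausted")

# Toy example: rail_b is degraded, so the chain falls through to rail_a.
result = execute_with_fallback(
    ["rail_b", "rail_a"],
    submit=lambda rail: True,
    is_degraded=lambda rail: rail == "rail_b",
)
print(result)  # → rail_a
```

Because the degraded check runs at submission time rather than at ranking time, a rail that fails between the routing decision and the submission is still skipped with no manual intervention.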

Technical Architecture

The engine runs as a stateless microservice, receiving transaction context and emitting a ranked rail list. This design means horizontal scaling is trivial — we simply add more engine instances behind our load balancer.
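Statelessness means the handler is a pure function of the request plus the latest telemetry snapshot. The request and response shapes below are illustrative assumptions, not VivaPay's wire format:

```python
import json

def handle_route_request(payload_json, telemetry):
    """Stateless handler: transaction context in, ranked rail list out.

    No per-instance state is read or written, so any engine instance
    behind the load balancer can serve any request.
    """
    txn = json.loads(payload_json)
    # Simplified ranking by success rate; the real engine scores many signals.
    ranked = sorted(
        telemetry,
        key=lambda rail: telemetry[rail]["success_rate"],
        reverse=True,
    )
    return json.dumps({"transaction_id": txn["transaction_id"], "rails": ranked})

response = handle_route_request(
    '{"transaction_id": "t1"}',
    {"rail_a": {"success_rate": 0.990}, "rail_b": {"success_rate": 0.998}},
)
print(response)  # → {"transaction_id": "t1", "rails": ["rail_b", "rail_a"]}
```

Since nothing here depends on which instance handled the previous request, scaling out is just a matter of running more copies of this function behind the load balancer.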

Rail performance data is aggregated every 500ms via our telemetry pipeline, ensuring routing decisions are based on current network conditions rather than stale data.
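A 500ms aggregation window can be sketched as collapsing raw per-transaction events into per-rail stats. The event fields and function name here are illustrative, not the actual pipeline schema:

```python
from collections import defaultdict

def aggregate_window(events):
    """Collapse raw transaction events from one window into per-rail stats."""
    stats = defaultdict(lambda: {"count": 0, "ok": 0, "latency_sum": 0.0})
    for e in events:
        s = stats[e["rail"]]
        s["count"] += 1
        s["ok"] += 1 if e["success"] else 0
        s["latency_sum"] += e["latency_ms"]
    # Reduce the running totals to the rates the router actually consumes.
    return {
        rail: {
            "success_rate": s["ok"] / s["count"],
            "avg_latency_ms": s["latency_sum"] / s["count"],
        }
        for rail, s in stats.items()
    }

events = [
    {"rail": "rail_a", "success": True, "latency_ms": 100},
    {"rail": "rail_a", "success": False, "latency_ms": 300},
]
print(aggregate_window(events))
# → {'rail_a': {'success_rate': 0.5, 'avg_latency_ms': 200.0}}
```

Running this reduction every 500ms keeps the telemetry snapshot the router reads at most half a second behind live network conditions.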

Results in Production

In the first 30 days since v2.5 went live across all customer accounts:

  • **Transaction failure rate** dropped from 0.16% to 0.04%
  • **Average settlement latency** decreased by 22%
  • **Fee optimization savings** averaged 1.8% of transaction volume

Getting Started

If you're already integrated with VivaPay, v2.5 is live for your account today, with no code changes required. Check your dashboard for the updated routing analytics view.

For new integrations, head to our [documentation](/docs) to get started.