
ngrok vs Cloudflare Tunnel vs LivePort: Honest Comparison with Real Benchmarks (2026)

I ran all three against the same test server. Here are the actual numbers — setup time, latency, variance, and the gotcha Cloudflare's docs don't mention.

By David Dundas, Builder at Derivative Labs · 10 min read


For AI agent workflows (Claude Code, Cursor, Cline), LivePort is the right choice — it's the only tool with a native waitForTunnel() SDK method, and the only one that gives every free-tier user a persistent subdomain. For large-payload performance, Cloudflare Tunnel wins. For brand recognition and ecosystem breadth, ngrok wins. The decision comes down to what you're actually building.

I ran all three against the same Bun test server — two endpoints, multiple request sizes — and recorded everything. Cloudflare Tunnel is faster for large payloads. LivePort is more consistent. ngrok charges you for the features that matter. All three add roughly 150-250ms over direct localhost, which is fine for development.

Let's get into it.


The Setup Test

Before benchmarking, I measured time-to-working-tunnel from a cold start.

ngrok

```shell
ngrok http 4567
```

| Step | Notes |
| --- | --- |
| Account + auth token | One-time setup |
| Run command | Immediate |
| URL assigned | Rotating — new URL every restart |
| Time to working tunnel | ~30 seconds (first time), ~5 seconds after |

The free tier gotcha: Every request through a free ngrok tunnel gets an interstitial page — a browser warning asking the visitor to confirm before continuing. This breaks any automated consumer: an agent, a webhook handler, or a curl call hits the warning page instead of your app. You can bypass it with the `ngrok-skip-browser-warning: true` header, but you have to know the header exists.
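If you're hitting a free ngrok tunnel from a script, the bypass is just one extra request header. A minimal sketch — the URL below is a placeholder, not a real tunnel:

```typescript
// Placeholder free-tier ngrok URL -- substitute your own tunnel's hostname.
const NGROK_URL = "https://example-1234.ngrok-free.app";

// Merge the interstitial-skip header into whatever headers the request
// already has. ngrok only checks that the header is present; "true" is
// the conventional value.
function withSkipHeader(
  headers: Record<string, string> = {},
): Record<string, string> {
  return { ...headers, "ngrok-skip-browser-warning": "true" };
}

// Usage:
// await fetch(`${NGROK_URL}/api/test`, {
//   headers: withSkipHeader({ Accept: "application/json" }),
// });
```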

Cost if you want persistent URLs + no interstitial: $10/month (Hobbyist tier).


Cloudflare Tunnel (Quick Mode)

```shell
cloudflared tunnel --url http://localhost:4567
```

| Step | Notes |
| --- | --- |
| Install cloudflared | Homebrew or direct download |
| Run command | |
| URL assigned | Random subdomain on trycloudflare.com — rotates on restart |
| Time to working tunnel | ~3 minutes, including debugging the config conflict below |

The gotcha nobody documents: If you've ever run cloudflared tunnel login or created a named tunnel before, the quick tunnel command silently fails. It picks up old credentials from ~/.cloudflared/ and tries to route traffic to a dead named tunnel instead of a fresh quick tunnel. Returns 404 with no error message. I hit this firsthand — took three attempts and moving old config files out of the way before it worked.

This affects anyone who's ever experimented with CF Tunnel's more powerful named-tunnel mode and then tries to use quick mode. It's a real trap.

Once it works: free, no bandwidth limit, no interstitial, URL is public and persistent per session (changes on restart, same as ngrok free).


LivePort

```shell
liveport connect 4567
```

| Step | Notes |
| --- | --- |
| Install CLI + create account | One-time setup |
| Generate a Bridge Key | One-time per project |
| Run command | |
| URL assigned | Persistent subdomain — free for all users |
| Time to working tunnel | ~8 seconds |

Worked first attempt. No config conflict, no warning page, no URL rotation with a Bridge Key.


Setup Summary

| | ngrok | CF Quick Tunnel | LivePort |
| --- | --- | --- | --- |
| Account required | Yes | No | Yes |
| Worked first try | Yes | No (config conflict) | Yes |
| Time to working | ~30s (first time) | ~3min (with debugging) | ~8s |
| URL rotates on restart | Yes | Yes | No — persistent subdomains free for all users |
| Interstitial page | Yes (free) | No | No |

The Performance Benchmarks

Test environment: macOS Darwin 23.5.0, Bun runtime, residential internet in Austin TX, CF edge at dfw01 (Dallas).

Test server: Two endpoints — /api/test (~100 bytes JSON) and /api/large (~56KB JSON array).


Small Response (100 bytes)

5 requests each:

| | Avg | Min | Max | Variance |
| --- | --- | --- | --- | --- |
| Direct (localhost) | 3.3ms | 0.5ms | 10ms | Low |
| CF Tunnel | 177ms | 102ms | 390ms | High |
| LivePort | 191ms | 183ms | 203ms | Very low |

CF Tunnel averaged 14ms faster, but the first request was 390ms — a cold edge connection. After warmup it stabilized at 102-145ms. LivePort stayed in a 183-203ms band across all five requests.

For automated consumers (agents, CI pipelines), predictability often matters more than raw average speed. If your agent makes five API calls through the tunnel, LivePort's consistency means you can set reliable timeouts. CF Tunnel's first-request spike can cause false failures.
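To make that concrete, here is one simple heuristic (my sketch, not anything either tool ships) for turning observed latencies into a timeout: take the worst sample and add a safety margin. The individual samples below are illustrative values consistent with the min/max bands measured above.

```typescript
// Derive a request timeout from observed tunnel latencies:
// worst observed sample plus a fixed safety margin.
function timeoutFromSamples(samplesMs: number[], marginMs = 100): number {
  return Math.max(...samplesMs) + marginMs;
}

// LivePort's tight band yields a tight, reliable timeout:
const liveportTimeout = timeoutFromSamples([191, 183, 187, 203, 195]); // 303
// CF Tunnel's cold-start spike forces a much looser one:
const cfTimeout = timeoutFromSamples([390, 102, 119, 145, 130]); // 490
```

A p95- or p99-based version would behave similarly; the point is that low variance lets you set a timeout close to typical latency without risking false failures.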


Large Response (56KB)

3 requests each:

| | Avg | Overhead vs direct |
| --- | --- | --- |
| Direct (localhost) | 17ms | baseline |
| CF Tunnel | 164ms | 9.6x |
| LivePort | 365ms | 21.5x |

Cloudflare wins here — about 2.2x faster than LivePort for large payloads. Cloudflare's edge network and QUIC protocol are purpose-built for efficient data transfer. If you're serving full HTML pages, large JSON responses, or file downloads through the tunnel, CF Tunnel will feel meaningfully snappier.


Connection Timing (curl -w, single request)

| Phase | CF Tunnel | LivePort |
| --- | --- | --- |
| DNS lookup | 25ms | 3ms |
| TCP connect | 39ms | 42ms |
| TLS handshake | 60ms | 85ms |
| Time to first byte | 259ms | 216ms |

LivePort's TTFB is actually faster (216ms vs 259ms), driven by faster DNS. The large-payload disadvantage is in the transfer phase — CF's edge infrastructure moves bytes faster once the connection is established.


What This Means in Practice

For typical development use:

  • API testing (your agent calls GET /api/users) — both are fine. The 14ms average difference is imperceptible.
  • Webhook testing (Stripe, GitHub sending POST requests) — CF Tunnel is slightly faster per request, but both work perfectly.
  • Full page loads / large assets — CF Tunnel is meaningfully faster. Use it for this.
  • Frequent restarts — CF Tunnel's cold-start spike (390ms first request) compounds if you're restarting often. LivePort is consistent from request one.
  • Automated consumers — LivePort's low variance is better for agents and CI. CF Tunnel's first-request spike can cause false timeouts.

The Use Case That Changes Everything

The benchmarks above assume a human developer who manually starts a tunnel, copies the URL, and pastes it somewhere.

That workflow breaks for AI agents.

Here's the problem: Claude Code, Cursor, and AI coding agents cannot access localhost. When your agent needs to test your running app, it has three options:

  1. Screenshots — "I'm losing brain cells taking screenshots of my UI and prompting Claude Code to look at them" (actual Reddit comment, r/ClaudeAI, 42+ upvotes on that thread)
  2. Background process hacks — run the dev server with &, hope it doesn't orphan, give the agent a localhost URL that still won't work
  3. Tunnel the agent doesn't know about — you manually start ngrok, manually copy the URL, manually paste it into context. Every session.

LivePort's Agent SDK is built specifically for option 4 — the one that actually works:

```typescript
import { LivePortAgent } from '@liveport/agent-sdk';

const agent = new LivePortAgent({ key: 'lpk_xxx' });

// Agent blocks here — waiting for the developer to run:
// liveport connect 3000 --key lpk_xxx
const tunnel = await agent.waitForTunnel({ timeout: 30000 });
// tunnel: { tunnelId, subdomain, url, localPort, createdAt, expiresAt }

// Agent unblocks with a live URL. Tests run. No manual copy-paste.
await fetch(`${tunnel.url}/api/health`);
```

Neither ngrok nor Cloudflare Tunnel has anything like waitForTunnel(). They give you a URL — you still have to coordinate getting it to the agent yourself.
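The closest you can get with the other two is a DIY poll loop against a URL you obtained out of band, which is exactly the coordination step being avoided. A rough sketch of that workaround:

```typescript
// DIY "wait for tunnel": poll a known URL until it answers, or give up.
// Assumes you already know the tunnel URL -- the manual copy-paste step
// this approach cannot eliminate.
async function pollUntilUp(
  url: string,
  timeoutMs = 30_000,
  intervalMs = 1_000,
): Promise<void> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    try {
      const res = await fetch(url);
      if (res.ok) return; // tunnel is live
    } catch {
      // connection refused / DNS failure: tunnel not up yet
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`Tunnel at ${url} not reachable within ${timeoutMs}ms`);
}
```

This works, but the agent still has no way to learn the URL by itself; something or someone has to hand it over first.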

Bridge Keys add another layer: each key is scoped — you set expiry (1h/6h/24h), a max number of uses, and which port it can reach. Your agent gets exactly the access it needs, nothing more.
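The enforcement model is easy to picture. The sketch below is a hypothetical reconstruction of those three constraints; the field names are illustrative and not the actual LivePort API.

```typescript
// Hypothetical model of a scoped Bridge Key -- NOT the real LivePort API.
// Captures the three constraints described above: expiry, max uses, port.
interface BridgeKeyScope {
  expiresIn: "1h" | "6h" | "24h";
  maxUses: number;
  port: number;
}

const HOURS: Record<BridgeKeyScope["expiresIn"], number> = {
  "1h": 1,
  "6h": 6,
  "24h": 24,
};

// A request passes only if the key is unexpired, under its use limit,
// and aimed at the one port the key was scoped to.
function isRequestAllowed(
  scope: BridgeKeyScope,
  usesSoFar: number,
  targetPort: number,
  issuedAt: Date,
  now: Date,
): boolean {
  const ageMs = now.getTime() - issuedAt.getTime();
  const expired = ageMs > HOURS[scope.expiresIn] * 3_600_000;
  return !expired && usesSoFar < scope.maxUses && targetPort === scope.port;
}
```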


Cursor Blocks Localhost — A Separate Problem

One more use case worth naming: Cursor explicitly blocks localhost as a valid URL in its API settings field. If you're connecting Cursor to a local LLM endpoint, a local tool server, or any localhost service, you're already using ngrok as a workaround.

ngrok works, but the free tier gives you a new URL every restart. LivePort now includes persistent subdomains on every plan including free — you configure Cursor once and it reconnects on every restart without touching the URL.


When to Use Each

| Use case | Best choice | Why |
| --- | --- | --- |
| Quick webhook test (Stripe, GitHub) | ngrok | Most tutorials reference it, works immediately |
| Production-grade public URL | Cloudflare Tunnel | Free, permanent, edge performance |
| Team access to local service | Tailscale Funnel | VPN model means you control who can reach it |
| Self-hosted, full control | Pangolin | Open source, your server, no vendor dependency |
| Large file / asset serving | CF Tunnel | Edge network is purpose-built for this |
| AI agent testing (Claude Code, Cursor) | LivePort | waitForTunnel() is the only agent-native solution |
| Cursor connected to local LLM | LivePort | Stable subdomain survives restarts |
| Webhook debugging with MCP | Hooklistener | Purpose-built for this, MCP-native |

The Honest Verdict

ngrok is the incumbent for good reason — it works, it's documented everywhere, and the paid tier ($10/mo) removes all the friction. If you're already paying or just need a quick webhook test, it's fine.

Cloudflare Tunnel is the best free option for most developers who need a permanent URL and aren't specifically building AI agent workflows. The config conflict gotcha is real but a one-time problem. Once it works, it works well — and Cloudflare's edge network is genuinely fast for large payloads.

LivePort wins on two things: the AI agent use case, and URL stability on the free tier. Persistent subdomains are now included on every plan — including free — which means LivePort free tier beats ngrok free tier on the one thing developers complain about most (rotating URLs). Pricing beyond the free tier: $0.018/hour + $0.05/GB. ngrok Pro is $20/month flat; moderate LivePort usage runs $4-5/month. waitForTunnel() solves a coordination problem that neither ngrok nor Cloudflare Tunnel addresses at all. If you're using Claude Code, Cursor, or any MCP-compatible agent to build and test your app, this is the tool the workflow was built around.
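For reference, the metered arithmetic behind that estimate. The 200-hour / 10 GB usage profile is my assumption of what "moderate" looks like, not a figure from LivePort:

```typescript
// LivePort metered pricing quoted above: $0.018/tunnel-hour + $0.05/GB.
function monthlyCost(tunnelHours: number, gbTransferred: number): number {
  return tunnelHours * 0.018 + gbTransferred * 0.05;
}

// Assumed moderate profile: 200 tunnel-hours and 10 GB in a month.
const estimate = monthlyCost(200, 10); // ≈ $4.10
```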


Try It

```shell
# Install LivePort CLI
npm install -g @liveport/cli
```

Free tier — no credit card required. 2 concurrent tunnels, 1GB/month, Bridge Keys and the Agent SDK included.

MCP server: Add @liveport/mcp to your MCP config and any MCP-compatible agent can create and manage tunnels via tool call — no CLI setup required on the agent side.

```json
{
  "mcpServers": {
    "liveport": {
      "command": "npx",
      "args": ["@liveport/mcp"]
    }
  }
}
```