> field manual · the confusion

Latency vs Bandwidth: Why Your Gigabit Connection Feels Slow

The internet advertising industry spent 25 years training consumers to think of "speed" as a single number in Mbps. That number measures bandwidth — the capacity of the pipe. It does not measure latency — how fast an individual packet can cross the pipe and get back. For almost everything humans actually do online, latency matters more. This page is the factual breakdown of the difference and why getting it wrong leaves people paying for gigabit plans that feel slow.

PUBLISHED 2026-04-22 · ~7 MIN READ · FIELD NOTES FROM Y2KDASH

The two-variable picture

A network connection has two fundamental properties. Think of the connection as a tunnel:

  1. Bandwidth is how wide the tunnel is: the volume of data that can move through per second, measured in Mbps or Gbps.
  2. Latency is how long the tunnel is: the time a single packet needs to cross it and come back, measured in milliseconds.

These two properties are independent. A tunnel can be wide and long (high bandwidth, high latency — typical of satellite internet). A tunnel can be narrow and short (low bandwidth, low latency — typical of DSL to a nearby ISP). Improving one doesn't automatically improve the other. Critically, upgrading a plan's Mbps tier only makes the tunnel wider, not shorter.

A 1 Gbps plan and a 100 Mbps plan on the same physical infrastructure typically have identical latency. The only thing that changes is throughput ceiling.
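The two numbers combine in a simple first-order transfer-time model: total time ≈ RTT + size / bandwidth. A quick sketch (the 20 ms RTT and plan tiers are illustrative, and the model ignores TCP ramp-up):

```python
def transfer_time_s(size_bits: float, bandwidth_bps: float, rtt_s: float) -> float:
    """First-order model: one round trip to get started, then the pipe drains at line rate."""
    return rtt_s + size_bits / bandwidth_bps

RTT = 0.020  # 20 ms: set by distance and hops, not by plan tier

# A small web response (100 kB) barely notices a 10x wider pipe...
small = 100 * 8e3  # bits
print(f"{transfer_time_s(small, 100e6, RTT) * 1000:.1f} ms at 100 Mbps")  # 28.0 ms
print(f"{transfer_time_s(small, 1e9, RTT) * 1000:.1f} ms at 1 Gbps")      # 20.8 ms

# ...while a 50 GB download is almost pure bandwidth.
big = 50 * 8e9  # bits
print(f"{transfer_time_s(big, 100e6, RTT) / 60:.1f} min at 100 Mbps")  # 66.7 min
print(f"{transfer_time_s(big, 1e9, RTT) / 60:.1f} min at 1 Gbps")      # 6.7 min
```

The small transfer is dominated by the RTT term, the big one by the size/bandwidth term — which is the whole bulk-vs-interactive split in one formula.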

What each property actually determines

| activity | what matters | typical threshold |
|---|---|---|
| 4K Netflix / YouTube | bandwidth | 25 Mbps sustained |
| Zoom HD video call (two people) | bandwidth + latency | 3 Mbps + sub-150 ms |
| Competitive FPS gaming | latency | under 30 ms; 5 Mbps plenty |
| Cloud gaming (GeForce Now) | both | 40 Mbps + sub-40 ms |
| Web browsing | latency-dominated | sub-100 ms; bandwidth over 10 Mbps hardly matters |
| File download | bandwidth | whatever bandwidth your plan has |
| File upload to cloud | bandwidth (upload tier) | upload Mbps |
| Voice call (Discord, phone) | latency + jitter | sub-80 ms; 64 kbps is enough |
| SSH / remote terminal | latency | under 50 ms for comfort |
| Multi-user household streaming | bandwidth | 25 Mbps × streams |

The pattern: bulk data transfer is bandwidth-bound (downloads, streaming, backups). Interactive activity is latency-bound (gaming, voice, browsing, terminal work). Most human time online is interactive. Therefore, for most people's day-to-day experience, latency matters more than bandwidth — as long as bandwidth exceeds a modest threshold (25-50 Mbps for a household).
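To make the browsing row concrete: a page load is a chain of dependent round trips (DNS, TCP, TLS, the HTML, then the assets it references), so halving latency can beat a tenfold bandwidth upgrade. A rough sketch, where the 8 round trips and 2 MB page size are assumptions rather than measurements:

```python
def page_load_s(rtt_s: float, bandwidth_bps: float,
                dependent_round_trips: int = 8, page_bits: float = 2 * 8e6) -> float:
    """Crude model: dependent requests serialize into round trips,
    then the page bytes flow at line rate."""
    return dependent_round_trips * rtt_s + page_bits / bandwidth_bps

print(f"{page_load_s(0.080, 100e6):.2f} s")  # 80 ms RTT, 100 Mbps -> 0.80 s
print(f"{page_load_s(0.080, 1e9):.2f} s")    # 10x the bandwidth   -> 0.66 s
print(f"{page_load_s(0.040, 100e6):.2f} s")  # half the latency    -> 0.48 s
```

Under these assumptions the latency improvement wins even though the bandwidth upgrade is far larger on paper.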

Why "fast internet feels slow"

The most common version of this complaint: someone upgrades from a 300 Mbps plan to a 1 Gbps plan and reports that the new plan "doesn't feel any faster." Reason: the things they spend most of their time on (web browsing, video calls, game sessions) were never bandwidth-limited on the old plan. The 300 Mbps plan had more than enough bandwidth for their actual use. The bottleneck was somewhere else — usually latency, often bufferbloat — and upgrading bandwidth didn't address it.

The only activities where the upgrade shows up are pure bandwidth tasks: downloading a 50 GB game completes in a quarter the time; uploading a 4K home video to YouTube finishes faster. For everything interactive, the experience is identical because the interactive components are latency-bound and latency was unchanged.
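The "quarter the time" figure is just the bandwidth ratio at work; the arithmetic:

```python
size_bits = 50 * 8e9              # 50 GB game, in bits
old_min = size_bits / 300e6 / 60  # at 300 Mbps
new_min = size_bits / 1e9 / 60    # at 1 Gbps
print(f"{old_min:.1f} min -> {new_min:.1f} min ({old_min / new_min:.1f}x faster)")
# 22.2 min -> 6.7 min (3.3x faster)
```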

What determines latency in the real world

Latency, in decreasing order of leverage, is determined by:

  1. Physical distance to the server (propagation delay sets the floor).
  2. Last-mile access technology (fiber vs cable vs DSL vs satellite).
  3. Queuing delay under load (bufferbloat in the modem or router).
  4. Number of network hops and routing between you and the destination.
  5. The local link: WiFi vs wired Ethernet.

Note that your ISP plan tier is not in this list. Switching from 100 Mbps to 1 Gbps on the same ISP, same physical infrastructure, same region, doesn't change any of the five variables that determine latency.

The right way to think about internet speed

The honest mental model for a home internet connection has two sliders, not one. The bandwidth slider determines the ceiling for bulk data. The latency slider determines the floor for interactive experience. Both matter, but the bandwidth slider reaches "enough" well below what most modern plans offer (25-50 Mbps is plenty for a small household). The latency slider has no such ceiling — better latency always improves interactive experience.

When evaluating a connection or an upgrade, ask both questions:

  1. Do I have enough bandwidth for what I actually do? Streamers + gamers + multiple devices: yes, 300+ Mbps helps. Single household, no heavy simultaneous 4K streaming: 50-100 Mbps is usually fine.
  2. How does my latency behave under real use? Run Y2KDASH for 10 minutes while doing what you normally do. If loaded latency stays under 80 ms and jitter stays under 10 ms, the connection is solid. If loaded latency spikes to 200+ ms whenever anything saturates the link, the problem is bufferbloat, not plan tier, and no bandwidth upgrade fixes it.
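If you want to script a rough stand-in for a loaded-latency test, timing repeated TCP handshakes approximates round-trip time closely enough to spot bufferbloat spikes. A minimal sketch (the target host, port, and sample count are illustrative; run it once idle and once while a large download saturates the link):

```python
import socket
import statistics
import time

def sample_rtts_ms(host: str, port: int = 443, n: int = 20) -> list[float]:
    """RTT proxy: a TCP three-way handshake costs roughly one round trip."""
    rtts = []
    for _ in range(n):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass  # connected; we only wanted the handshake timing
        rtts.append((time.perf_counter() - start) * 1000)  # ms
        time.sleep(0.25)
    return rtts

def summarize(rtts: list[float]) -> str:
    return (f"median {statistics.median(rtts):.0f} ms, "
            f"jitter {statistics.stdev(rtts):.0f} ms, "
            f"worst {max(rtts):.0f} ms")

# Usage (hits the network): print(summarize(sample_rtts_ms("example.com")))
# Median under ~80 ms with jitter under ~10 ms while the link is busy is solid;
# spikes past 200 ms under load point at bufferbloat, not at the plan tier.
```

This is a sketch, not a substitute for a proper continuous monitor: TCP connect time includes a little kernel overhead, and a single destination can be unrepresentative.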

When bandwidth genuinely matters

Bandwidth matters when a household hits its ceiling. Symptoms: multiple simultaneous 4K streams buffering; a Steam download slowing the whole network to a crawl; cloud backups taking all night. If these specific patterns are happening regularly, an upgrade to a higher bandwidth tier is justified. Run a speed test during peak household use to confirm the ceiling is actually being hit — often the apparent bandwidth bottleneck is actually bufferbloat mimicking one.

For everyone else, the money spent on a gigabit upgrade is often better spent on a router with CAKE queue management ($150-300), wired Ethernet to key devices, or a WiFi 6E or WiFi 7 access point in the right location. These changes improve latency and consistency, which are usually the actual bottleneck.

FAQ

What's the difference between latency and bandwidth?

Bandwidth is how much data can move per second, measured in Mbps. Latency is how long a single round trip takes, measured in milliseconds. Bandwidth determines whether a 4K stream buffers; latency determines whether a video call feels natural. The two are independent: a connection can have high bandwidth and high latency, or low bandwidth and low latency.

Does upgrading to gigabit lower my latency?

No. Upgrading 300 Mbps to 1 Gbps increases capacity but does not change latency. Latency is set by physical distance, number of network hops, and queuing delays. None of these change when the plan tier changes.

Which matters more for gaming, latency or bandwidth?

Latency by a wide margin. Competitive games exchange tiny packets many times per second. The 5 Mbps of bandwidth a game uses is trivial. The 20 ms vs 80 ms of latency determines whether the server registers a shot before or after an opponent.

Which matters more for streaming, latency or bandwidth?

Bandwidth, for most streaming. Netflix 4K needs about 25 Mbps sustained; YouTube 4K is similar. As long as bandwidth exceeds the stream's bitrate, latency barely matters because players buffer several seconds of content ahead. Low-latency streaming (live sports) is more latency-sensitive, but even there the threshold is a lenient ~200 ms.

What is a good latency?

Under 30 ms to same-region servers is excellent. 30-60 ms is good. 60-100 ms is acceptable for most uses. Over 100 ms is noticeable in real-time applications. Physical distance sets the floor — cross-continent traffic is always above 70 ms. See What's a Good Ping for thresholds by use case.

measure both, not just bandwidth

Y2KDASH samples bandwidth AND latency AND jitter AND packet loss continuously. The full picture, not just the number your ISP wants to show.

> LAUNCH Y2KDASH →

Related: What's a Good Ping? · The Speed Test Lie · How to Fix Bufferbloat · Glossary