How Locally Embedded Caches Fix the Capacity Problem for ISPs (and Why Content Providers Should Care)
Introduction
Two Challenges for ISPs
- Time-to-first-byte increases: slower start-up times for the viewer.
- Rebuffering rate increases: the spinning ball shows up more often and stays longer.
- Average bitrate goes down: lower picture resolution for the viewer (e.g., getting a standard-definition version when you expect 1080p) and dropped packets (e.g., the stream cuts off and you have to press play again).
What About Content Providers?
How Upstream Capacity Impacts QoE
- An upstream network provides an ISP with 10Gbps capacity.
- The ISP has 10,000 subscribers.
- Twenty-five percent of those subscribers decide to stream a popular sporting event, like an NFL football game, at the start of the game.
- Because the ISP offers symmetrical gig fiber, and players always try to get the highest possible bitrate, these 2,500 subscribers are all requesting the same 4.5Mbps stream simultaneously.
But it doesn’t stop there:
- Another 10% of subscribers join about 30 minutes into the game.
- Thankfully, there is still some capacity left, but only because the initial demand of roughly 11.25 Gbps (2,500 x 4.5 Mbps) exceeded the original 10 Gbps of capacity (especially given that other subscribers were requesting other internet content that also needed to be backhauled from the upstream provider). The players responded by requesting lower bitrates (and buffering). Let's say they all dropped down to 2.5 Mbps, or 6.25 Gbps of video. That leaves about 1 Gbps of capacity.
- But these 1,000 new subscribers hit the highest bitrate first, requesting 4.5 Gbps of content.
- The result is slower load times for all subscribers: the newcomers' players immediately downshift to lower bitrates, and everyone else drops another bitrate step or two.
- Of course, this whole situation continues to fluctuate, not only as people drop their stream or request the live feed, but also as other subscribers engage with other internet content. It's a mess.
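The arithmetic in the scenario above can be sketched as a toy model. The figures are the illustrative numbers from the example, and the non-video backhaul load is an assumed value chosen so that roughly 1 Gbps of headroom remains:

```python
# Toy model of the upstream-capacity squeeze described above.
# All figures are illustrative, not measurements.

CAPACITY_GBPS = 10.0
OTHER_TRAFFIC_GBPS = 2.75   # assumed non-video backhaul load

def demand_gbps(viewers: int, bitrate_mbps: float) -> float:
    """Aggregate streaming demand in Gbps."""
    return viewers * bitrate_mbps / 1000

# Kickoff: 2,500 subscribers all request the 4.5 Mbps rendition.
kickoff = demand_gbps(2500, 4.5)          # 11.25 Gbps -> exceeds 10 Gbps
print(kickoff)

# Players downshift to 2.5 Mbps to fit under the cap.
video = demand_gbps(2500, 2.5)            # 6.25 Gbps
headroom = CAPACITY_GBPS - video - OTHER_TRAFFIC_GBPS
print(headroom)                           # about 1 Gbps left

# 30 minutes in, 1,000 more viewers join at the top bitrate.
late_joiners = demand_gbps(1000, 4.5)     # 4.5 Gbps, far more than the headroom
print(late_joiners)
```

Even in this simplified model, the late joiners alone demand more than four times the remaining headroom, which is why every player on the link ends up downshifting.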
How Caching Fixes The Capacity Problem (Without More Capacity)
The Benefits of Embedded Caches
- Request collapsing: many simultaneous requests for the same content are served with a single upstream fetch, so one copy of each segment crosses the backhaul link instead of thousands.
- Higher average bitrate: with demand no longer constrained by the upstream link, players can sustain the highest rendition.
- Less buffering time: segments are served from inside the network, so players keep their buffers full even during peak demand.
- Faster startup time: the first byte comes from a nearby cache rather than a distant origin, cutting time-to-first-byte.
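Request collapsing is the key mechanism here. A minimal sketch of the idea, in Python with threads (the class and names are illustrative, not Netskrt's actual implementation):

```python
import threading

class CollapsingCache:
    """Request-collapsing cache sketch: concurrent misses for the same
    key wait on a single in-flight origin fetch instead of each going
    upstream separately."""

    def __init__(self, fetch_from_origin):
        self._fetch = fetch_from_origin
        self._lock = threading.Lock()
        self._cache = {}      # key -> cached value
        self._inflight = {}   # key -> Event signalling fetch completion

    def get(self, key):
        with self._lock:
            if key in self._cache:            # cache hit: no backhaul at all
                return self._cache[key]
            event = self._inflight.get(key)
            if event is None:                 # first miss: this caller fetches
                event = threading.Event()
                self._inflight[key] = event
                leader = True
            else:                             # a fetch is already in flight
                leader = False
        if leader:
            value = self._fetch(key)          # the one upstream request
            with self._lock:
                self._cache[key] = value
                del self._inflight[key]
            event.set()
            return value
        event.wait()                          # ride the leader's fetch
        with self._lock:
            return self._cache[key]
```

With 2,500 players requesting the same segment at kickoff, a cache like this turns 2,500 upstream fetches into one.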
Small Investment, Big Impact
Here’s the good news: investing in locally embedded caches won’t break the bank for rural ISPs, and Content Providers can partner with ISPs already utilizing embedded caching today.
For ISPs looking to implement a locally embedded cache, there are options. ISPs that want to be in the business of managing a CDN can build and implement their own caches, using open source software like Varnish or NGINX. Now, ISPs and Content Providers have another option: partnering with a technology solution like Netskrt. With Netskrt, ISPs can implement a last-mile CDN to deliver the high-quality streaming video that subscribers–and Content Providers–expect, regardless of traffic spikes or multi-hop network locations. More than just a cache, Netskrt appliances are fully-serviced and intelligently managed, enabling content pre-positioning. In other words, Netskrt technology allows Content Providers to intentionally cache popular content so that upstream backhaul happens one time only: when getting the content into the embedded cache.
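For the build-your-own route, an embedded cache can be as simple as a caching reverse proxy. A minimal NGINX sketch might look like the following (the paths, zone name, sizes, and origin URL are all illustrative assumptions, not a production configuration):

```nginx
# Minimal embedded-cache sketch: cache video responses locally and
# collapse concurrent misses into a single upstream fetch.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=video_cache:100m
                 max_size=500g inactive=24h;

server {
    listen 80;

    location / {
        proxy_pass https://origin.example.com;   # hypothetical upstream
        proxy_cache video_cache;
        proxy_cache_valid 200 206 10m;
        # Request collapsing: only one request per object goes upstream;
        # other clients wait for that fetch to populate the cache.
        proxy_cache_lock on;
        proxy_cache_use_stale updating error timeout;
    }
}
```

Managed solutions like Netskrt layer monitoring and pre-positioning on top of this basic caching behavior.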
ISPs that implement embedded caches reap many benefits, and Content Providers partnering with those ISPs can reap even more. With a last-mile CDN like Netskrt, ISPs can better serve their subscribers and their content provider partners, thanks to an easy, fully-managed service that’s constantly monitored for optimal performance.
By implementing locally embedded caches in hard-to-reach subscriber networks, ISPs and Content Providers are guaranteed to improve overall viewer satisfaction, from watching the big game to accessing the big gaming download.
Check out the Netskrt whitepaper to find out how a regional ISP in New York state deployed Netskrt's last-mile CDN to improve viewer QoE during live sports streaming events.