Video caching, like any other type of data caching, temporarily stores frequently accessed videos or video segments close to where viewers are located on the network. Bandwidth optimization occurs because video no longer has to travel the entire length of the corporate network.
Caches are network services—often software running on virtual machines, normally located in a data center—and are persistently available for the devices (personal computers, smartphones, and tablets) that need to use them. Because caching is demand-driven, like peer-to-peer networking, it works for both live and on-demand video. However, unlike peer-to-peer, the constant availability of the cache makes it a better solution for the opportunistic caching of video on demand (VOD).
In computing, a cache is a hardware or software component that stores data so that future requests for that data can be served faster; the data stored in a cache might be the result of an earlier computation or a copy of data stored elsewhere.
When to Use a Video Caching eCDN?
As you explore which eCDN is right for your enterprise, start by learning how each technology works. Then evaluate it based on your use cases and the configuration of your network.
Here are a few reasons why a video caching eCDN might be right for your enterprise:
- You stream both live broadcasts and video on demand (VOD)
- Some or all of your network is not multicast-enabled
- You don’t want to manage and deploy client software
- Some users view your business videos using mobile devices like smartphones or tablets
- Security is paramount
- Remote workers access your enterprise videos using VPN
Video caching is one of the most versatile eCDN solutions on the market today. But once you evaluate the total landscape of your streaming environment, you might need a mix of eCDN solutions.
Software or hardware is placed in strategic locations around the network and close to concentrations of viewers. When the first viewer requests a video, the cache retrieves it from the origin video source, such as Microsoft Teams, Webex or Zoom, and stores a local copy.
When other viewers in the same location request the same video, they receive it directly from the local cache, not the source. As a result, the video has less distance to travel, and fewer duplicate copies are transmitted across the corporate internet connections and WAN links.
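The fetch-on-first-request behavior described above can be sketched in a few lines of Python. This is an illustrative model, not any vendor's implementation; the `fetch_from_origin` callback and segment URLs are hypothetical stand-ins for the streaming source.

```python
class EdgeCache:
    """Serve video segments locally; contact the origin only on a cache miss."""

    def __init__(self, fetch_from_origin):
        self._fetch = fetch_from_origin   # called once per unique segment
        self._store = {}                  # segment URL -> segment bytes

    def get(self, segment_url):
        if segment_url not in self._store:            # first viewer: cache miss
            self._store[segment_url] = self._fetch(segment_url)
        return self._store[segment_url]               # later viewers: local hit


# Usage: two viewers request the same segment; the origin is contacted once.
origin_calls = []

def fetch_from_origin(url):                # hypothetical origin fetch
    origin_calls.append(url)
    return b"segment-bytes"

cache = EdgeCache(fetch_from_origin)
cache.get("/live/seg_001.ts")
cache.get("/live/seg_001.ts")
```

Every viewer after the first receives the local copy, which is the entire bandwidth saving: one trip across the WAN instead of one per viewer.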
Can I Just Use a Regular Caching Solution?
In theory, you could. But a WAN accelerator is not the best way to distribute video, because WAN accelerators were not built to cache every file type. They were designed to cache general web content, such as JPEG images and text files, which are small in comparison to a video file.
Live Events and VOD
Because a video cache is designed specifically for video, it maximizes efficiency while minimizing latency. Video caches store data in larger chunks than all-purpose caches so they can reassemble the files faster and minimize processing time.
In addition, a robust video cache will store data differently depending on whether the video is a live event or pre-recorded video. During live events, a large audience is not only accessing a video at the same time but also watching the same segments of the video at essentially the same time. Time is of the essence: latency must stay low to keep everyone on the same page (so to speak!). Therefore, a cache that uses memory (RAM) for storage is ideal because it can retrieve video segments, store them, and serve them in a very short amount of time.
With video on demand, viewership is more staggered. Caches need to store multiple segments, if not entire videos, to ensure data is available for subsequent requests. A large cache is the key to getting the best viewing experience with the most bandwidth optimization, so caching on disk works well.
Furthermore, many solutions allow you to store recorded video on the cache when you anticipate high demand for a video. You can pre-position videos minutes, hours or even days before a video is released or announced to ensure 100% of requests are served from the cache.
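The disk-backed VOD pattern above, including pre-positioning, can be illustrated with a small sketch. Everything here is hypothetical (the class, the URL, the `fetch` callback); it simply shows a cache that can be filled ahead of demand or on the first request.

```python
import hashlib
import pathlib
import tempfile

class VodDiskCache:
    """Store VOD files on disk; optionally pre-position them before demand."""

    def __init__(self, cache_dir):
        self.dir = pathlib.Path(cache_dir)
        self.dir.mkdir(parents=True, exist_ok=True)

    def _path(self, url):
        # Hash the URL to get a safe, stable on-disk filename.
        return self.dir / hashlib.sha256(url.encode()).hexdigest()

    def prefetch(self, url, fetch):
        """Pre-position a video (e.g. overnight) so later requests hit disk."""
        path = self._path(url)
        if not path.exists():
            path.write_bytes(fetch(url))

    def get(self, url, fetch):
        path = self._path(url)
        if not path.exists():
            path.write_bytes(fetch(url))   # on-demand fill on a cold request
        return path.read_bytes()


# Usage: pre-position during off-peak hours; the later request never
# touches the origin.
fetch_calls = []

def fetch(url):                            # hypothetical origin fetch
    fetch_calls.append(url)
    return b"video-bytes"

cache = VodDiskCache(tempfile.mkdtemp())
cache.prefetch("https://example.com/townhall.mp4", fetch)
data = cache.get("https://example.com/townhall.mp4", fetch)
```

Shifting the single origin fetch to a low-activity window is what flattens the bandwidth spike when a popular recording is released.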
Caching is a simple set-it-and-forget-it type of eCDN. You size the caches. Deploy them where you need them. Tell user devices how to reach them. And it just works—no client needed.
Most importantly, caching works well for both live events and VOD. The network impact is relatively easy to work out, and while not as deterministic as multicast, it is considerably better than peer-to-peer.
Load-balancing, clustering, and certificates are the most complex elements of caching. But these configurations are much the same as any load-balanced or clustered service. The biggest drawback is the need to place the caches on the network close to users, which depends entirely on the availability of infrastructure that can be used for this purpose.
Vbrick Ramp edge caching is an advanced video caching eCDN that replaces expensive infrastructure upgrades with a lower cost, flexible solution. This lightweight, software-only solution runs on your existing infrastructure and supports every modern streaming platform.
Deploy and manage OmniCache 100% behind your firewall and use its built-in security to prevent unauthorized access to your videos. You can also pre-position videos during times of low network activity to avoid spikes in bandwidth during large-scale VOD events such as pre-recorded executive messages or required trainings.
Because Vbrick Ramp edge caching supports both live and on-demand video, it can serve as your only eCDN. Or you can mix and match it with our other eCDN solutions, multicast and peer-to-peer.
- One solution for live and VOD
- No client software or plugins
- Any device, including mobile
- Supports any streaming video source; simultaneous support for multiple sources
- 100% behind the firewall
- Encryption to protect videos in transit and at rest
- Simple, one-time setup
- Scales easily as demand grows
- Centralized management, monitoring & insightful analytics
- VOD pre-positioning
- Self-discoverable and self-healing