What is streaming?
The first websites were simple pages of text with maybe an image or two. Today, however, anyone with a fast enough Internet connection can stream high-definition movies or make a video call over the Internet. This is possible because of a technology called streaming.
Streaming is the continuous transmission of audio or video files from a server to a client. In simpler terms, streaming is what happens when consumers watch TV or listen to podcasts on Internet-connected devices. With streaming, the media file being played on the client device is stored remotely, and is transmitted a few seconds at a time over the Internet.
What is the difference between streaming and downloading?
Streaming is real-time, and it's more efficient than downloading media files. If a video file is downloaded, a copy of the entire file is saved onto a device's hard drive, and the video cannot play until the entire file finishes downloading. If it's streamed instead, the browser plays the video without actually copying and saving it. The video loads a little bit at a time instead of the entire file loading at once, and the information that the browser loads is not saved locally.
Think of the difference between a lake and a stream: Both contain water, and a stream may contain just as much water as a lake; the difference is that with a stream, the water is not all in the same place at the same time. A downloaded video file is more like a lake, in that it takes up a lot of hard drive space (and it takes a long time to move a lake). Streaming video is more like a stream or a river, in that the video's data is continuously, rapidly flowing to the user's browser.
How does streaming work?
Just like other data that's sent over the Internet, audio and video data is broken down into data packets. Each packet contains a small piece of the file, and an audio or video player in the browser on the client device takes the flow of data packets and interprets them as video or audio.
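That packet flow can be pictured with a short sketch. This is purely illustrative: the chunk size and the `packetize`/`play` helpers are made-up names, and a real player decodes compressed video rather than concatenating bytes.

```python
# Illustrative sketch: a media file is split into small packets on the server,
# and the client consumes the flow of packets as they arrive.
CHUNK_SIZE = 1400  # bytes; hypothetical, roughly what fits in one network packet

def packetize(media_bytes: bytes):
    """Split a media file into a sequence of small packets."""
    for offset in range(0, len(media_bytes), CHUNK_SIZE):
        yield media_bytes[offset:offset + CHUNK_SIZE]

def play(packets):
    """Stand-in for the browser's player: consume packets as they arrive."""
    received = bytearray()
    for packet in packets:
        received.extend(packet)  # a real player would decode and render here
    return bytes(received)

media = bytes(range(256)) * 20  # pretend this is an encoded video file
assert play(packetize(media)) == media  # the reassembled stream matches
```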
Sending video over the Internet, as opposed to sending text and still images, requires a faster method of transporting data than TCP provides, since TCP prioritizes reliability over speed.
How does the User Datagram Protocol (UDP) improve streaming?
Much of the Internet uses TCP, or the Transmission Control Protocol. This transport protocol involves a careful back-and-forth acknowledgement in order to open a connection. Once the connection is open and the two communicating devices are sending packets back and forth, TCP ensures that the transmission is reliable and that all packets arrive in order.
UDP is also a transport protocol, meaning it is used for moving packets of data across networks. UDP is used with the Internet Protocol (IP), and together they are called UDP/IP. Unlike TCP, UDP does not send messages back and forth to open a connection before transmitting data, and it does not ensure that all data packets arrive and are in order. As a result, transmitting data does not take as long as it does via TCP, and though some packets may be lost along the way, there are so many data packets involved in keeping a stream going that the user shouldn't notice the lost ones.
For streaming, speed is far more important than reliability. For instance, if someone is watching an episode of a TV show online, not every pixel has to be present for every frame of the episode. The user would prefer to have the episode play at normal speed than to sit and wait for every bit of data to be delivered. Therefore, a few lost data packets are not a huge concern, and this is why streaming uses UDP.
If TCP is like a package delivery service that requires the recipient to sign for the package, then UDP is like a delivery service that leaves packages on the front porch without knocking on the door to get a signature. The TCP delivery service loses fewer packages, but the UDP delivery service is faster, because packages can get dropped off even if no one's home to sign for them.
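UDP's "no signature required" behavior is easy to see with Python's standard `socket` module. This loopback sketch (real streaming adds codecs and streaming protocols on top of raw sockets) sends a datagram with no connection setup at all:

```python
import socket

# Minimal sketch of UDP's connectionless delivery: no handshake before data.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))          # OS picks a free port
receiver.settimeout(2)                   # don't block forever if a packet is lost
addr = receiver.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"video packet 1", addr)   # fire-and-forget: no connect() needed

data, _ = receiver.recvfrom(2048)
print(data)                              # b'video packet 1'
sender.close()
receiver.close()
```

Note that nothing guarantees delivery here: on a real network the datagram could simply vanish, and the sender would never know, which is exactly the trade-off streaming accepts.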
What is buffering?
Streaming media players load a few seconds of the stream ahead of time so that the video or audio can continue playing if the connection is briefly interrupted. This is known as buffering. Buffering ensures that videos can play smoothly and continuously. However, over slow connections, or if a network has a great deal of latency, a video can take a long time to buffer.
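Buffering can be modeled with a toy simulation. The buffer size and per-second download amounts below are made-up numbers, and a real player measures video in frames and bytes rather than whole seconds:

```python
from collections import deque

# Toy model of buffering: the player keeps a few seconds of video queued
# ahead of the playhead so a brief network stall doesn't pause playback.
BUFFER_TARGET = 5  # seconds of video to hold ahead of playback (hypothetical)

def simulate(arrivals):
    """arrivals[i] = seconds of video downloaded during second i of playback."""
    buffer = deque()
    stalls = 0
    for downloaded in arrivals:
        for _ in range(downloaded):
            if len(buffer) < BUFFER_TARGET:
                buffer.append(1)   # one second of video enters the buffer
        if buffer:
            buffer.popleft()       # play one second from the buffer
        else:
            stalls += 1            # nothing buffered: playback freezes
    return stalls

# A 2-second outage after the buffer has filled causes no stall...
assert simulate([5, 1, 0, 0, 1, 1]) == 0
# ...but the same outage before the buffer gets ahead freezes playback.
assert simulate([1, 0, 0, 1]) > 0
```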
What factors slow down streaming?
On the network side:
- Network latency: A variety of factors impact latency, including where the content that users are trying to access is stored.
- Network congestion: If too much data is sent through the network, this can degrade streaming performance.
On the user side:
- WiFi problems: Restarting the LAN router, or switching to Ethernet instead of WiFi, can help improve streaming performance.
- Slow client devices: Playing videos takes a fair amount of processing power. If the device streaming the video is running many other processes, or is simply slow in general, streaming performance can suffer.
- Not enough bandwidth: For streaming video, home networks need about 4 Mbps of bandwidth; for high-definition video, they will likely need more.
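The bandwidth figure above lends itself to quick arithmetic. This sketch checks whether a connection can sustain a stream's bitrate and how much data an hour of video transfers; the 20% headroom factor is an assumption, since other devices and protocol overhead share the link:

```python
# Back-of-the-envelope bandwidth math with illustrative numbers.
def can_stream(connection_mbps: float, video_bitrate_mbps: float) -> bool:
    # Assume only ~80% of the link is usable for the stream itself.
    return connection_mbps * 0.8 >= video_bitrate_mbps

def gigabytes_per_hour(bitrate_mbps: float) -> float:
    bits = bitrate_mbps * 1_000_000 * 3600   # bits transferred in one hour
    return bits / 8 / 1_000_000_000          # convert bits to gigabytes

print(can_stream(10, 4))                 # True: 10 Mbps handles a 4 Mbps stream
print(round(gigabytes_per_hour(4), 1))   # 1.8 GB transferred per hour at 4 Mbps
```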
How can streaming be made faster?
Streaming is subject to the same kinds of delays and performance degradations as other kinds of web content. Because the streamed content is stored elsewhere, hosting location makes a big difference, as is the case with any type of content accessed over the Internet. If a user in New York is trying to stream from a Netflix server in Los Gatos, the video content has to cross 3,000 miles to reach the user, and the video may spend a long time buffering or fail to play altogether. For this reason, Netflix and other streaming providers make extensive use of distributed content delivery networks (CDNs), which store content in locations around the world that are much closer to users.
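The cost of that distance can be estimated from how fast signals travel in fiber, roughly two-thirds the speed of light, or about 200,000 km per second (an approximate figure; real latency also includes routing and processing delays):

```python
# Rough propagation-delay estimate using an assumed fiber signal speed.
FIBER_KM_PER_MS = 200.0   # ~200,000 km/s expressed as km per millisecond
KM_PER_MILE = 1.609

def round_trip_ms(miles: float) -> float:
    """Best-case round-trip time for a signal over the given distance."""
    return 2 * miles * KM_PER_MILE / FIBER_KM_PER_MS

print(round(round_trip_ms(3000), 1))   # cross-country: ~48.3 ms per round trip
print(round(round_trip_ms(50), 1))     # nearby CDN node: ~0.8 ms
```

Since fetching a stream involves many round trips, shaving tens of milliseconds off each one by serving from a nearby CDN node adds up quickly.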
CDNs have a huge positive impact on streaming performance. Cloudflare Stream Delivery leverages the Cloudflare CDN to store video content across all Cloudflare data centers around the world; the result is reduced latency for short video startup times and reduced buffering.