Netflix video delivery architecture, as described here, is made up of three parts:
the user client (app/browser/smartTV/etc)
the backend, on AWS since 2008 after Netflix's attempt at running its own data centers backfired with a corrupted database that halted DVD shipments. They use three AWS regions: one in North Virginia, one in Portland, Oregon, and one in Dublin, Ireland
the [[CDNs]], using Netflix's own system, Open Connect, built into ISPs' infrastructure. Open Connect Appliances (OCAs) are installed free for qualifying ISPs and are also placed in Internet Exchanges (IXPs), which sit further up the network hierarchy. Some OCAs hold the full Netflix library ('large peering locations'); others hold just a small selection ('small peering locations') whose cache is updated daily to favour the films you are more likely to watch.
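A minimal sketch of how that daily fill decision for a small peering location might work, assuming a simple greedy fill by forecast popularity (all names, sizes, and view counts here are hypothetical, not Netflix's actual algorithm):

```python
from dataclasses import dataclass

@dataclass
class Title:
    name: str
    size_gb: float
    predicted_views: int  # popularity forecast for tomorrow

def plan_fill(catalog: list[Title], capacity_gb: float) -> list[Title]:
    """Greedily keep the titles most likely to be watched tomorrow
    until the appliance's disk capacity is exhausted."""
    plan, used = [], 0.0
    for title in sorted(catalog, key=lambda t: t.predicted_views, reverse=True):
        if used + title.size_gb <= capacity_gb:
            plan.append(title)
            used += title.size_gb
    return plan

if __name__ == "__main__":
    catalog = [
        Title("Stranger Things S01", 120.0, 9_000_000),
        Title("Back-catalog documentary", 40.0, 12_000),
        Title("New release film", 80.0, 4_500_000),
    ]
    for t in plan_fill(catalog, capacity_gb=200.0):
        print(f"cache {t.name} ({t.size_gb} GB)")
```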
"A reason Netflix gave for choosing AWS was it didn’t want to do any undifferentiated heavy lifting. Undifferentiated heavy lifting are those things that have to be done, but don’t provide any advantage to the core business of providing a quality video watching experience. AWS does all the undifferentiated heavy lifting for Netflix."
Scalable computing is with EC2. Scalable storage is with S3 ("In 2013, the video catalog for Netflix was over 3 petabytes").
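A quick illustration of the S3 side, using boto3 with hypothetical bucket and key names (not Netflix's real layout):

```python
# Store a source video master in S3. boto3's upload_file handles
# multipart uploads automatically for large files.
import boto3

s3 = boto3.client("s3")
s3.upload_file(
    Filename="stranger_things_s02e01_master.mov",  # multi-terabyte in reality
    Bucket="example-video-masters",                # hypothetical bucket
    Key="sources/stranger-things/s02e01/master.mov",
)
```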
Netflix uses both DynamoDB and Cassandra for distributed databases
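As one hedged illustration of the Cassandra side, a viewing-history write using the DataStax cassandra-driver; the keyspace, table, and columns are invented for the example:

```python
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])          # contact point for the cluster
session = cluster.connect("viewing")      # hypothetical keyspace

# Record that a member paused a title at a given position.
session.execute(
    """
    INSERT INTO history (member_id, title_id, watched_at, position_sec)
    VALUES (%s, %s, toTimestamp(now()), %s)
    """,
    (42, "stranger-things-s02e01", 1370),
)
```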
Netflix encodes all its video in AWS on as many as 300,000 CPUs at one time
The video arrives from the production company in a high definition format 'that’s many terabytes in size'. Netflix then:
validates the file, checking for artifacts, color changes, or missing frames that may have been caused by previous transcoding attempts or data transmission problems.
puts it into the media pipeline. "More than 70 different pieces of software have a hand in creating every video" - starting with breaking the file into chunks so it can be processed faster in parallel (see the sketch after this list). Then the chunks are validated and reassembled.
The reassembled file is validated again. By the end, the video has been transcoded into multiple versions, collectively called its 'encoding profile':
for devices. Netflix supports 2,200 different devices, and the transcoding is optimised for each one.
for different network speeds.
for different audio quality levels
for different languages
with different subtitle files. Because Stranger Things was shot in 8K, "it took 190,000 CPU hours to encode just one season", creating 9,570 different video, audio, and text files.
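A minimal sketch of the chunk-and-parallelize idea behind the pipeline, with a stubbed-out encoder standing in for the 70+ real tools; the chunk count and bitrate ladder are made up for illustration:

```python
from concurrent.futures import ProcessPoolExecutor
from itertools import product

CHUNKS = [f"chunk_{i:04d}.mov" for i in range(8)]     # pieces of the source file
LADDER = ["240p_235k", "480p_1050k", "1080p_4300k"]   # hypothetical profile rungs

def encode(job):
    """Stand-in for a real transcoder invocation (e.g. ffmpeg)."""
    chunk, rung = job
    return f"{chunk}.{rung}.mp4"

def validate(path):
    """Stand-in for artifact / missing-frame checks on each encoded chunk."""
    return True

if __name__ == "__main__":
    # 8 chunks x 3 rungs = 24 independent jobs that can run in parallel,
    # the same idea Netflix scales to hundreds of thousands of CPUs.
    jobs = list(product(CHUNKS, LADDER))
    with ProcessPoolExecutor() as pool:
        outputs = list(pool.map(encode, jobs))
    assert all(validate(p) for p in outputs)
    # Reassemble each rung's chunks, in order, into one output stream.
    streams = {rung: [out for out in outputs if out.endswith(f"{rung}.mp4")]
               for rung in LADDER}
    print({rung: len(parts) for rung, parts in streams.items()})
```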
"In 2011, Netflix realized at its scale it needed a dedicated CDN solution to maximize network efficiency. Video distribution is a core competency for Netflix and could be a huge competitive advantage." "Since Netflix forecasts what will be popular tomorrow, there’s always a one day lead time before a video is required to be on an OCA. This means videos can be copied during quiet, off-peak hours, substantially reducing bandwidth usage for ISPs."
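A sketch of that off-peak fill rule, assuming a fixed quiet-hours window and a hypothetical copy_to_oca transfer function:

```python
from datetime import datetime, time

OFF_PEAK_START, OFF_PEAK_END = time(2, 0), time(6, 0)  # assumed local quiet hours

def in_off_peak(now: datetime) -> bool:
    return OFF_PEAK_START <= now.time() <= OFF_PEAK_END

def nightly_fill(forecasted_titles, copy_to_oca, now=None):
    """Push tomorrow's forecasted titles to an OCA, but only during
    off-peak hours, so the ISP's daytime bandwidth is untouched."""
    now = now or datetime.now()
    if not in_off_peak(now):
        return 0
    for title in forecasted_titles:
        copy_to_oca(title)
    return len(forecasted_titles)
```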