Thursday, 5 November 2015

Let's do the math: Put Comcast's "Data Plan" into perspective

Definitions

Before I begin, let me explain what certain terms will mean. I will be using Cisco definitions:

Bandwidth: The amount of data that can be sent through a given communications circuit

Throughput: The maximum rate at which none of the offered frames are dropped by the device

By these definitions, I will use bandwidth to describe the quantity (e.g. Gigabytes) and throughput to describe the transmission speed (e.g. Megabits per second). Connections are measured in bits per second, whereas data storage is measured (usually) in bytes.
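
To make the bits-versus-bytes distinction concrete, here's a minimal Python sketch (the function name is my own) that converts a line speed into storage-style units:

```python
# Connections are measured in bits per second; storage in bytes.
BITS_PER_BYTE = 8

def mbps_to_mb_per_second(mbps):
    """Convert a connection speed in Megabits/s to MegaBytes/s."""
    return mbps / BITS_PER_BYTE

print(mbps_to_mb_per_second(10))  # a 10Mbps line moves at most 1.25 MB/s
```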

Make /r/theydidthemath Proud

Sorry mobile readers

Let's assume you've got a contract for a connection speed of up to 10Mbps and you have a 30-day billing period.

10 Megabits = 10 × 1,000 Kilobits = 10,000 × 1,000 bits = 10,000,000 bits (10.000.000 in EU?)
30 days = 30 × 24 hours = 720 × 60 minutes = 43,200 × 60 seconds = 2,592,000 seconds

So,

10,000,000 bits per second multiplied by 2,592,000 seconds = 25,920,000,000,000 bits of data

Breaking that into human-readable units,

25,920,000,000,000 bits ÷ 8 (bits per Byte) = 3,240,000,000,000 Bytes
3,240,000,000,000 Bytes ÷ 1024 (Bytes per KiloByte) = 3,164,062,500 KiloBytes
3,164,062,500 KiloBytes ÷ 1024 (KiloBytes per MegaByte) = 3,089,904.78515625 MegaBytes
3,089,904.78515625 MegaBytes ÷ 1024 (MegaBytes per GigaByte) = 3,017.485141754150390625 GigaBytes ≈ 3 TeraBytes
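
If you'd rather let a script grind through that arithmetic, this short Python sketch (variable names are mine) reproduces the whole chain:

```python
BITS_PER_BYTE = 8

speed_bps = 10 * 1000 * 1000      # 10Mbps in bits per second
seconds = 30 * 24 * 60 * 60       # one 30-day billing period

total_bits = speed_bps * seconds  # 25,920,000,000,000 bits
total_bytes = total_bits / BITS_PER_BYTE

# Walk up the binary units: Bytes -> KiloBytes -> MegaBytes -> GigaBytes
total_gb = total_bytes / 1024 / 1024 / 1024
print(f"{total_bits:,} bits ≈ {total_gb:.2f} GB")  # ~3,017.49 GB, i.e. ~3 TB
```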

Now, Comcast's "Data Plans" grant each person up to 300GB of transferred data before charging $10 per add'l 50GB. If you were lucky enough to actually hit full speed and were constantly downloading data, you would use only 10% of the total possible bandwidth before being charged overage. Your overage would be 2,700GB, or (2,700 ÷ 50 = 54, 54 × 10) $540.

However, this does not include upload speeds. Due to the nature of how computers communicate on a network, there must be data going out to request/acknowledge the data coming in. With a 1Mbps upload speed, the total upload usage works out to 10% of the download usage, or ~300GB. That would be another $60.
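
To put both overage figures on one footing, here's a small helper (my own sketch; I'm assuming partial 50GB blocks are billed as whole blocks):

```python
import math

def overage_charge(usage_gb, cap_gb=300, block_gb=50, price_per_block=10):
    """Dollar charge for usage beyond the cap, billed per 50GB block."""
    if usage_gb <= cap_gb:
        return 0
    blocks = math.ceil((usage_gb - cap_gb) / block_gb)
    return blocks * price_per_block

print(overage_charge(3000))  # saturated downloads: 2,700GB over -> $540
print(overage_charge(600))   # ~300GB of upload by itself -> $60
```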

Let's (try to) be realistic

There's no way in Hell that the average user will use their network at 100%, 24/7 for a month. There's downtime, sleep time, and away time. You'll have some technical difficulties, time to sleep, and time out of the house.

Let's assume a family of 4: 8-hours-a-day-5-days-a-week father, the I'm-always-home-never-able-to-have-fun mother, the OMG-XBOX-LIVE-LOL teenager, and the Textaholic-social-media-addict-14-year-old daughter.

  • Dad is on the internet for 2 hours each workday night, 4 hours per day over the weekend
    • He catches up on football (ESPN Streaming), reads the news (Yahoo! auto-play videos included), and checks his portfolio (Google Stocks)
  • Mom is on the internet 4 hours a day during the week, 6 hours a day over the weekend
    • Netflix for 2 hours
    • YouTube for 1 hour
    • Blog browsing for 1 hour (includes Spotify streaming)
    • Add 2 hours of YouTube, blogs, and Spotify on weekends
  • Son is on the internet 5 hours each day, 10 over the weekend
    • It's a total FusterCluck of Steam, Origin, YouTube, Twitch, Skype, and Pr0n
  • Daughter spends the same amount of time as her brother
    • A conglomeration of Tweets, Snapchat, Apple iMsg, Vine, iPhone apps, Spotify, and Netflix
  • Total devices in the home:
    • 2 Windows Desktops
    • 3 Apple computers
    • 4 iPhones
    • 1 Xbox
    • 1 Wii
  • The devices will download updates during the evening as necessary; the Xbox and Windows Desktops will download games overnight when prompted

Assume that when everyone is home, the full 10Mbps is being used; updates and games being downloaded will use the full speed as well.

  • 2 of Mom's weekday hours overlap; 2 of Mom's weekday hours are alone
  • All of Dad's weekday hours overlap
  • All 5 of the son and daughter's weekday hours overlap
  • Everyone's weekend hours overlap

At 10Mbps, one saturated hour moves 4.5GB (10,000,000 bits × 3,600 seconds ÷ 8 ÷ 1,000,000,000). Counting each overlapping hour only once:

  • Mom's 2 hours alone = 9GB per weekday
  • The overlapping 5 hours per weekday = 22.5GB
  • Each weekday = 31.5GB, so 5 workdays = 157.5GB
  • The overlapping 10 hours each weekend day (Saturday and Sunday) = 90GB per weekend
  • Total per week = 247.5GB
  • Total per month = 990GB + 2GB updates + 8GB new games = 1,000GB
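
Tallying the household in Python (a sketch with my own variable names, counting each overlapping hour once since the line is saturated):

```python
# 10Mbps flat out for an hour: 10,000,000 bits/s * 3600 s / 8 / 1e9 bytes
GB_PER_HOUR = 10 * 1000 * 1000 * 3600 / 8 / 1e9   # 4.5 GB

weekday_hours = 2 + 5        # Mom's alone time + the shared overlap
weekend_day_hours = 10       # everyone's weekend hours overlap

weekly_gb = (weekday_hours * 5 + weekend_day_hours * 2) * GB_PER_HOUR
monthly_gb = weekly_gb * 4 + 2 + 8   # four weeks, plus updates and new games

print(weekly_gb, monthly_gb)  # 247.5 GB/week, 1000.0 GB/month
```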

Estimated overage = 700GB, or (700 ÷ 50 = 14, 14 × 10) $140

Oversimplification

A lot of these numbers have been oversimplified for many reasons:

  • Streaming data is sent in chunks, rather than a continuous, literal flow
  • Not every streaming service, website connection, and download utilizes all available network throughput
    • Some services, e.g. Apple's (Bonjour, AirPlay, Time Machine), will hog as much throughput as possible until they finish
  • Some services will cache (store locally) data that the device requests often enough; some will share it with local peers or over the internet if set up incorrectly (Windows 10 Updates, advanced settings)

System updates, new applications, email, smartphone apps, background processes, new movies, new music, etc. accumulate quickly depending on their use, settings, and design. Thanks to Windows 10 updates being automatic by default (incl. peering over the local network), users selecting "download automatically" for music, video, and apps on their iPhones, and kids leaving their Facebook page open in a tab, your typical user would be consuming data in bursts around the clock. The scenario I developed may seem over-the-top stupid, but the numbers at the end would be in the ballpark.

Check my work

There are online calculators that convert data transfer rates into total data transferred over time. At 10Mbps, the theoretical maximum amount of transferred data works out to 75MB per minute. Due to latency, transmission errors, timeouts, etc., we don't actually see this much data making it through on a standard home connection, and the transmission protocols themselves add overhead that consumes a little bandwidth as well.
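
That 75MB-per-minute figure is a one-line check in Python:

```python
# Data through a 10Mbps link in one minute, converted from bits to MegaBytes
print(10 * 1000 * 1000 * 60 / 8 / 1000 / 1000)  # 75.0 MB, matching the calculator
```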

I'm off to bed. G'night!



Submitted by KamikazeRusher
