Friday, June 20, 2008

Network Traffic

A couple days ago, Tobold wrote a post on tiered pricing for Internet access, prompted by the rise in BitTorrent traffic. Essentially, he took the position that heavy users of the Internet, who transfer more data, should pay more than light users. The resulting firestorm caused him to delete the post and all the attached comments. It's sort of a pity he had to do that, as it is an interesting topic. It's not directly related to WoW, but since we play WoW over the Internet, how the Internet is structured is important to us. (Plus, I haven't written anything in a while, so here's some content.)

For the purposes of this discussion, let us assume that all file-transfer or BitTorrent traffic is legal and does not infringe copyright. This isn't true, but I feel that moral arguments about copyright infringement are a distraction, and obscure the real issues at the heart of this problem.

A lot of people say that if a company says you have X amount of bandwidth, say 512 kb/s up/down, the company should allocate that amount for each customer, and not punish you for using the full amount. The problem with this view is that it is supremely wasteful. It's like you and your neighbour each having a private road from home to work. The vast majority of the time, both roads will be empty and just taking up space. A much better solution is a common road that everyone shares.
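
To put some rough numbers on the shared-road idea, here's a toy sketch of why ISPs oversubscribe their links. The uplink size, plan speed, and activity rate are all figures I've invented for illustration:

```python
# Toy oversubscription math. All numbers are invented for illustration.
uplink_kbps = 1_000_000   # the ISP's shared uplink: 1 Gb/s
plan_kbps = 10_000        # advertised speed per customer: 10 Mb/s
customers = 1_000

ratio = (customers * plan_kbps) / uplink_kbps
print(f"Oversubscription ratio: {ratio:.0f}:1")  # 10:1

# Bursty web traffic means only a small fraction of customers are
# active at any instant, so the shared link still covers everyone:
active_fraction = 0.05
peak_kbps = customers * active_fraction * plan_kbps
print(f"Peak demand {peak_kbps / 1e6:.1f} Gb/s vs capacity "
      f"{uplink_kbps / 1e6:.1f} Gb/s")  # 0.5 Gb/s vs 1.0 Gb/s
```

That 10:1 ratio only works while usage stays bursty, which, as we'll see, is exactly the assumption BitTorrent breaks.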

Network traffic is like cars on that road. The important part here is that an individual doesn't really care who else is on the road, so long as he can get from point A to point B in a fast and efficient manner. As more and more cars appear, the road becomes more congested, and it becomes harder to use the road to full effect. In the past, whenever this happened, ISPs would add more bandwidth to the system, essentially adding an extra lane to the road, spacing out the cars once again. (Unlike real roads, extra bandwidth is often the cheapest solution.)

So why don't ISPs just continue adding bandwidth? The answer lies in the nature of BitTorrent, which is the major protocol used to transfer files these days.

BitTorrent

BitTorrent is a very aggressive protocol. It basically uses all available bandwidth, saturating your connection. This is one of the reasons that downloading with BitTorrent is so fast. To go back to the road analogy, it's like the road was suddenly packed full of trucks, taking all the available space. If you're in your car, trying to get on the road to go to work, this is very frustrating. Adding bandwidth doesn't help in this case, because the BitTorrent trucks will immediately fill up the new lane.

Anyone who's tried to play WoW at the same time that several torrents are running understands this. WoW takes very little bandwidth. It's playable on a 56K modem. But add several torrents downloading in the background, and your WoW connection craters. God help you if you want to run Ventrilo as well.
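
A crude way to see why this happens: a bottleneck link tends to get divided roughly per TCP connection, and a torrent client opens a great many connections while WoW uses one. The flow counts below are hypothetical, and real TCP dynamics are messier, but the arithmetic captures the effect:

```python
# Per-connection fair sharing, crudely modeled. Flow counts are
# hypothetical; real TCP behaviour is more complicated.
link_kbps = 512       # the advertised connection
torrent_flows = 200   # connections opened by a busy BitTorrent client
game_flows = 1        # WoW uses a single connection

per_flow_kbps = link_kbps / (torrent_flows + game_flows)
print(f"WoW's share: {per_flow_kbps:.1f} kb/s")  # ~2.5 kb/s

# Even a game playable on a 56K modem starves at ~2.5 kb/s, which is
# why the connection craters the moment the torrents spin up.
```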

There are essentially two solutions to this problem: a technical solution, and an economic solution. Like all solutions, neither one is perfect.

Technical Solution - Quality of Service

The technical solution is something called Quality of Service. Basically, each type of traffic has a priority, and higher priority traffic gets transmitted first, while lower priority traffic gets delayed until the network is free.

Using the road analogy, it's like the road is full of trucks, but as soon as you pull up to the entrance, a space automatically opens up for your car, and you get shifted into the fast lane immediately. It doesn't really matter that the rest of the road is filled with trucks, you get to your destination quickly.

My personal priority system would look something like this, from highest to lowest priority:

1. Game traffic (low size, needs high responsiveness)
2. Streaming audio/video (moderate size, needs high responsiveness)
3. General web (low size, moderate responsiveness)
4. Email (low size, low responsiveness)
5. File transfers (high size, low responsiveness)
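
To make the mechanism concrete, here is a toy strict-priority queue in Python built around that list. This is a sketch of the idea, not any real ISP's implementation: the highest-priority packet always transmits first, and file transfers move only when nothing more urgent is waiting.

```python
import heapq

# Minimal sketch of strict-priority queueing: packets drain in
# priority order, so a game packet never waits behind a file transfer.
# (Priorities mirror the list above; this is a toy model.)
PRIORITY = {"game": 1, "streaming": 2, "web": 3, "email": 4, "file": 5}

class QosQueue:
    def __init__(self):
        self._heap = []
        self._seq = 0  # tiebreaker keeps FIFO order within a class

    def enqueue(self, traffic_class, packet):
        heapq.heappush(self._heap, (PRIORITY[traffic_class], self._seq, packet))
        self._seq += 1

    def dequeue(self):
        return heapq.heappop(self._heap)[2] if self._heap else None

q = QosQueue()
q.enqueue("file", "torrent chunk 1")
q.enqueue("game", "WoW update")
q.enqueue("web", "HTTP GET")
print(q.dequeue())  # "WoW update" jumps the queue
print(q.dequeue())  # "HTTP GET"
print(q.dequeue())  # "torrent chunk 1" waits for free capacity
```

Real QoS schedulers usually add rate guarantees so the lowest class can't starve completely, which is why file transfers would still move, especially at off-peak hours.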

People who want to transfer files via BitTorrent can still do so, but without interfering with other people's web experience. Quite honestly, there will be a large amount of bandwidth still usable for file transfers, especially at off-peak hours.

There are several issues with Quality of Service. It is a bit expensive to implement across the entire Internet. There needs to be common agreement on the priority scheme. The network neutrality fanatics will be upset. Some bright MBA will probably think it's a good idea to prioritize by source or destination, and charge for increasing your priority.

As well, this method will decrease the average speed of file transfers. I think the increased responsiveness of all the other types of traffic more than makes up for it. However, someone else will disagree, and make a file-transfer client that pretends to be the highest priority. That will lead to an arms race between ISPs and file-transferrers, as the ISPs develop new methods (Deep Packet Inspection, etc.) to classify traffic, and file-transferrers try to fool those methods.
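
To see how easy that arms race is to start, consider a hypothetical classifier that assigns traffic classes by port number (real ISPs use more sophisticated methods, but the weakness is the same): a file-transfer client only needs to switch to port 80 to be mislabeled as web traffic, which is precisely what pushes ISPs toward Deep Packet Inspection.

```python
# A naive port-based traffic classifier. Purely illustrative; real
# classification is more involved, but the evasion is just as simple.
PORT_CLASSES = {80: "web", 443: "web", 25: "email", 6881: "file"}

def classify(dst_port: int) -> str:
    return PORT_CLASSES.get(dst_port, "unknown")

print(classify(6881))  # "file" -- BitTorrent's well-known default port
print(classify(80))    # "web"  -- the same torrent traffic, re-ported
```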

Economic Solution - Metered Pricing

The other solution is to charge people according to the bandwidth they use. This essentially causes people to decide which uses of the Internet are important to them, and implement their own priority scheme. I suspect that most people will cut down on file transfers, and spend their money on web surfing and email.
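
For concreteness, a metered bill might be computed something like this; the base fee, included allowance, and per-GB rate are numbers I've invented purely for illustration:

```python
# A hypothetical metered-pricing bill. All rates are invented.
BASE_FEE = 30.00        # dollars per month
INCLUDED_GB = 5         # transfer included in the base fee
OVERAGE_PER_GB = 2.00   # dollars per GB beyond the allowance

def monthly_bill(gb_used: float) -> float:
    overage = max(0.0, gb_used - INCLUDED_GB)
    return BASE_FEE + overage * OVERAGE_PER_GB

print(monthly_bill(3))   # light user (web + email): 30.0
print(monthly_bill(40))  # heavy torrent user: 100.0
```

Under a scheme like that, leaving torrents running all month shows up directly on the bill, which is the whole point.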

This is a good solution because it's fairly easy to implement, very hard to evade, and will almost certainly work. It also maps to what people think is "fair": people who use the service the most pay the most, and people who use it least pay the least.

The problem with the economic solution is that there are a lot of interesting ideas or applications that rely on people having access to extra bandwidth at negligible cost. For example, if metered pricing had been the norm, I don't think things like podcasts or YouTube would exist. Similarly, digital distribution of games or movies would have very little chance of taking off. Downloading patches for games and software becomes expensive.

There are also a lot of implications for open source. To a large extent, open source software relies on being able to easily transmit changes and updates across the Internet. Metered access puts a significant cost on using and creating open source software, which would be a shame.

As well, metered pricing can provide a disincentive for the ISPs to improve their service and increase the bandwidth available. To a certain extent, this depends on the competition available, but many ISPs in the United States seem to operate in a quasi-monopoly fashion.

Conclusion

Network congestion caused by BitTorrent and other distributed file-transfer systems is a real problem. Trying to ignore it, or getting into unrelated arguments about copyright infringement, will not work.

My personal preference would be for the ISPs to implement a decent Quality of Service system (with WoW and other games at the top, naturally). However, I lack faith that the ISPs will remain source/destination neutral and prioritize only on traffic type. I also lack faith in file-transferrers, and I am pretty sure that instead of accepting slightly reduced file-transfer performance for better overall performance, they will trigger an arms race by attempting to fool the Quality of Service systems.

The Quality of Service solution essentially requires a degree of cooperation between all parties, and I don't think that's likely. So we will probably end up with some form of metered pricing.

19 comments:

  1. Here in the UK there has been a debate on whether content providers should contribute to the cost of upgrading networks. This was prompted by the launch of the BBC's iPlayer service, which allows the downloading of TV shows. See http://news.bbc.co.uk/1/hi/technology/7336940.stm for the full story.

  2. Perhaps I'm missing something, but it seems like you're assuming that all file transfers are recreational, and people can just deal with waiting a little bit longer for their new music or DVD rip. How about people who use the internet for work and would see a significant decrease in productivity if their file-transfer speeds were reduced?

  3. One of the things that only gets touched on is AT&T. They are supposedly among the ones pushing for paying per use.

    But they are betting Billions (yep... Capital "B") on using their Internet capacity to the home to provide TV services, along with Internet and such.

    Do we really think they are going to throttle back Internet use of their TV content (which usage pricing would do)? When they are planning revenue streams around per-view advertising?

    I would think not.

  4. First ... please read up a bit on the technical aspects of BitTorrent and how it works before making a post like this. Torrents do not naturally "eat up bandwidth". The reason WoW and Vent run like crap when a torrent is downloading is because of the number of connections required to keep the download fast.

    BitTorrent works so well because it downloads from multiple sources at once. Even though one person's upload speed might be slow, 20 people's combined upload speeds can be very fast. But with real-world downloads we're talking about opening hundreds of connections to various clients. The problem comes in with these connections: most people don't have routers that can handle so many connections at once. The router slows down and as a result latency increases. WoW and Vent are both very much latency-sensitive applications (as are many internet-based applications). Your computer can also be overwhelmed by the number of connections, as can your high-speed modem (many of which are just routers in disguise).

    Notice nowhere in that did I say the word bandwidth. It is totally possible for BitTorrent to destroy your gaming experience and use very little bandwidth.

    Second, QoS sounds like a great idea on paper, until you come to understand that people are not honest and computers believe the data they are sent. The short version: QoS can be faked and would therefore be able to be abused.

    Third, you talk about bandwidth like a highway. Imagine a highway where you have to pay for how far you drive. You pay $50/mo to get on the highway, and then every mile you have to pay a $1 toll. Does it matter to someone that only drives a half mile a day? No, they still pay that $50/mo they've always paid. But that poor shmuck that has to drive 50 miles to work every day? Yeah, he's screwed.

    I could rant about specifics all day, but I won't. The real problem here isn't BitTorrent.

    The real problem is that the US (and elsewhere, I'm sure) is on an aging infrastructure. Yeah, some companies are laying new optical line, but most aren't. Adding lanes to the highway might be cheap, but ripping it up and starting over isn't. But the general weight of the internet is increasing; webpages themselves are much more feature-rich and content-laden. Video on the web has become a huge thing. On top of that, more and more people keep connecting every day. And last but not least, most companies have to share internet bandwidth with the cable TV they also provide, which is eating an increasing amount of bandwidth (HDTV is not light-weight).

    Companies can't do anything about any of the above. Content can't be controlled (only the speed that you can get to it), more clients is always good, and HDTV is here to stay. What they can do is take an easy-to-misunderstand technology, demonize it for its legal implications, blame it for all bandwidth issues, and use it as another reason to charge more.

    I understand that might sound really, really cynical. But as someone who works with the internet for a living and follows the computer industry closely, I can tell you it is very much the case.

  5. Just a quick addition. BitTorrent also uploads while it downloads (nature of the protocol). Uploads can also have a negative impact, as the number of connections can grow exponentially with every file made available, even when a person's upload speed is significantly smaller than their download, as is normally the case. Just think: downloading 3 files at 50 connections each and 3 uploads (same files we're downloading!) at 50 connections each is 300 connections, and I even low-balled the numbers.

    That is a lot of cross-talk.

  6. Jon, I simplified a lot, but my point holds. BitTorrent and other distributed clients do not consume a fixed amount of resources (bandwidth, connections, etc.). Their usage is elastic, meaning that you can't just add more bandwidth or connection-handling capability to solve the problem. And only a small percentage of the population actually uses clients like BitTorrent. Once more people start using it, and technologies like it, you're essentially going to see 100% usage of any network, all the time. And BitTorrent competing with the other network traffic decreases performance all around.

    All the other uses you mentioned are inelastic. Each channel of HDTV or whatever is a fixed size. You can compensate for that with extra bandwidth, which historically has always been the cheapest option.

  7. Wendy, the issue is that for most other internet traffic, responsiveness (latency) is much more important than it is for file transfers. As well, file transfers tend to be one huge block of data, while surfing the web tends to be short bursts of data separated by lots of empty space. If you can fit the file transfer into those empty spaces, it significantly improves performance for web surfing.

  8. BitTorrent might be on the rise, but 100% network utilization is absurd. As is saying that adding more capacity can't compensate for heightened usage.

    BitTorrent is an excellent protocol and very useful in offsetting bandwidth costs from the providing companies to the customers that use them. It is, however, horrible at providing steady speed, consistently fast downloads, and consistent availability, and it has a number of other problems including security, privacy and, probably most importantly, usability. While its use might continue to grow, it will most likely stay a secondary form of distribution for many types of media. That includes video, which is one of the largest issues with bandwidth right now (see that BBC news article; their bandwidth usage caused no end of issues).

    BitTorrent's communications also aren't handled any differently than others (except when throttled as per the current Comcast situation). Just like the rest of the internet it can be used as little or as much as people want to. Adding more processing and bandwidth will definitely help address the issue as it has always addressed increased demand in the past.

    I also didn't bring up HDTV because I thought it would be an issue in the future (the only time when elasticity can become an issue); that is not it at all.

    It is an issue now. There are no more lanes to add to his highway. Comcast (and Verizon and Time Warner) is already using extra video compression on many of its channels to make space for additional HDTV channels. It is rolling out a service that utilizes multiple "channels" (cable companies use a standard channel's bandwidth to measure the capacity of their networks) worth of bandwidth instead of just one to provide an even faster connection than they currently offer, further eating into the line usage. So it might not be a constantly growing issue, but it is one of the issues right now that causes them to bring BitTorrent to the forefront.

    That is my whole point:

    BitTorrent is a footnote on the bandwidth map of the internet. A blip. The real problem with BitTorrent is that it is on the forefront of the new internet; one that sees normal users actually using the bandwidth they are purchasing, as opposed to simply browsing those 56k-compatible websites faster. It is in murky legal waters because of illegal downloads, which makes it a good target to attack. The fact that it is a complicated technology to the average person makes it an even better target.

    The entire US infrastructure got caught with its pants down, and BitTorrent is the PR effigy that is being used to offset blame. Tiered and metered schemes are band-aids for that broken system, and they are band-aids that reward the companies for inaction while hurting the customers with high prices.

    This discussion should never, ever be about any specific consumer technology. It should be about general rising bandwidth use (not just internet) and its effects on the infrastructures in place. Anything beyond that discussion is politics plain and simple.

  9. "Adding more processing and bandwidth will definitely help address the issue as it has always addressed increased demand in the past."

    I disagree with this statement. I see a fundamental difference in the system before and after distributed protocols came into significant use.

    Prior to the rise of distributed protocols, increased demand was kept in check because you usually downloaded from a single site which had to pay for the bandwidth it used. This meant that the site limited the rate of data transfer, to keep its costs down, and that kept the network from being saturated.

    People have an unlimited appetite for data. Every facet of computing has shown this, time and time again. RAM, hard drive space, portable media, etc. You add more bandwidth and we will simply upload and download more stuff. When all parties to the distributed protocol have fixed costs for data transfer, there is zero incentive to limit the resources used.

    This causes problems for other people trying to use the network at the same time. It's a classic "Tragedy of the Commons".

  10. You're assuming both that the majority of data transfer will become distributed and that when I say "more bandwidth" I'm talking about bandwidth that the normal person will see. Neither of those is the case.

    The growth of distributed protocols is limited, as I've said already. Its very nature limits it. Take video on cellphones for example. Video provided by cellular providers could very easily be changed to a distributed system, especially with endpoints that are always on (let's suspend privacy and usability qualms for a second here). The problem is that no one wins in this situation. The network usage would actually increase because the content provider would always be the network provider, and the end user would be hurt by long downloads and a lack of streaming. It might seem like a very specific issue but, like I said, video is a big issue right now and distributed protocols are not adequate for handling it. Netflix's recent foray into set-top-box-based downloadable movies is another great example. Distributed data and streaming media do not currently co-exist in a single protocol, and you are unlikely to see them do so any time soon. Again, just examples of a very large picture where distributed protocols will remain only a portion.

    The thing is ... the bandwidth we are provided by our ISPs will continue to expand to meet demand, yes. But by orders of magnitude smaller than the expansion of backbones and routing centers (in theory; expansion has essentially halted except for Verizon's recent fibre push, and even that is using already-outdated line tech). Even though our appetite is unlimited, our connections will never be, and the companies that sell those connections can regulate both how fast those limits rise and how well their networks can handle those limits. It is a simple matter of reinvestment over profit growth; which of course makes it a very complicated matter for the companies in question.

    So once again, I have to reiterate that distributed systems aren't currently the problem with bandwidth; they are merely a convenient distraction from the real issues at hand. Those issues boil down very simply to the money going in and the money coming out, and they are being handled the same way they always are by companies.

  11. Rohan, your analogy is puzzling. Bandwidth is not like a road. It's closer to what shipping companies provide. I can pay a price to move X number of packages to a destination every day. Bandwidth is essentially the same: you're paying a price to move X number of data packets somewhere every day. In fact, all data plans nowadays require contracts, just like major shipping companies.

    If my ISP is incapable of providing, to the best of their ability, all of the shipments per day that I have paid for, and that they are therefore -contractually obligated- to perform, then they no longer have my business. It doesn't matter if it's recreational surfing, P2P, or business-level bandwidth. The only issue here is that ISPs have sold data plans under the assumption that people will not use those plans in their entirety. This allowed them to sell bandwidth they knew they could not provide. If bandwidth were business-provided shipment rates, we'd see litigation against ISPs for committing fraud.

    Increasing bandwidth capacity will work for the ISPs. Look at every other developed country in the world. They provide their customers unlimited data plans at speeds that are multiples of our own.

  12. There is one fundamental part of this discussion that is missing... who gets to decide what is quality traffic and what is unworthy? Right now people see BitTorrent as the traffic that needs to be throttled, but the fastest-growing segment of traffic is VoIP, voice over IP.

    Cable providers would rather you pay for their digital phone offerings. Perhaps they will decide that VoIP is using up too much bandwidth and close that down? Cable providers would like you to use their movie-on-demand services; perhaps they will decide that products like Netflix's Roku player use up too much bandwidth?

    High-speed networking in the US, especially for the home consumer, is a very closed market. You generally have one cable provider to choose from, maybe a single phone company that can offer DSL, and, if you're lucky and in one of the few locations that offer it, fiber internet. Each of these offerings is controlled by companies who have a vested interest in limiting where you send your traffic.

    Any limit to traffic, in an already controlled network, is against consumer interest. Yes, a few people use more than the rest and some network providers have not been able to plan for it. But the reality is, it is their responsibility to plan for it. As more and more services are available on the Internet, more and more people will use the bandwidth available. Networks have the responsibility to plan for these things.

    And if companies that want to limit bandwidth and services cannot meet the demands, perhaps they would like to revisit the discussion of opening their lines to other network providers. There would be no reason for them to limit bandwidth if consumers could choose to use other networks for transit. It is only their greed and inability to properly plan that has forced these ad hoc and self-serving solutions.

  13. One of the reasons why the world is in a state of post-Americanism has to do with the lack of vision. Every person and company believes the best situation is the higher bottom line. The only thing a tiered system would do is deter progress in technological evolution; you aptly addressed that. Why anybody would want to slow down in order to speed up is beyond me.

    Furthermore, nearly every BitTorrent client has a way to manage your bandwidth usage. Knowing your available upload and download caps from your ISP will help you immensely when you limit the BT client, so that other programs using your connection still have room.

  14. I'm not sure I understand what all the fuss is about. I thought this issue was already managed by ISPs under fair usage terms. My own ISP, Tiscali, has no problem with me running bandwidth-hungry applications (BitTorrent included) as long as it's during off-peak times. If I insist on running my clients during peak times, I'm liable to get cut off. Using your highway analogy, traffic is only an issue when we all hit the roads at the same time.

  15. The problem I'm worried about is how ISPs will abuse overage fees for their internet access plans. Most cell phone companies make a large portion of their profit from fees, which is why AT&T is so eager to try this new billing system out.

    They'll sell you package plans of so many gigabytes a month, then use a ton of legalese so that most users will accidentally run into fees. I mean, is there any sane reason why cell phone plans should have 3 different kinds of minutes, all with different restrictions on them?

    Gaming and Tiered Internet Pricing

  16. "For the purposes of this discussion, let us assume that all file-transfer or BitTorrent traffic is legal and does not infringe copyright. This isn't true, but I feel that moral arguments about copyright infringement are a distraction, and obscure the real issues at the heart of this problem."

    Which pretty much sums up why I had to delete my first post: I went for the distraction, and that ended up being too much flame bait. Yours is the much better write-up.

    So when are you taking the plunge and officially redesigning your blog to be a general MMO-plus-tech blog? It appears as if your official WoW label is keeping you from writing some very good stuff. Stopping playing WoW shouldn't stop you from writing. And a redesign might be less painful than starting a fresh blog. Who says that "Blessing of Kings" has to be all about WoW or paladins?

  17. This comment has been removed by the author.

  18. merlot, the US ISPs tend to sell connections that have X speed. The argument then becomes that if you have X speed, you should be allowed to use X speed 24/7, regardless of "fair use clauses".

    I personally don't think the ISPs were intentionally deceptive when they determined the average speed available. Average speed is very different than worst-case speed, and depends greatly on usage patterns. Their estimate of usage did not include people running programs which could use up all the available bandwidth 24/7.

  19. Perhaps their estimates SHOULD have included that possibility. They advertise the capacity, not the expectation. If they told people they can get a 1 Mb/s download, and during non-peak times there happened to be 2 Mb/s available for use, no one would complain. But instead, they use the 2 Mb/s to lure people in and away from alternatives. If you advertise it, you MUST plan for it.

    As someone who's worked for a network provider, we're going to see a lot more of this soon. The next major bubble will come within the next couple years as millions of miles of fiber across the US see their leases expire. The cost of that fiber will jump, and the providers will do anything to curtail the need to lease anything for the short term. Any provider that can wait out the need to extend leases for a while will be rewarded with fiber at remarkably inexpensive cost... they just have to wait for the current lease holders to fold and pick it all up at a discount. What we are witnessing is the providers manipulating the retail markets. As consumers, we should be outraged.
