A couple of days ago, Tobold wrote a post on tiered pricing for Internet access, caused by the rise in BitTorrent traffic. Essentially, he took the position that heavy users of the Internet, who transferred more data, should pay more than light users. The resulting firestorm caused him to delete the post and all the attached comments. It's sort of a pity he had to do that, as it is an interesting topic. It's not directly related to WoW, but as we play WoW over the Internet, how the Internet is structured is important to us. (Plus, I haven't written anything in a while, so here's some content.)
For the purposes of this discussion, let us assume that all file-transfer or BitTorrent traffic is legal and does not infringe copyright. This isn't true, but I feel that moral arguments about copyright infringement are a distraction, and obscure the real issues at the heart of this problem.
A lot of people say that if a company says you have X amount of bandwidth, say 512 Kb/s up and down, the company should allocate that amount for each customer, and not punish you for using the full amount. The problem with this view is that it is supremely wasteful. It's like you and your neighbour each having a private road from home to work. The vast majority of the time, both roads will be empty and just taking up space. A much better solution is a common road that everyone shares.
Network traffic is like cars on that road. The important part here is that an individual doesn't really care who else is on the road, so long as he can get from point A to point B in a fast and efficient manner. As more and more cars appear, the road becomes more congested, and it becomes harder to use the road to full effect. In the past, whenever this happened, ISPs would add more bandwidth to the system, essentially adding an extra lane to the road, spacing out the cars once again. (Unlike real roads, extra bandwidth is often the cheapest solution.)
So why don't ISPs just continue adding bandwidth? The answer lies in the nature of BitTorrent, which is the major protocol used to transfer files these days.
BitTorrent is a very aggressive protocol. It basically uses all available bandwidth, saturating your connection. This is one of the reasons that downloading with BitTorrent is so fast. To go back to the road analogy, it's like the road was suddenly packed full of trucks, taking all the available space. If you're in your car, trying to get on the road to go to work, this is very frustrating. Adding bandwidth doesn't help in this case, because the BitTorrent trucks will immediately fill up the new lane.
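One mechanism behind this is that BitTorrent opens many TCP connections at once, while a game uses one. Here's a toy back-of-the-envelope sketch, assuming (as a simplification) that a link's capacity is split evenly per connection; the capacity and connection counts are invented for illustration:

```python
# Toy model: a link shared equally per TCP connection (a simplification).
# BitTorrent typically opens many connections at once, while a game like
# WoW uses one, so fair-share-per-connection heavily favours the torrent.

LINK_CAPACITY_KBPS = 512          # hypothetical link capacity
TORRENT_CONNECTIONS = 40          # one busy torrent can open this many
GAME_CONNECTIONS = 1

total = TORRENT_CONNECTIONS + GAME_CONNECTIONS
game_share = LINK_CAPACITY_KBPS / total

print(f"Game connection gets ~{game_share:.1f} Kb/s of {LINK_CAPACITY_KBPS}")
# The game is left with a sliver of the link even though it needs very
# little, because the torrent's many connections each claim an equal share.
```

Adding a lane just means forty-one connections become fifty-one: the torrent's share grows with it, and the game's slice stays tiny.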
Anyone who's tried to play WoW at the same time that several torrents are running understands this. WoW takes very little bandwidth. It's playable on a 56K modem. But add several torrents downloading in the background, and your WoW connection craters. God help you if you want to run Ventrilo as well.
There are essentially two solutions to this problem: a technical solution, and an economic solution. Like all solutions, neither one is perfect.
Technical Solution - Quality of Service
The technical solution is something called Quality of Service. Basically, each type of traffic has a priority, and higher priority traffic gets transmitted first, while lower priority traffic gets delayed until the network is free.
Using the road analogy, it's like the road is full of trucks, but as soon as you pull up to the entrance, a space automatically opens up for your car, and you get shifted into the fast lane immediately. It doesn't really matter that the rest of the road is filled with trucks, you get to your destination quickly.
My personal priority system would look something like this, from highest to lowest priority:
1. Game traffic (low size, needs high responsiveness)
2. Streaming audio/video (moderate size, needs high responsiveness)
3. General web (low size, moderate responsiveness)
4. Email (low size, low responsiveness)
5. File transfers (high size, low responsiveness)
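A minimal sketch of how such a scheme might dispatch packets, using a priority queue. The traffic classes and priority numbers mirror the list above; the class names, packet labels, and everything else are invented for illustration:

```python
import heapq
import itertools

# Priority numbers mirror the list above: lower number = sent first.
PRIORITY = {
    "game": 1,
    "streaming": 2,
    "web": 3,
    "email": 4,
    "file_transfer": 5,
}

class QosQueue:
    """Toy Quality of Service queue: higher-priority traffic jumps ahead."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker: FIFO within a class

    def enqueue(self, traffic_class, packet):
        entry = (PRIORITY[traffic_class], next(self._counter), packet)
        heapq.heappush(self._heap, entry)

    def dequeue(self):
        _, _, packet = heapq.heappop(self._heap)
        return packet

q = QosQueue()
q.enqueue("file_transfer", "torrent chunk")
q.enqueue("email", "mail to guild")
q.enqueue("game", "WoW movement update")

print(q.dequeue())  # the game packet goes first despite arriving last
```

The file transfer still gets through; it just waits until nothing more urgent is queued, which is exactly the car-gets-the-fast-lane behaviour from the road analogy.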
People who want to transfer files via BitTorrent can still do so, but without interfering with other people's web experience. Quite honestly, there will be a large amount of bandwidth still usable for file transfers, especially at off-peak hours.
There are several issues with Quality of Service. It is a bit expensive to implement across the entire Internet. There needs to be common agreement on the priority scheme. The network neutrality fanatics will be upset. Some bright MBA will probably think it's a good idea to prioritize by source or destination, and charge for increasing your priority.
As well, this method will decrease the average speed of file transfers. I think the increased responsiveness of all the other types of traffic more than makes up for it. However, someone else will disagree, and make a file transfer client that pretends to be the highest priority. That will lead to an arms race between ISPs and file-transferrers as the ISPs develop new methods (Deep Packet Inspection, etc.) to classify traffic, and file-transferrers try to fool those methods.
Economic Solution - Metered Pricing
The other solution is to charge people according to the bandwidth they use. This essentially causes people to decide what uses of the Internet are important to them, and implement their own priority. I suspect that most people will cut down on file transferring, and spend their money on web surfing and email.
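A back-of-the-envelope sketch of what metered billing does to different users. The plan structure (flat base fee plus per-gigabyte charge) and every number here are invented purely for illustration:

```python
# Hypothetical metered plan: flat base fee plus a per-gigabyte charge.
BASE_FEE = 20.00   # dollars per month (invented)
PER_GB = 0.50      # dollars per gigabyte transferred (invented)

def monthly_bill(gb_transferred):
    """Total monthly cost under the hypothetical metered plan."""
    return BASE_FEE + PER_GB * gb_transferred

light_user = monthly_bill(5)     # web surfing and email
heavy_user = monthly_bill(300)   # several torrents running daily

print(f"Light user: ${light_user:.2f}, heavy user: ${heavy_user:.2f}")
# Light user: $22.50, heavy user: $170.00
```

Faced with a bill like the second one, most people would throttle their own torrents; the pricing does the prioritizing for them.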
This is a good solution because it's fairly easy to implement, very hard to evade, and will almost certainly work. It also maps to what people think is "fair": people who use the service the most pay the most, and people who use it least pay the least.
The problem with the economic solution is that there are a lot of interesting ideas or applications that rely on people having access to extra bandwidth at negligible cost. For example, if metered pricing had been the norm, I don't think things like podcasts or YouTube would exist. Similarly, digital distribution of games or movies would have very little chance of taking off. Downloading patches for games and software becomes expensive.
There are also a lot of implications for open source. To a large extent, open source software relies on being able to easily transmit changes and updates across the Internet. Metered access puts a significant cost on using and creating open source software, which would be a shame.
As well, metered pricing can provide a disincentive for the ISPs to improve their service and increase the bandwidth available. To a certain extent, this depends on the competition available, but many ISPs in the United States seem to operate in a quasi-monopoly fashion.
Network congestion caused by BitTorrent and other distributed file-transfer systems is a real problem. Trying to ignore it, or getting into unrelated arguments about copyright infringement, will not work.
My personal preference would be for the ISPs to implement a decent Quality of Service system (with WoW and other games at the top, naturally). However, I lack faith that the ISPs will remain source/destination neutral, and only prioritize on traffic type. I also lack faith in file-transferrers, and I am pretty sure that instead of accepting slightly-reduced file-transfer performance for better overall performance, they will trigger an arms race by attempting to fool the Quality of Service systems.
The Quality of Service solution essentially requires a degree of cooperation between all parties, and I don't think that's likely. So we will probably end up with some form of metered pricing.