Sometimes I wonder if people really understand how peer-to-peer file sharing works. I listened to the CRTC discussions and I am following some of the “post-game analysis”.
Some of the rhetoric is just plain silly. Frankly, I think a lot of outspoken people might want to stop, listen, and think before speaking out.
Like the call for a class action lawsuit that I read about on Tech Media Reports:
Yet Rogers disclosed that the upload speeds are actually identical for P2P traffic as they are both throttled equally. I find that extremely deceptive and can’t help but wonder whether they open themselves up to a class action lawsuit by Extreme subscribers who don’t get the service they think they do.
I’m no lawyer, but it seems to me that before you sue someone, you need a plaintiff who has actually suffered some harm.
Let’s go back to first principles of P2P file transfer to understand why there is no harm to the uploader.
Most consumers subscribe to an asymmetric internet connection. That means that it works faster in one direction than the other. For most of us, we receive information (download) faster than we can provide it (upload).
So, let’s say that I want to download a file that you have. If I get it from you directly, the fastest I could possibly receive it would be limited by your upload connection speed, regardless of my faster download speed. I might get frustrated because I subscribe to a fast download service, and I might even blame my ISP, even though the real bottleneck is the source of the traffic.
So, either you could upgrade to a very expensive symmetric internet service, or I might find a creative way to get the file faster. Many of the types of files that I want to get from you (music, movies, courseware, etc.) are sitting on other computers, not just yours. So some bright minds realized that I could get the file twice as fast if I got a few pieces from you and a few pieces from another person at the same time. After all, you may only be uploading at, say, 640 kbps, and my download capacity is 8 times that. So, if two is better than one, why not try to get 8 streams working at once, or even more?
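To put rough numbers on that, here is a quick back-of-the-envelope sketch in Python. The 640 kbps upload cap and the 8x download pipe come from the paragraph above; the 100 MB file size is just my assumption for illustration:

```python
# Back-of-the-envelope transfer times: one peer vs. many peers.
# Speeds are in kilobits per second; the file size is hypothetical.

FILE_SIZE_MB = 100                          # assumed file size
FILE_SIZE_KBITS = FILE_SIZE_MB * 8 * 1024   # convert MB to kilobits

PEER_UPLOAD_KBPS = 640                      # each peer's upload cap (from the text)
MY_DOWNLOAD_KBPS = 8 * PEER_UPLOAD_KBPS     # my pipe is 8x a single peer's upload

def transfer_seconds(num_peers: int) -> float:
    """Time to fetch the file from num_peers uploading simultaneously.

    The aggregate supply is capped by my own download pipe.
    """
    aggregate_kbps = min(num_peers * PEER_UPLOAD_KBPS, MY_DOWNLOAD_KBPS)
    return FILE_SIZE_KBITS / aggregate_kbps

for n in (1, 2, 4, 8, 16):
    print(f"{n:2d} peer(s): {transfer_seconds(n) / 60:6.1f} minutes")

# 1 peer:   ~21.3 minutes (bottlenecked by the single uploader)
# 8 peers:   ~2.7 minutes (my own download pipe is now the bottleneck)
# 16 peers:  ~2.7 minutes (extra peers can't push past my own cap)
```

Past 8 peers, adding more sources buys me nothing; my own download pipe is the limit.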
The P2P file transfer software tries to find people from around the world who have the file that I want and gets them all to provide pieces so that we all max out the upload and download capacity of our pipes. And, as soon as I get the first piece of the file, my computer gets identified as a potential source for other people.
You get the picture.
The software, by design, is supposed to keep the pipes full.
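If you want to see that source-multiplication effect in miniature, here is a toy Python sketch. It is not real BitTorrent (actual clients use smarter piece selection, such as rarest-first), and the 16-piece file, single seeder, and four downloaders are all assumptions for illustration. The point is just that every piece a downloader receives makes that downloader a new source for everyone else:

```python
import random

# Toy swarm: one seeder, several downloaders, a file split into pieces.
# A simplified illustration of the idea above, not a real P2P protocol.

NUM_PIECES = 16
random.seed(1)

# peers[name] = set of piece indices that peer currently holds
peers = {"seeder": set(range(NUM_PIECES))}
for i in range(4):
    peers[f"downloader{i}"] = set()

def sources_for(piece: int) -> list[str]:
    """Every peer holding this piece is a potential upload source."""
    return [name for name, held in peers.items() if piece in held]

round_no = 0
while any(len(held) < NUM_PIECES for held in peers.values()):
    round_no += 1
    # Each round, every downloader grabs one missing piece from the swarm.
    for name, held in peers.items():
        missing = [p for p in range(NUM_PIECES) if p not in held]
        if missing:
            held.add(random.choice(missing))
    avg = sum(len(sources_for(p)) for p in range(NUM_PIECES)) / NUM_PIECES
    print(f"round {round_no:2d}: avg sources per piece = {avg:.1f}")
```

Notice how the average number of sources per piece climbs every round; that is the sense in which the protocol keeps everyone’s pipes full.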
When I use a peer-to-peer file sharing application, my purpose is to have access to files found on other people’s computers. While I may altruistically want to contribute the files on my computer to the global pool of access points, I am hard-pressed to imagine a situation where I have a real concern about whether other people can get my files at full speed when my ISP imposes upload constraints.
After all, by design, the software will find other computers that are also sources for the file, so the person downloading from me isn’t harmed either.
I understand users’ frustration when an ISP manages download speeds. That could add to the time it takes to receive the whole file. But even those more aggressive management techniques don’t impact real-time applications like streaming video, nor do they affect other bulk file transfer protocols such as the direct point-to-point downloads used by most companies.
P2P file transfer is not how you send an email, share your photos, update your blog, file your term paper, or do real-time streaming. What is wrong with managing P2P upload speeds to ensure the proper operation of the rest of the applications – for you and everyone else served by your ISP?