The current killer app on a lot of Internet-connected PCs is peer-to-peer file sharing.
These programs make each user's PC a 'server' from which they can share files
with the network. Users can send search requests for particular files to the network and
download desired files directly from other users' PCs.
Usenet's alt.binaries set of newsgroups is essentially a file sharing service using
Usenet's messaging service for transport. Users settled on a common structure for
exchanging files: the subject line contains the file name and a description of the file,
and the body of the message is the binary file encoded into ASCII (or whatever character
set that part of the network uses) so it can travel as plain text.
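Since Usenet's transport is text-only, a binary has to be converted to printable characters before posting and converted back after download. The snippet below illustrates the round trip with Base64; actual binaries groups historically used encodings like uuencode (and later yEnc), but the principle is identical.

```python
import base64

# Stand-in for a file's raw contents: all 256 byte values, including
# ones that would corrupt a plain-text message if sent directly.
payload = bytes(range(256))

# Encode to ASCII-safe lines (same idea as uuencode/yEnc, different
# alphabet), as would go in the message body...
encoded = base64.encodebytes(payload)

# ...and decode on the receiving end.
decoded = base64.decodebytes(encoded)

assert decoded == payload             # the round trip is lossless
assert all(b < 128 for b in encoded)  # every byte is 7-bit ASCII
```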
I've been using Usenet since before Napster was even thought of. All I ever really use
it for is the alt.binaries groups and the program I've been using to get binary files
tells me that I'm averaging about 3.75 GB of downloads per day lately. Thanks to
Usenet I'm knee-deep in just about any kind of media I could want: kung-fu movies
not released in North America and episodes of MacGyver and Red Dwarf that don't air
anymore. I can watch my favourite music videos any time I want. If I miss Friends or
Survivor... I'm not worried because I know I'll be able to get it from Usenet and
watch it at my leisure. It's freed me from the tyranny of the television
networks; but it doesn't seem to get as much press as this new-fangled peer-to-peer
business. Is there a good reason why?
To request a file on a peer-to-peer system, you enter a search string and it queries
the connected hosts and returns a list of hosts serving that file. You select a few
of the hosts and the downloading process begins. This is pretty handy because you can
search for a specific file that is currently being shared, and the results are returned
very quickly. However, the downside is that you have to have a specific file
in mind before downloading anything, so you can't 'discover' new media you might
be interested in. Some of these programs let you see other users'
search strings, but those are disorganized and rarely descriptive.
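The search flow described above can be sketched in a few lines. This is a toy model with invented peer names and file lists, and it sidesteps how real clients locate peers in the first place (Gnutella-style clients flooded queries across connected hosts), but it shows the basic query-and-match step.

```python
# Invented example data: which peer is sharing which files.
shared_files = {
    "peer-a": ["kungfu_flick.avi", "red_dwarf_s01e01.mpg"],
    "peer-b": ["red_dwarf_s01e01.mpg"],
    "peer-c": ["macgyver_pilot.mpg"],
}

def search(query):
    """Return {peer: matching files} for files whose names contain query."""
    hits = {}
    for peer, files in shared_files.items():
        matches = [name for name in files if query.lower() in name.lower()]
        if matches:
            hits[peer] = matches
    return hits

# Two peers serve the episode, so the client can pick either (or both).
print(search("red_dwarf"))
```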
On Usenet, files are not stored permanently, and in many cases are only available for a
week after they have been posted; you cannot 'search' for a specific file you are
interested in. However, you can make a request to the appropriate newsgroup, and
if another user of that group has that file and is willing to upload it, they
will do so. The process is slower on Usenet than it is in peer-to-peer because
it relies on human interaction. Someone will actually have to read the request
and then manually begin the uploading process. The delay between making a
request and actually receiving files could be days or weeks depending on a few
variables. Hopefully anyone answering your request would notify you ASAP via email.
However, an advantage of this process is that all the users of the newsgroup will
benefit from the request because they will all see that post and those who are
interested in it can download it. In this way, someone using alt.binaries.horror
could discover a new horror movie they'd never heard of because someone else requested it.
In a peer-to-peer system, you are downloading files straight from other users' PCs, and
the speed of a download is directly related to the upload capacity of the sender.
Obviously, that download will have to contend with all the other traffic on the
sending PC. Keep in mind that upload capacity for a T1 is 1.544 Mbps, and cable modem
uploads are usually capped somewhere between 12 Kbps and 40 Kbps. Compared to Usenet,
peer-to-peer downloads seem pretty slow.
On Usenet, you are downloading files from a server at your ISP, so your download
rate is basically limited by your maximum download capacity. For instance, I get
just barely under 2 Mbps all the time from Cogeco's Usenet server.
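To put those numbers in perspective, here is the arithmetic for a hypothetical 700 MB file at each of the rates quoted (treating 1 MB as 10^6 bytes for simplicity):

```python
FILE_BITS = 700 * 10**6 * 8  # a hypothetical 700 MB download, in bits

rates_kbps = {
    "capped cable upload": 40,       # top of the quoted 12-40 Kbps range
    "T1 upload": 1544,
    "Usenet server download": 2000,  # roughly my rate from Cogeco's server
}

for label, kbps in rates_kbps.items():
    hours = FILE_BITS / (kbps * 1000) / 3600
    print(f"{label:>24}: {hours:5.1f} hours")
```

At the quoted caps, the same file a Usenet server delivers in under an hour would take a capped cable uploader well over a day to send.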
The issue is a tiny bit more complicated than this though; it's almost just a matter
of perception. If I am awaiting a specific file from a specific user, using either
system the bottleneck bandwidth is still the sender's upload capacity; there's no
possible way the file is going to arrive any faster than that, in either case. Also,
in both cases the data is going to traverse a number of hosts across the Internet
before it reaches its destination. So if you look at it this way, the speed of
both systems is the same.
On the other hand, both systems have cases where this is not exactly true.
Peer-to-peer clients can increase throughput by downloading different parts of
the same file from multiple users in tandem. That's pretty damn clever. The
more users you find sharing the same file, the faster you can download it,
surpassing any single user's upload capacity. Also, on Usenet files that you were
not specifically searching for arrive at your ISP's server all the time, and if you
decide to download them, then the upload capacity of the sender is a moot point since
you were not waiting for the files in the first place.
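The multi-source trick can be sketched as simple byte-range bookkeeping: split the file into one contiguous range per peer and fetch the ranges in parallel. The peer names and file size below are invented, and real clients also handle peers dropping out mid-transfer, which this sketch ignores.

```python
FILE_SIZE = 10_000                      # invented size, in bytes
peers = ["peer-a", "peer-b", "peer-c"]  # peers all sharing the same file

# Assign each peer a contiguous byte range; the last peer takes any remainder.
chunk = FILE_SIZE // len(peers)
assignments = []
for i, peer in enumerate(peers):
    start = i * chunk
    end = FILE_SIZE if i == len(peers) - 1 else (i + 1) * chunk
    assignments.append((peer, start, end))

for peer, start, end in assignments:
    print(f"{peer}: bytes {start}-{end - 1}")

# The ranges tile the file exactly, so the client can reassemble the
# slices in order once all three transfers finish.
assert assignments[0][1] == 0 and assignments[-1][2] == FILE_SIZE
```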
In peer-to-peer systems, availability is pretty spotty. The network itself seems
fairly stable, but hosts can go up and down all the time since users can turn off
their PCs, turn off the file-sharing service, or do any number of normal user-type
activities to break the connection. In some cases p2p clients have returned results
that just never seem to be available. Also, results can be returned but a user
could already be at their maximum number of uploads, so you will have to wait in
a queue. However, I have managed to complete many lengthy and time-consuming downloads
over p2p, so sometimes the availability is adequate.
Availability on Usenet is all dependent on your ISP's server and your connection
to it. In my case, with Cogeco, this has been really solid in the few years I've
been using it. In general, a server at your ISP will be far more reliable than some
host across the Internet. The Cogeco servers do die inexplicably (seriously, I've asked
tech support and they have no explanation) from time to time, but they run a number
of redundant servers to serve different areas and in the event of a failure I've been
able to connect to a backup with no problem.
For either of these systems to be useful, they must have a certain minimum of
participating users. If a network had only 5 users, the chances of each user having files
the other users would find useful would be very low. Eventually all the pertinent files would
be traded and the network would die from lack of new blood. It's a lot like requiring
a large enough genetic pool to sustain a healthy population growth, or needing a certain
amount of Uranium-235 to make an atomic bomb detonate. In this case, the fuel is
participating users, and both systems have an established user base, and can be said
to have reached 'critical mass'.
In The End...
Each system has its own deficiencies and advantages. However, after examining these
applications, I find it amazing how similar a protocol from the early '80s is to a
brand-spanking-new killer app. They both provide generally the same services, but come
from different eras entirely: Usenet is obviously patterned on the traditional 'client-server'
model, where large mainframes did all the footwork for weaker, slower terminals,
while P2P comes from the current distributed 'servent' model where every node is both
server and client. Even more amazing is the fact that Usenet isn't even a file sharing
protocol; it has only been adapted to that use by the user community. It was designed from
the ground up as a discussion system and, believe it or not, is still used as such. Users
just started attaching files to messages, came up with some standard subject line conventions
to identify attachments, and it became a file sharing utility just as powerful as current
P2P applications.
The biggest advantage of P2P, and the reason it gets all the attention, is general
ease of use. You want something, you type it in and hit enter and that's pretty much it.
Usenet will probably remain restricted to more proficient computer users because of its
slightly tougher learning curve and less intuitive, less user-friendly interface.
One of the reasons that P2P has not 'killed' Usenet file sharing is that it did
not overcome a major shortcoming they share. The issue for both systems is the
ratio of contributing users to 'lurkers' or 'leeches'. A contributing user is
someone who provides files to the network while a leech is someone who simply
downloads available files while contributing nothing. I am a leech for instance,
and I am part of the problem. The theoretical reward for putting forth the effort /
bandwidth of sharing files is that other users will share their own files in return.
However, there is no requirement for sharing files and there is no penalty for not sharing.
I can download just as much as a contributing user.
There are file sharing programs in the works that use credit systems: the more
you share, the more you can download. If this kind of program were successfully
implemented, I could see contributing users migrating to this kind of system because
they would actually benefit from sharing files and they wouldn't be taken advantage
of by the 'leech' community. This would also benefit the file sharing community as a
whole because it would contribute to the 'critical mass' of the system. The more that
is shared, the more useful the system becomes to users, and more people join, and
share more files! Of course if the file sharing community migrated to such a system
Usenet would still continue as a discussion system.
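A credit system like the one described could be as simple as the bookkeeping below. The ratio and rules here are my own invention for illustration, not how any particular client works.

```python
class CreditAccount:
    """Invented bookkeeping: uploads earn credit that downloads spend."""

    def __init__(self, starting_credit=0, ratio=1.0):
        self.credit = starting_credit  # a real system would seed new users
        self.ratio = ratio             # download bytes earned per byte uploaded

    def record_upload(self, nbytes):
        self.credit += nbytes * self.ratio

    def can_download(self, nbytes):
        return nbytes <= self.credit

    def record_download(self, nbytes):
        if not self.can_download(nbytes):
            raise PermissionError("not enough upload credit")
        self.credit -= nbytes

user = CreditAccount()
user.record_upload(500)        # sharing 500 bytes earns 500 of credit
print(user.can_download(400))  # True: within earned credit
print(user.can_download(600))  # False: a leech is cut off here
```

Under a rule like this, a pure leech runs out of credit immediately, while contributors earn back exactly as much download as they give.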