Freely Redistributable Software across the Internet - Current practice and future directions to overcome the bandwidth crisis.
UKC, University of Kent, Canterbury, UK
The Internet offers what must be the best available medium for tracking, contributing to and distributing free software. Unfortunately, once national boundaries are crossed, the Internet has never really been capable of transmitting such large bodies of software in a reasonable amount of time. This has led to the installation of large dedicated archives whose sole purpose is to build up repositories of this software wherever there is demand. Initially such archives may have been kept up to date by manually sending magnetic tapes back and forth. This process was later automated as international network links were established, and it is at this point that most archives in operation today have stopped. The bandwidth crisis that is currently making large parts of the Internet virtually useless is just what is needed to spur on the next generation of technologies for more effective dissemination of electronic information and software. In this paper we take a very brief look at the history of archive sites. We then present in more detail the differences and evolution in traffic flow that we have observed at our FTP archive and through our national World-Wide Web caching service. This is followed by a discussion of a proposed Europe-wide scheme to make FTP archives more effective through co-operative effort. The paper concludes with a preview of software currently under development at HENSA Unix, which addresses the problem through a new, more flexible and scalable approach based on a combination of caching and mirroring.