23-07-02, 07:54 PM   #2
alphabeater

it's true that modern computers have more capacity than actually gets used. peer-to-peer, as i understand it, was designed from the start to go some way towards putting that spare capacity to work. instead of everyone downloading a file from a central source, one user downloads it from another, who in turn downloaded it from someone else. the system relies almost entirely on the users of the p2p network to redistribute content.
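roughly, in code terms, the difference looks something like this toy python sketch (every name in it is mine, made up purely for illustration):

# toy model of p2p redistribution: once a peer has a copy, it becomes
# another source for everyone else, so the original seed isn't the only
# machine serving the file.

class Peer:
    def __init__(self, name):
        self.name = name
        self.files = set()

    def download(self, filename, swarm):
        # scan from the end of the list so each new downloader tends to
        # grab the file from the most recent peer to get it, not the seed
        for other in reversed(swarm):
            if other is not self and filename in other.files:
                self.files.add(filename)
                print(f"{self.name} got {filename} from {other.name}")
                return True
        return False

if __name__ == "__main__":
    seed = Peer("seed")
    seed.files.add("album_track1.mp3")
    swarm = [seed, Peer("alice"), Peer("bob"), Peer("carol")]

    # alice gets the file from the seed, bob from alice, carol from bob -
    # the users themselves do the redistribution
    for peer in swarm[1:]:
        peer.download("album_track1.mp3", swarm)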

altnet and other distributed computing networks (freenet is an example) are designed to move content to the optimal positions within the network - the places where it can be reached by the people most likely to want it. this is hidden from the user so that they cannot interfere with it: instead of the users owning the network, the network tries to predict the needs of the users from their previous behaviour.
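the 'predict what people will want from their past behaviour' part could be as simple as counting previous requests and pushing copies towards the nodes with the most relevant history. a rough python sketch - my own invention, not how altnet or freenet actually do it:

# count past requests per node, then place copies of new content on the
# nodes whose history suggests they're most likely to want it.

from collections import Counter

# requests previously observed at each node (invented numbers)
history = {
    "node_a": Counter({"rock": 12, "jazz": 1}),
    "node_b": Counter({"jazz": 9}),
    "node_c": Counter({"rock": 2, "techno": 7}),
}

def best_nodes_for(genre, history, copies=2):
    # rank nodes by how often they've asked for this kind of content
    ranked = sorted(history, key=lambda node: history[node][genre], reverse=True)
    return ranked[:copies]

print(best_nodes_for("rock", history))   # ['node_a', 'node_c']
print(best_nodes_for("jazz", history))   # ['node_b', 'node_a']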

'You might choose to allow Altnet to use your processing power and bandwidth during the night to render movies for an animation studio. Altnet will install a tiny application on your machine and each night will send you a package of raw data to process into video. While you sleep, your computer renders the video, deletes the raw data and sends the video back to Altnet.'
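to be concrete about what that quote is describing, the workflow would look roughly like this. every function here is a stand-in i've invented for illustration - altnet has never published an api like this:

# the shape of the overnight-rendering loop described in the quote:
# receive a package of raw data, render it, delete the raw data,
# send the finished video back.

def fetch_job():
    # pretend a package of raw data arrived overnight
    return {"id": 42, "raw_frames": ["frame_001", "frame_002", "frame_003"]}

def render(raw_frames):
    # stand-in for the expensive rendering work done while you sleep
    return "rendered video from {} frames".format(len(raw_frames))

def upload(job_id, video):
    print("sending job {} back: {}".format(job_id, video))

def overnight_worker():
    job = fetch_job()
    video = render(job["raw_frames"])
    job.pop("raw_frames")      # 'deletes the raw data'
    upload(job["id"], video)

overnight_worker()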

that description doesn't match what i've seen of altnet so far at all. at the moment it seems to be a system for sending awful drm files around (play this rubbish game for 10 minutes! / listen to this cruddy wma song once and then buy it!). like shareware, only worse. on top of that, altnet's results are given precedence over everyone else's - gold icons, faster return times, placement at the top of the results list. they wanted a way to integrate ads into the p2p network itself, and altnet is it.

entirely optional and configurable distributed computing, the opposite of altnet's approach, could have some place in a community-based p2p network. it would need to be made clear what was being done and how it benefits the network, and the user would have to be able to control when it happened and how much hard disk space or other resources the caching of content was allowed to use.
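in practice that could be as little as a settings block the user genuinely controls. a rough sketch, with all the names invented:

# user-controlled settings for background caching: whether it runs at all,
# when it's allowed to run, and how much disk it may use.

from datetime import datetime

settings = {
    "caching_enabled": True,
    "allowed_hours": range(1, 7),   # only run between 1am and 7am
    "max_cache_mb": 500,            # hard cap on disk space used
}

def may_cache_now(settings, cache_used_mb, now=None):
    now = now or datetime.now()
    return (settings["caching_enabled"]
            and now.hour in settings["allowed_hours"]
            and cache_used_mb < settings["max_cache_mb"])

# the user can switch it off (or reclaim the space) at any time
settings["caching_enabled"] = False
print(may_cache_now(settings, cache_used_mb=120))   # False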

if the engine for deciding which content to cache was good enough, though, it could work well. 'the user has downloaded some of this album, so fetch the rest for the cache' is one example; a recommendation engine like kazaa's current one, quietly downloading its listed files to the cache, is another. if those downloads were then 'trickled' in the background whenever the bandwidth wasn't otherwise being used, users would start to find instant downloads - or at least very fast ones where only part-files had been cached.
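a very rough sketch of that cache-ahead-and-trickle idea, with everything invented for illustration:

# if the user has part of an album, quietly queue the rest, and only pull
# it down when bandwidth is otherwise idle; later requests for cached
# tracks are then effectively instant.

album_tracks = {"track1.mp3", "track2.mp3", "track3.mp3", "track4.mp3"}

library = {"track1.mp3", "track2.mp3"}   # what the user chose to download
cache = set()                            # what was prefetched for them
trickle_queue = []

def plan_prefetch():
    # 'the user has downloaded some of this album, so fetch the rest'
    for track in sorted(album_tracks - library - cache):
        trickle_queue.append(track)

def trickle_step(bandwidth_idle):
    # only use spare bandwidth, one piece at a time
    if bandwidth_idle and trickle_queue:
        cache.add(trickle_queue.pop(0))

def request(track):
    return "instant (already local)" if track in library | cache else "normal p2p download"

plan_prefetch()
trickle_step(bandwidth_idle=True)
trickle_step(bandwidth_idle=True)

print(request("track3.mp3"))   # instant (already local)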

imagine if you downloaded the first two songs from an album, left the computer idle, and then came back later and decided to look for the rest - only to find that they'd already been cached for you and were almost instantly within reach. or if you had five songs by an artist and they released a new single, which was found, added to the program's recommendations list, and trickle-downloaded while you were idle, using your spare bandwidth and hard disk space, just in case you decided you wanted it later.

this kind of p2p artificial intelligence, if you like - with the emphasis once again that enabling it would be completely optional, and that it could be removed at any time if you needed the disk space back - could allow for faster downloads than p2p networks currently manage.

while i thoroughly despise altnet, i agree that the idea of distributed computing via p2p has some interesting implications for a non-commercial community.