P2P-Zone  


Peer to Peer - The 3rd millennium technology!

25-05-02, 08:32 AM   #1
Yaroslav1979
Posts: n/a
Search List for Kazaa - consecutive autosearch of rare files

Hi everybody who uses the FastTrack network!

It is very cool that somebody is cracking Kazaa and trying to improve it.
I wrote many emails to Kazaa support with ideas on how to make it much better,
but of course they don't even read them.
The idea is the following.
Like many people, I am fond of a kind of rare music which does sometimes exist on
the FastTrack network, but to find it you have to search for it again and again,
many, many times, 24 hours a day, before you finally succeed.
I believe it would be very easy to develop a program that reads a file with my keywords, types each keyword into the search field, presses the Search button, waits for the Stop button to become disabled (which indicates that the search is finished), then selects all results, presses the Download button, and continues with the next search item.
It is that easy, but I have never written Win32 software.
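Roughly, the whole thing is just a loop like the sketch below. The helper functions are placeholders for whatever Win32 automation would actually drive the Kazaa window; none of this exists yet, it only shows the logic I have in mind.

[code]
import time

# Placeholders for whatever Win32 UI automation ends up driving the Kazaa
# window (FindWindow/SendMessage, a macro tool, etc.). These names are
# made up for the sketch.
def search(keyword): ...
def search_finished(): ...      # e.g. poll whether the Stop button is disabled yet
def select_all_results(): ...
def download_selected(): ...

def autosearch(keyword_file):
    """Run every keyword in the file through the client, one after another."""
    with open(keyword_file) as f:
        keywords = [line.strip() for line in f if line.strip()]
    for keyword in keywords:
        search(keyword)               # put the text in the search field, press Search
        while not search_finished():  # wait until the search is reported as finished
            time.sleep(5)
        select_all_results()
        download_selected()           # queue everything that came back
[/code]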

Please think about this GREAT idea!
I believe that millions of people would be happy to have such an intelligent
tool, which could find hundreds of extremely rare files in a single night!
25-05-02, 08:51 AM   #2
JackSpratts
Join Date: May 2001
Location: New England
Posts: 10,018

hi Yaroslav1979, welcome to nu! enjoy your stay. i hope you'll hear from someone soon about this.

- js.
25-05-02, 09:06 AM   #3
StereoDeluxe
Join Date: May 2002
Location: HAL9000
Posts: 283

hi Yaroslav1979

Have you tried AudioGalaxy? best place for rare MP3, IMHO.

welcome to NU.
25-05-02, 09:53 AM   #4
BuzzB2K
Just another cat on the FastTrack...
Join Date: Jan 2002
Location: Hamilton
Posts: 727

Hi Yaroslav1979, welcome to NU!
Glad to see you made it over here. You will no doubt find a few individuals here who are still working to improve the FastTrack experience...

(And the Admin of the MorpheusX board is a semi-regular poster here too!!)

Last edited by BuzzB2K : 25-05-02 at 12:21 PM.
25-05-02, 01:48 PM   #5
TankGirl
Madame Comrade
Join Date: May 2000
Location: Area 25
Posts: 5,587

Hi Yaroslav1979 and a warm welcome to Napsterites Underground!

I think your idea is good, and you have already laid out a neat implementation for it. Let's hope some of our resident hackers catch it!

- tg
25-05-02, 05:10 PM   #6
theknife
my name is Ranking Fullstop
Join Date: Dec 2001
Location: Promontorium Tremendum
Posts: 4,391
Re: Search List for Kazaa - consecutive autosearch of rare files

Quote:
Originally posted by Yaroslav1979
I believe it would be very easy to develop a program that reads a file with my keywords, types each keyword into the search field, presses the Search button, waits for the Stop button to become disabled (which indicates that the search is finished), then selects all results, presses the Download button, and continues with the next search item. [...]
well, it's not quite as sophisticated as all that, Yaroslav1979, but for relentless FastTrack searches, you may wanna try this
25-05-02, 05:11 PM   #7
Dawn
R.I.P napho 1-31-16
Join Date: Dec 2000
Location: Venus
Posts: 16,723

Hi from me too
__________________
I love you napho and I will weep forever..........
26-05-02, 01:08 AM   #8
Wanker
Registered User
Join Date: May 2002
Posts: 9

It sounds like a good idea, but when you think about it technically it isn't that great anymore.
Here is why: the system is not based on a central server but is peer-to-peer, so searching is actually done by a set of peers called supernodes. Every search uses network resources: CPU power on the supernodes and bandwidth. CPU is not a problem, since desktop computers are as powerful as servers these days, but bandwidth is a major one: a regular home user has something like a 256 Kbit cable connection, while servers can have hundreds of Mbits of bandwidth.
Apply some common sense: say the average Kazaa user does 10 searches a day; with automated searching that number would be 10-100 times higher. This means the bandwidth used would be 10-100 times higher too, home users certainly can't sustain that, and the whole network would collapse.
The same problem exists on the Gnutella network, which is why early versions of Xolox were banned by other clients. The best example is eDonkey, where the eDonkey bot fucked up the network so royally that they had to pull it.
The bottom line is that the network is shared and its resources are limited, which means that every additional resource you use is taken away from other users. So don't be selfish; be a responsible p2p citizen.
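To put rough numbers on it (the per-search traffic figure below is a pure guess; the point is the multiplier, not the absolute values):

[code]
# Back-of-envelope only: SEARCH_COST_KB is an assumed figure for the traffic
# a single search spreads across the supernodes that handle it.
SEARCH_COST_KB = 50
NORMAL_SEARCHES_PER_DAY = 10   # the "average user" figure from above

for factor in (1, 10, 100):
    searches = NORMAL_SEARCHES_PER_DAY * factor
    mb = searches * SEARCH_COST_KB / 1024
    print(f"{searches:>5} searches/day -> ~{mb:.1f} MB/day of search traffic caused per user")
[/code]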
26-05-02, 12:20 PM   #9
jaan
Registered User
Join Date: May 2002
Posts: 39

wanker (no pun intended),

again, excellent post. that's also the reason, IMHO, why we won't see open-source networks rivalling the quality of closed-source ones any time soon. if the source is open, someone somewhere will "pull a donkeybot" on the network, bringing it to a screeching halt.

- jaan
26-05-02, 01:16 PM   #10
hda12
Registered User
Join Date: Apr 2002
Posts: 9

Jaan,

this is a strong argument for closed source. But that model has, imho, more disadvantages. If a court orders Kazaa to shut down the network, nothing can stop the company from doing so.

But open-source protocol networks like Gnutella can't be stopped by a court order.
27-05-02, 03:04 AM   #11
colinmacc
Registered User
Join Date: Mar 2002
Posts: 45
sabotage?

...but wouldn't open-source networks be more exposed and vulnerable to sabotage if the RIAA starts going down that route (unofficially, of course...)?

I can just see the RIAABots on the march now clogging up all the open networks with their virtual cyberlawsuits...
27-05-02, 05:30 PM   #12
Wanker
Registered User
Join Date: May 2002
Posts: 9

Good points about closed source, but there are several things to consider.
It is true that it is harder to write bots and other hacks for a closed-source system, but it is not impossible. The eDonkey bot is one example, Morpheus-X another. And it is always possible to hack a p2p client. The security you get by closing the source is pseudo-security: it is still possible to do all the same things that can be done with open source, it just takes a bit more time.
Another point is that Gnutella is open source and there is nothing majorly wrong with that system.
colinmacc's argument is a good one too: nobody owns Gnutella, which means the RIAA has nobody to sue.

Of course I understand that Kazaa won't open their source, as it's harder to make money on an open-source product. Ad displaying would be the first thing to get removed from an open-source Kazaa.
29-05-02, 09:42 AM   #13
jaan
Registered User
Join Date: May 2002
Posts: 39

Quote:
Originally posted by Wanker
The security you get by closing the source is pseudo-security: it is still possible to do all the same things that can be done with open source, it just takes a bit more time.
well, time is all you need, really. if it takes a lot of time to reverse-engineer the closed source product and distribute the hack, then the hack won't get very far, because the original product can be updated to block out the hack.

Quote:
Another point is that Gnutella is open source and there is nothing majorly wrong with that system.
true, gnutella is an example of a very robust network that is hard to sabotage (because it is effectively composed of fairly independent subnets). however, every performance enhancement (such as ultrapeers) will decrease its robustness factor. hence my point that it is difficult to rival closed source nets performance-wise.

- jaan
29-05-02, 03:17 PM   #14
SA_Dave
Guardian of the Maturation Chamber
Join Date: May 2002
Location: Unimatrix Zero, Area 25
Posts: 462
Regarding original suggestion...

I like this idea, but it needs improvement! It would be convenient to schedule unattended searches. Time-zone differences play a large role in the availability of certain files. However, even with filters enabled, the results would be overwhelming in some cases. How are you going to avoid the 'the file you selected is currently downloading...' message, as well as poorly-renamed duplicates and junk files? It should be up to each user as to what is queued, so I'd prefer that a database of UUHash data (little disk space needed) were maintained locally, so that startfiles of specified results could be created later.

I don't believe that too much slowdown of the network would occur, as Wanker suggested. The FT network was designed to accommodate this: more sources are constantly being searched for as you download, and this is why the use of MorpheusX hasn't caused total bogdown (as was the case with eDonkey and the bot). Especially now that Kazaa 1.7's 'search more' feature has been introduced, even if you do 10 searches a day, you may in reality be doing 50-1000 searches in total (depending on the size of the files, availability, bandwidth etc.). I don't believe Kazaa would've implemented this if it would impact network performance (not that I'm using that version anyway).

The search function of FT clients is fundamentally FLAWED IMO. This proposed bot could alleviate some current issues, but ideally the entire experience needs to be altered. Here are my desired improvements:

1. Improved Options For Customisable Searches: I want to be able to create new categories or add file-types to previously defined categories. For example, I could add .rm to the video search. This feature would be particularly useful for rarer file-types. It's really frustrating when you get no results for a specific search, but get 100+ irrelevant/offensive results in a general search, even with filters enabled! I hate then having to add 20 or more keywords to be filtered in order to even catch a glimpse of a relevant result.
2. Improved Search Engine: The current search engine is far too limiting! It only returns results for exact matches, with the exception of '_' which represents a space or underscore (but this is logical!). Boolean searches would be nice. It would also be nice if some sort of relevancy algorithm were used (like on internet search engines) to account for misspellings etc. Wildcard characters shouldn't necessarily be supported, as this could increase search traffic & CPU use of supernodes. Another nice feature would be auto-sorting of search results by category, much like the ability to categorise files in many download managers.
3. Ability To Queue Offline Files: This feature would make me go wild. I started using Napster in September '99 and ditched that buggy, overrated piece of bloatware in early December. Why? I found an obscure link on Lycos to a site that changed my life! In case you didn't know, AG is a great place for rare music, because it maintains a db of all files shared in a 30-day period. This eliminates the need for repetitive, all-day/all-night searches and greatly decreases search traffic. The only disadvantage is that queues are server-based and self-refreshing every 30 days. It would be great if supernodes could maintain a quasi-database of all files shared by the peers that connected to them in a certain timespan. How could this be done? The SN could maintain a db of sigs, each requiring about 1 KB of disk space. That's about 1000 sigs per MB, so superpeers with little disk space would need a greater refresh rate or the ability to disable this feature. However, each SN has a maximum number of peers, which are also limited by geographical location. This limits the db size, and as hard drives get bigger and cheaper it should be easier to implement. An added benefit is that since each peer connects to 2 or more SNs, this provides a rudimentary RAID structure for the database if a SN goes offline or doesn't have enough hdd space to maintain an accurate db. This database might not be as readable as AG, but as hosting websites via p2p becomes more common it might be possible for the supernode/s to serve a site for a web-style search. The 'queued' files are peer-based, not server/SN-based, in this model. The search results should preferably display different icons for offline/online files. (A sketch of such a db follows this list.)
4. Improvements To Interface: Like in iMesh, it would be nice to switch to search or traffic views while offline. I like to see what's downloaded overnight and I sometimes do a search before bedtime, to queue at my leisure in the morning. It would save bandwidth if I could preview files while offline from within Kazaa and perhaps cancel transfers without opening each .dat file in my favourite hex editor & deleting them manually.
5. Increased Incentives To Share: The 'disable sharing' option has to go! The option is too easy for even newbies to find, and people use it because it's there! Again using Audiogalaxy as an example, you cannot download more than 1 file at a time unless you're sharing at least 25. Also, many of its users don't know how to use Explorer or even what .mp3 files are! Making the program easy to use for newbies (i.e. media capabilities, having to share to download a lot etc.), while transparently sharing as much as possible, is how file-sharing should be. Perhaps a more intelligent bandwidth control is needed, to encourage those with caps to share. See also the next point.
6. Prioritisation Of Transfers: This might be difficult to implement & would probably have to be written to the .dat file itself (like the "paused" status). The reasons are clear: you want the rare files to be downloaded ASAP (d/l priority), but the uploader might also have spent ages getting the file. The uploader should have the choice to give queue priority to the person who's sharing (u/l priority). If some sort of db were maintained (point 3), you might be able to prioritise based on quality, rather than quantity, of shares, according to your own criteria. This is related to the above point.
7. Giving "SuperPeers" More Control: The ability to deny users sharing specific content, send messages to users about viruses for example, maintain themed db's (point 3) or user bases in Direct Connect fashion etc. This would allow SNs, not a parent company, to control the network while maintaining the ease-of-use and transparent features for standard peers.
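A rough sketch of the per-supernode signature database from point 3 might look like the code below. The field names, the 1 KB-per-entry budget and the 30-day window are just taken from the post; none of it is anything a FastTrack client actually implements today.

[code]
import time
from dataclasses import dataclass

ENTRY_BYTES = 1024            # ~1 KB per signature -> roughly 1000 entries per MB
REFRESH_SECONDS = 30 * 86400  # AudioGalaxy-style 30-day window

@dataclass
class SharedFile:
    sig: str        # UUHash-style content signature
    name: str
    size: int
    peer_id: str    # which (possibly offline) peer shares it
    last_seen: float

class SignatureDB:
    """Quasi-database a supernode could keep of everything its peers have shared."""

    def __init__(self, disk_budget_mb):
        self.max_entries = disk_budget_mb * 1024 * 1024 // ENTRY_BYTES
        self.entries = {}   # sig -> SharedFile

    def add(self, f):
        self.expire()
        if f.sig in self.entries or len(self.entries) < self.max_entries:
            self.entries[f.sig] = f

    def expire(self):
        cutoff = time.time() - REFRESH_SECONDS
        self.entries = {s: f for s, f in self.entries.items() if f.last_seen >= cutoff}

    def search(self, keyword):
        """Return matches even if the owning peer is currently offline."""
        kw = keyword.lower()
        return [f for f in self.entries.values() if kw in f.name.lower()]
[/code]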

Sorry for the long post, but these are the features that I believe would make a FT (or OpenFT) client almost perfect.
29-05-02, 06:09 PM   #15
twinspan
- a rascal -
Join Date: Mar 2002
Location: for security reasons, never the same as the President's
Posts: 759

Quote:
Giving "SuperPeers" More Control : The ability to deny users sharing specific content, send messages to users about viruses for example, maintain themed db's (point 3) or user bases in Direct Connect fashion etc. This would allow SNs, not a parent company, to control the network while maintaining the ease-of-use and transparent features for standard peers.
Tin-pot dictators on FastTrack? Noooooooooooo!

All the other points are fine, but the last would be the thin end of the wedge. I like FT coz it behaves (to the end user) like logging onto a single network, not like begging for admittance to a hodge-podge of fiefdoms.
__________________
Your prompt response is requested.

Respectfully,

Mark Weaver,
Director of Enforcement
MediaForce, Inc.
(212) 925-9997
29-05-02, 07:44 PM   #16
JackSpratts
Join Date: May 2001
Location: New England
Posts: 10,018

hi SA_Dave and welcome to nu! you may have just won the award for "Most Smilies In a First Post". no doubt you'll be hearing from nu netizen dawn.

- js.
31-05-02, 02:40 PM   #17
SA_Dave
Guardian of the Maturation Chamber
Join Date: May 2002
Location: Unimatrix Zero, Area 25
Posts: 462

I may not have posted what I meant in point 7! I shouldn't have used the word "deny". What I meant was that SNs should be able to decide what files are maintained in their databases. For example, I could 'host' anime episodes, pix & mp3's. If a standard peer wanted software, I would redirect them to another SN with full access to that content. However, if a user only wanted anime-themed content, I'd only search my peers & any other SNs that have similar content. This wouldn't mean that any peer is banned as such, & in this way it might improve network performance if you share a diverse collection of files (you connect to even more SNs), as well as limiting the search traffic of those who want specific content. It might also provide a filtering mechanism before the search results are returned. I don't think the Direct Connect style of private, invitation-only, or strict-requirement groups will work. These "groups" would remain public, but it should be done in a transparent way. If my db idea were implemented, a SN could also link to specific files hosted by other SNs without content preferences, thus limiting search results to specific files only. This would be extremely difficult to do, I know, but it would be a nice feature.
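In sketch form it could work something like the code below. The categories, the redirect reply and the neighbour list are all invented for illustration; FastTrack has no such mechanism today.

[code]
class ThemedSupernode:
    """Supernode that only indexes the content themes it has chosen to host."""

    def __init__(self, name, categories, neighbours):
        self.name = name
        self.categories = set(categories)   # e.g. {"anime", "pix", "mp3"}
        self.neighbours = neighbours        # other supernodes this SN knows about
        self.local_index = []               # (category, filename, peer_id) tuples

    def handle_search(self, keyword, category):
        if category not in self.categories:
            # Not our theme: hand the peer over to a supernode that does host it.
            for sn in self.neighbours:
                if category in sn.categories:
                    return ("redirect", sn.name)
            return ("no_route", None)
        # Our theme: search our own peers plus similarly themed neighbours.
        hits = [f for c, f, p in self.local_index if c == category and keyword in f]
        for sn in self.neighbours:
            if category in sn.categories:
                hits += [f for c, f, p in sn.local_index if c == category and keyword in f]
        return ("results", hits)
[/code]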

Another limitation of the current interface is that you can't highlight multiple files (shift + ctrl functionality like in Explorer?) in the traffic view. It can be tedious to cancel a multitude of files which were queued with only one click!

I apologise for the smilies. I just got carried away I suppose and didn't realise this was a problem?!?
31-05-02, 04:04 PM   #18
twinspan
- a rascal -
Join Date: Mar 2002
Location: for security reasons, never the same as the President's
Posts: 759

ahhhh, that's what you meant, Dave.

<twinspan wipes remaining foam from around his mouth at the mere suspicion of 'ops' on his beloved FT>


and about the smilies, don't worry. heheh, first post, we've all been there (well I have, at least). But you'll have a job trying to usurp Dawn from her Throne of Smileyness.
31-05-02, 07:18 PM   #19
TankGirl
Madame Comrade
Join Date: May 2000
Location: Area 25
Posts: 5,587

Quote:
Originally posted by SA_Dave
I shouldn't have used the word "deny". What I meant was that SNs should be able to decide what files are maintained in their databases. [...]
Hi SA_Dave and welcome onboard - enjoy your stay at Napsterites!

An excellent post and a pregnant idea which resonates well with my own thoughts on developing the connectivity and scalability of p2p networks. The ideas of supernode architecture and multisourcing have taken p2p to a certain point, but this point is far from satisfactory both content-wise and socially. Instead of providing us full connectivity and full search visibility among millions of concurrent users, all of today's decentralized networks open up just a very limited part of the peer space for our social presence and searches. My gut feeling is that I see and am seen by perhaps 10,000 other peers on WPN, and my last experiences of FastTrack were rather similar. Instead of having the present 1% visibility in a million-strong p2p community we would really like to have 100% visibility: to have working hotlists and communication channels to all our friends (like the instant messengers have) and to reach with our searches - even if slowly - the entire shared content of the network. Considering how fast the p2p communities have been growing, the design target for the next-generation 'decentralized Napster' should really be around 10 million connected peers. That's perhaps three orders of magnitude from where we are now, so there is plenty of room for architectural innovation.

Your example demonstrates one way to let the content of the network shape its connection topology, moving it away from the initial 'random' distribution of supernodes and the peers subject to them. The generic goal of getting specific content, and the people who like it, to gravitate closer together in the network topology can also be achieved in a more autonomic, self-organizing fashion. The supernodes can use the successes and failures of long-distance (supernode-to-supernode) searches and the related downloads as a guide in reorganizing their respective peer populations. It makes a lot of sense for a reggae lover to find his/her way gradually closer to other reggae lovers - under the same supernode or into the same group of effectively clustered supernodes capable of linking the interest/genre group together. The beauty of the self-organizing scheme is that it would not be bound to any predefined genres or themes or fixed supernode hosts. It would rather cluster any peers who like what they find in each other's libraries, whatever name they want to give to their common interest. And as a peer, all you would have to do to get closer to better and more interesting peer sources would be to keep searching, sharing and downloading the stuff that you personally like most...

Very few of us are limited to just a single genre interest, and some of us are very eclectic. A content-shaped topology would probably work best with a more multidimensional supernode architecture than the present exclusive one. Rather than being connected to a single supernode in a single content cluster, you would probably want to be connected to a number of supernodes, each serving and representing a particular genre/community interest. The subjective experience of this would perhaps be like having a house in cyberspace with a number of doors opening onto very different streets, with different local communities and content pools on each of them.
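A toy version of that self-organizing scheme could look like the sketch below: a supernode counts which remote supernodes keep satisfying a peer's searches and suggests migrating the peer once a remote cluster clearly serves it better than the local one. The counters and the migration threshold are invented purely to illustrate the idea.

[code]
from collections import defaultdict

class SelfOrganisingSupernode:
    MIGRATE_RATIO = 3.0   # remote hits must beat local hits by this factor

    def __init__(self, name):
        self.name = name
        # hits[peer][supernode_name] -> number of successful downloads from there
        self.hits = defaultdict(lambda: defaultdict(int))

    def record_download(self, peer, source_supernode):
        self.hits[peer][source_supernode] += 1

    def migration_hint(self, peer):
        """Return the supernode this peer should gravitate towards, if any."""
        counts = self.hits[peer]
        local = counts.get(self.name, 0)
        remote_sn, remote = max(
            ((sn, n) for sn, n in counts.items() if sn != self.name),
            key=lambda item: item[1],
            default=(None, 0),
        )
        if remote_sn is not None and remote >= self.MIGRATE_RATIO * max(local, 1):
            return remote_sn
        return None
[/code]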

Quote:
Originally posted by SA_Dave
I apologise for the smilies. I just got carried away I suppose and didn't realise this was a problem?!?
Do not apologize! Your first post was brilliant both content & smiliewise!

- tg
31-05-02, 11:27 PM   #20
JackSpratts
Join Date: May 2001
Location: New England
Posts: 10,018

the smilies were a riot. loved 'em dave!

- js.