P2P-Zone  

Old 19-07-04, 02:48 PM   #21
SA_Dave
Guardian of the Maturation Chamber
 
 
Join Date: May 2002
Location: Unimatrix Zero, Area 25
Posts: 462

Quote:
Originally Posted by shepdog
Mazer, due to the types of licenses used in open source, such as the GPL, MIT and several others, basing a company around it would be very difficult.
This is a common misconception. Microsoft legally ripped a lot of code from the BSDs without paying or distributing source code. You might want to consider the BSD license yourself. Basically it allows you to do whatever you want with the code, as long as you give credit where credit is due.

From Selling Free Software:
Quote:
Except for one special situation, the GNU General Public License (GNU GPL) has no requirements about how much you can charge for distributing a copy of free software. You can charge nothing, a penny, a dollar, or a billion dollars. It's up to you, and the marketplace, so don't complain to us if nobody wants to pay a billion dollars for a copy.

The one exception is in the case where binaries are distributed without the corresponding complete source code. Those who do this are required by the GNU GPL to provide source code on subsequent request. Without a limit on the fee for the source code, they would be able to set a fee too large for anyone to pay--such as a billion dollars--and thus pretend to release source code while in truth concealing it. So in this case we have to limit the fee for source, to ensure the user's freedom. In ordinary situations, however, there is no such justification for limiting distribution fees, so we do not limit them.

Sometimes companies whose activities cross the line of what the GNU GPL permits plead for permission, saying that they "won't charge money for the GNU software" or such like. They don't get anywhere this way. Free software is about freedom, and enforcing the GPL is defending freedom. When we defend users' freedom, we are not distracted by side issues such as how much of a distribution fee is charged. Freedom is the issue, the whole issue, and the only issue.
The GPL is not as restrictive as all the FUD-mongers and MS Fanboys would like you to believe.
Old 19-07-04, 04:18 PM   #22
Dawn
R.I.P napho 1-31-16
 
 
Join Date: Dec 2000
Location: Venus
Posts: 16,723

I screwed up my welcome post, shepdog. Hi anyway.
__________________
I love you napho and I will weep forever..........
Old 19-07-04, 07:10 PM   #23
TankGirl
Madame Comrade
 
 
Join Date: May 2000
Location: Area 25
Posts: 5,587

Hi and welcome to P2P-Zone, shepdog! It is always a pleasure to see new developers joining the p2p movement!

As for the title of your thread, you have come to just the right place to ask your questions – this is an active and savvy p2p community with plenty of collective experience and insight.

Some comments on the discussion so far:

Quote:
Originally Posted by shepdog
The new protocol I am developing will have sophisticated searches: the ability to search on ID3v1 and v2 tags, plus some data matching of the actual content of the file. What I mean by this is that I've seen MP3s tagged totally differently but the content of the file be exactly the same. Same bit rate, same length, same content. A lot of applications fail to recognize these as the same files. I want to.
Hashing is the obvious solution here. Hash trees are definitely worth considering as they allow you to verify files in small chunks, and the verification does not depend on the order in which the chunks are received. The latter property becomes important in a multisourced environment where you may request things in a certain order but receive them in a totally different order.
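
To make the idea concrete, here is a minimal Python sketch of a hash tree. SHA-256 and the chunk size are stand-ins, not anything shepdog's protocol has committed to:

Code:
import hashlib

CHUNK_SIZE = 256 * 1024  # assumed chunk size

def h(data):
    return hashlib.sha256(data).digest()

def leaf_hashes(blob):
    # One hash per fixed-size chunk of the file.
    chunks = [blob[i:i + CHUNK_SIZE] for i in range(0, len(blob), CHUNK_SIZE)]
    return [h(c) for c in chunks]

def root_hash(leaves):
    # Pair hashes level by level until a single root remains.
    level = list(leaves)
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])  # duplicate the odd node out
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def chunk_ok(index, data, leaves):
    # Chunks verify independently, in whatever order they arrive.
    return h(data) == leaves[index]

A downloader fetches the leaf hashes once, checks them against the advertised root, and can then verify every incoming chunk on its own - which is exactly what multisource swarming needs.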

Quote:
Originally Posted by shepdog
I also want to take this a step further. You might have a file that is sampled at a given rate and you want to finish d/l'ing the file. I want to see what I can do about re-encoding on the fly to a desired bitrate. What sort of advanced search features are you speaking of? I am wide open to suggestions.
This might not be a very good idea. Mixing lower quality re-encodes in with higher quality rips on the fly will just gradually degrade the average quality of the content pool and make it harder to tell original high quality releases apart from their more or less degraded versions. It is much better to distribute high quality releases as such, untouched, and thereby maximize the number of peers having them in the multisourced environment. If a lower quality version is needed for some reason, it should be distributed similarly untouched, so that the peers having it can form another large set of sources for downloaders.

Quote:
Originally Posted by shepdog
Encryption:
The network as I have it designed is a completely anonymous and secure network. I also want the users to be able to use any encryption scheme they want.

It will come with a default encryption algo, probably Twofish, but you will have the ability to use anything you want, even custom developed routines. You will also have the ability to say what type of functionality you want encrypted: chats, searches, file listings, d/l's, etc. This will allow you to optimize the speed of the system.
I think Filetopia has used a similar approach, but for an average user such a choice is needless and technically very challenging. Few WASTE users know (or care to know) which particular encryption method is being used in the software – for them it is important to know that the method chosen by the developer is 1) secure and 2) efficient enough to do the job. It is a good idea to test different encryption methods and compare their efficiency, but maybe this testing should be done by the beta testers, and at the end you would just pick the one that both you and your beta test team consider to be the best choice.
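
If you do end up benchmarking ciphers with your beta team, the harness can be trivial. A sketch using Python's `cryptography` package; Twofish isn't available there, so AES-CTR and ChaCha20 stand in as examples:

Code:
import os, time
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def mb_per_sec(make_cipher, payload, rounds=20):
    # Encrypt the same payload repeatedly and time it.
    enc = make_cipher().encryptor()
    start = time.perf_counter()
    for _ in range(rounds):
        enc.update(payload)
    return rounds * len(payload) / (time.perf_counter() - start) / 1e6

payload = os.urandom(1 << 20)  # 1 MiB of random data
key, iv, nonce = os.urandom(32), os.urandom(16), os.urandom(16)

candidates = {
    "AES-256-CTR": lambda: Cipher(algorithms.AES(key), modes.CTR(iv)),
    "ChaCha20": lambda: Cipher(algorithms.ChaCha20(key, nonce), mode=None),
}
for name, make in candidates.items():
    print(name, round(mb_per_sec(make, payload)), "MB/s")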

It would also be good for the end user if various optimisations (for speed, for search success etc.) would be as automatic as possible. If there is a set of parameters to be optimised for me to get my stuff faster, I would like the smart software to do the optimising in the background and just deliver me the goodies with the best speed possible from the network!

Quote:
Originally Posted by SA_Dave
3. Group and community features in a decentralised environment. This includes publishing/broadcasting features as in AudioGalaxy. Permanent identities (public keys) linked to rewards and improved social standing. The ability to use any web browser to browse through verified hash links hosted on the p2p network itself would be nice.
Quote:
Originally Posted by shepdog
Permanent identities. I have been thinking about how to do this for a while and I think that I have it figured out.
Here is an interesting earlier discussion on permanent identities and related stuff like trust relations. Many important points and technical details are covered in that discussion, so it is worth a look for a developer. As Dave points out, permanent and verifiable identities are needed to build any sort of sustained peer relations and social structures. When you have permanent, cryptographically strong identities, you can start building 1-to-1 trust relations, trusted groups and trusted distribution networks on them. Artists and other sources of new content need permanent identities even more than average users, to be able to establish a reliable and genuine presence on the network.
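
For the flavour of it, a minimal sketch of such an identity in Python with the `cryptography` package; Ed25519 is my choice for the example, not something mandated by any of the designs discussed here:

Code:
import hashlib
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

identity_key = Ed25519PrivateKey.generate()
public_raw = identity_key.public_key().public_bytes(
    serialization.Encoding.Raw, serialization.PublicFormat.Raw)

# The stable ID peers address you by: a digest of the public key.
node_id = hashlib.sha256(public_raw).hexdigest()

# Anything signed with the private key provably came from that identity,
# which is the building block for trust relations and trusted groups.
message = b"I vouch for this release"
signature = identity_key.sign(message)
identity_key.public_key().verify(signature, message)  # raises if forged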

- tg
Old 19-07-04, 07:32 PM   #24
Mazer
Earthbound misfit
 
 
Join Date: May 2001
Location: Moses Lake, Washington
Posts: 2,563

Now be quiet, you're scaring people, Maze.
Quote:
Originally Posted by shepdog
As for the Maze -> Mazer conversation, I don't know what to say except: okay.
Don't mind Maze, he's been having a minor identity crisis since I joined the old Nappy forum in '00.

I think Dave's right here. Just look at RedHat: they make the most widely recognized and used Linux distribution available, and they're profitable too. Of course, to make an open source project generate revenue you need a brilliant business person to match your brilliant programming skills. But if what you're really trying to do is to promote and support emerging musicians, filmmakers, and artists, then there's no need for the revenue stream to go through your hands; users can compensate artists directly. Your software could include a PayPal link to each artist or something like that. The point is that capitalism and open software aren't necessarily opposing forces, and sometimes they're quite complementary.
Old 19-07-04, 08:02 PM   #25
Mazer
Earthbound misfit
 
 
Join Date: May 2001
Location: Moses Lake, Washington
Posts: 2,563

Quote:
Originally Posted by TankGirl
Quote:
Originally Posted by shepdog
I want to see what I can do about re-encoding on the fly to a desired bitrate.
This might not be a very good idea. Mixing lower quality re-encodes in with higher quality rips on the fly will just gradually degrade the average quality of the content pool and make it harder to tell original high quality releases apart from their more or less degraded versions. It is much better to distribute high quality releases as such, untouched, and thereby maximize the number of peers having them in the multisourced environment. If a lower quality version is needed for some reason, it should be distributed similarly untouched, so that the peers having it can form another large set of sources for downloaders.
This doesn't have to be the case if Ogg Vorbis files are used. Vorbis supports a feature called bitrate peeling: basically you encode a high bitrate file once, and any lower bitrate file can then be peeled out of it without causing transcoding artifacts. Ogg also has a very flexible tagging system, and if the content creator so chose, he could include information in the tags that points to the source of the original file along with its hash. This way, if a user came across a low bitrate file and decided he wanted the high quality version, he could easily find the right file to download. These are minor points in a world dominated by MP3, but if the p2p client supported Ogg peeling and Ogg tag searching, it would give artists that much more control over the way their art gets distributed.
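
As a sketch of the tagging half, here is how a client could read and write such pointers with the Python `mutagen` library; the field names ORIGINAL_HASH and ORIGINAL_HINT are invented for illustration, since Vorbis comments accept arbitrary fields:

Code:
from mutagen.oggvorbis import OggVorbis

# Writing: stamp a low-bitrate copy with a pointer to its source.
audio = OggVorbis("track_96kbps.ogg")
audio["ORIGINAL_HASH"] = ["sha1:2fd4e1c67a2d28fced849ee1bb76e7391b93eb12"]
audio["ORIGINAL_HINT"] = ["search hint or link for the full-quality release"]
audio.save()

# Reading: a client that finds the low-bitrate copy can resolve the source.
tags = OggVorbis("track_96kbps.ogg")
print(tags.get("ORIGINAL_HASH"))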
Old 20-07-04, 08:48 AM   #26
shepdog
Registered User
 
 
Join Date: Jul 2004
Location: St. Louis, Missouri
Posts: 5

Quote:
Originally Posted by TankGirl
Hi and welcome to P2P-Zone, shepdog! It is always a pleasure to see new developers joining the p2p movement!
Thanks TG (can I call you TG? lol). The p2p space is such an interesting problem domain, and one that I feel is the most wide open when it comes to opportunities for new development. I've been drawn to it like a moth to a flame.

Quote:
Originally Posted by TankGirl
As for the title of your thread, you have come to just the right place to ask your questions – this is an active and savvy p2p community with plenty of collective experience and insight.
This is becoming more and more apparent the longer this thread gets and the more I delve into earlier posts. You've had some especially thought-provoking threads.

Quote:
Originally Posted by TankGirl
Hashing is the obvious solution here. Hash trees are definitely worth considering as they allow you to verify files in small chunks, and the verification does not depend on the order in which the chunks are received. The latter property becomes important in a multisourced environment where you may request things in a certain order but receive them in a totally different order.
Wow, what a great abstract. This is just what I was searching for. Thank you.


Quote:
Originally Posted by TankGirl
This might not be a very good idea. Mixing lower quality re-encodes in with higher quality rips on the fly will just gradually degrade the average quality of the content pool and make it harder to tell original high quality releases apart from their more or less degraded versions. It is much better to distribute high quality releases as such, untouched, and thereby maximize the number of peers having them in the multisourced environment. If a lower quality version is needed for some reason, it should be distributed similarly untouched, so that the peers having it can form another large set of sources for downloaders.
I understand what you are saying about the degradation of quality from re-encoding. My motivation behind this was to provide more sources for slower connections to d/l files from. If someone only wants to d/l at 128kbps and all that is available is 320kbps or VBR, I could see the possibility of d/l'ing at a lower rate. So when you see 15 sources for a VBR or 320kbps file but it would take you 4 hours to d/l it, it would be nice to have the ability to right-click on the source and have it d/l at a desired bitrate. Of course these re-encoded files could be cached and tagged to point back to the original.
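
For what it's worth, the cache entry for such a re-encoded copy could be as simple as this Python sketch; the field names and hash values are purely illustrative:

Code:
from dataclasses import dataclass

@dataclass(frozen=True)
class DerivedFile:
    original_hash: str   # root hash of the untouched release
    derived_hash: str    # hash of this re-encoded copy
    bitrate_kbps: int    # target bitrate requested by the downloader
    encoder: str         # exact encoder settings, for reproducibility

record = DerivedFile(
    original_hash="sha1:2fd4e1c67a2d28fced849ee1bb76e7391b93eb12",
    derived_hash="sha1:9c0e0ffdfa6c5a1f0a4bdbd3a2e7c9c1d2b3a4f5",
    bitrate_kbps=128,
    encoder="lame 3.96 --abr 128",
)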

Quote:
Originally Posted by TankGirl
I think Filetopia has used a similar approach, but for an average user such a choice is needless and technically very challenging. Few WASTE users know (or care to know) which particular encryption method is being used in the software – for them it is important to know that the method chosen by the developer is 1) secure and 2) efficient enough to do the job. It is a good idea to test different encryption methods and compare their efficiency, but maybe this testing should be done by the beta testers, and at the end you would just pick the one that both you and your beta test team consider to be the best choice.
I totally agree that simplicity is the best approach for the average user, but for power users and all the tinkerers it would be nice to give them the ability to adjust settings. From the programming API's perspective this will be easily controlled, maybe just not from the user interface?

Quote:
Originally Posted by TankGirl
It would also be good for the end user if various optimisations (for speed, for search success etc.) would be as automatic as possible. If there is a set of parameters to be optimised for me to get my stuff faster, I would like the smart software to do the optimising in the background and just deliver me the goodies with the best speed possible from the network!
Totally agree here as well. The system will try to optimize all the time, and learn better optimizations over time. Read between the lines there.



Quote:
Originally Posted by TankGirl
Here is an interesting earlier discussion on permanent identities and related stuff like trust relations. Many important points and technical details are covered in that discussion, so it is worth a look for a developer. As Dave points out, permanent and verifiable identities are needed to build any sort of sustained peer relations and social structures. When you have permanent, cryptographically strong identities, you can start building 1-to-1 trust relations, trusted groups and trusted distribution networks on them. Artists and other sources of new content need permanent identities even more than average users, to be able to establish a reliable and genuine presence on the network.
I totally agree with your assessment. When it comes to community, you would like to have permanent identities based upon a hash, and a form of 'dns' (for lack of a better term) to locate and communicate with a given identity when it is online.

In looking at giving content providers tools to set up channels for offering their content, as well as letting users reward content providers with money, a thought occurred to me. It would be nice for a content provider to be able to identify the user that gave money and to target that user in the future because they donated in the past. The problem with this is that if I tie, say, a PayPal account to a hash for this purpose, it negates the anonymous aspect of the single hash used as identity: if that user ever provided a file encrypted with that hash, I could tie it right back to them.

I was thinking of a way around this: provide an identity public key that the user would be known by, but also a separate public encryption key that would be used for all encryption-related transmissions to/from the user. This would allow a separation between the public identity of the user and their traffic, letting them contribute to the things they think are worthwhile and be rewarded by that content provider, while remaining anon when they don't want to contribute to a content provider but still want to snag content. Just a thought I was kicking around in my head on the way into work this morning. Like I said earlier, I have a hard time thinking about anything else.
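
A rough Python sketch of that two-key split, using the `cryptography` package; the key types (Ed25519 for identity, X25519 for encryption) are my assumptions, not part of shepdog's spec:

Code:
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

identity_key = Ed25519PrivateKey.generate()    # published: donations, reputation
encryption_key = X25519PrivateKey.generate()   # used only for traffic

# When the user *chooses* to link the two (e.g. to be rewarded), the
# identity key signs the encryption key; otherwise traffic stays unlinkable.
enc_pub = encryption_key.public_key().public_bytes(
    serialization.Encoding.Raw, serialization.PublicFormat.Raw)
binding = identity_key.sign(enc_pub)

# A peer needs only the encryption key to derive a shared session secret;
# nothing in the exchange reveals the identity key.
peer = X25519PrivateKey.generate()
shared_secret = peer.exchange(encryption_key.public_key())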

Thanks for your feedback and your previous postings. Very good information.

One last question to the board in general. Do any of you think you would be interested in beta testing a new p2p app around the end of the year? I've got to start thinking about it.

Thanks to all of you for all your feedback.

John
Old 20-07-04, 03:06 PM   #27
TankGirl
Madame Comrade
 
 
Join Date: May 2000
Location: Area 25
Posts: 5,587

Quote:
Originally Posted by shepdog
It would be nice for a content provider to be able to identify the user that gave money and to target that user in the future because they donated in the past. The problem with this is that if I tie, say, a PayPal account to a hash for this purpose, it negates the anonymous aspect of the single hash used as identity: if that user ever provided a file encrypted with that hash, I could tie it right back to them. I was thinking of a way around this: provide an identity public key that the user would be known by, but also a separate public encryption key that would be used for all encryption-related transmissions to/from the user. This would allow a separation between the public identity of the user and their traffic, letting them contribute to the things they think are worthwhile and be rewarded by that content provider, while remaining anon when they don't want to contribute to a content provider but still want to snag content.
Right. There is no problem with peers having multiple identities for different purposes. In your case, as the low-level network topology is a lattice, you might want to use temporary, disposable identities for the nodes in their role as lattice members. These would be purely technical identities without history, merits or credits - just to allow the peers to run protected sessions with each other and to refer to each other (in peer discovery, lattice reconfiguration etc.) in a reliable way. In other, more social roles the peers could use more permanent identities with a history and associated merits and credits.

Instead of having a single permanent identity for all your social interactions, it might be safer to have a set of permanent identities, one for each established social relation (peer contact, group membership etc.). This way, if any of those identities gets lost, compromised or troubled, you would not lose your entire social position and credentials, just the compromised part. This sort of arrangement would also help to make the peer discovery process more secure: when searching for a given peer, you would search for an identity that is unknown or irrelevant to most other peers.
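
One way to manage such a set without juggling dozens of backups, sketched in Python: derive each relation's key from a single master seed, so a leaked per-relation key exposes only that one relationship. The labels and the KDF choice are assumptions, not anything from the thread:

Code:
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

master_seed = os.urandom(32)  # backed up once, kept offline

def relation_identity(label):
    # HKDF is one-way: a derived key betrays neither the seed nor siblings.
    seed = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=label.encode()).derive(master_seed)
    return Ed25519PrivateKey.from_private_bytes(seed)

key_for_contact = relation_identity("contact:SA_Dave")
key_for_group = relation_identity("group:beta-testers")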

- tg
Old 20-07-04, 04:36 PM   #28
Mazer
Earthbound misfit
 
 
Join Date: May 2001
Location: Moses Lake, Washington
Posts: 2,563

Quote:
Originally Posted by shepdog
One last question to the board in general. Do any of you think you would be interested in beta testing a new p2p app around the end of the year? I've got to start thinking about it.
I think I can speak for everyone when I say you have dozens of eager participants ready and waiting for your prototype. You're in the p2p zone now.
Old 22-07-04, 03:25 PM   #29
SA_Dave
Guardian of the Maturation Chamber
 
 
Join Date: May 2002
Location: Unimatrix Zero, Area 25
Posts: 462

Quote:
Originally Posted by TankGirl
Instead of having a single permanent identity for all your social interactions, it might be safer to have a set of permanent identities, one for each established social relation (peer contact, group membership etc.). This way, if any of those identities gets lost, compromised or troubled, you would not lose your entire social position and credentials, just the compromised part. This sort of arrangement would also help to make the peer discovery process more secure: when searching for a given peer, you would search for an identity that is unknown or irrelevant to most other peers.
Your post has been assimilated... Don't give away too many trade secrets.