News

P2P Developers Stand Up To Intel 46

Simone submits this article about two different visions of peer-to-peer computing - one from Tim O'Reilly, and one from the Intel corporation. (O'Reilly expounds further in his column). Watch and cheer from the sidelines as the mega-corps jockey to control the buzzword standards process, turn it into a useless mush, and are surpassed by protocols that work.
  • by Anonymous Coward
    Offtopic but mildly interesting... Notice that they say:
    Developers said they preferred an open discussion like those conducted on Slashdot.

    Does anyone see something slightly odd about the fact that the zdnet author goes to the bother of giving you the stock symbol for Microsoft when they offhandedly reference MS (just in case you don't know exactly who they are!) but assumes blindly everyone who reads the article would know what slashdot is? Don't even give the address or an <A HREF>...

    Either this is an oversight, they thought an explanation would be unwieldy, or zdnet is displaying an amazing bit of self-awareness of the fact that the only time anyone reads their site is when slashdot links 'em. Hmm.

    Please do not moderate this post up past score:0.

  • by Anonymous Coward
    I mean, what exactly are Napster and Gnutella that makes them something new and revolutionary?

    It lets ordinary users participate as part of the internet, instead of simply viewing it, in a way that is simple enough for the average user to understand.

    The computing power, disk storage and bandwidth wasted by the current model - computers as read-only views of the internet - can be transformed into part of the internet, where those resources can be harnessed by others.

    If the Internet evolves to the point where everyone can be used to replicate and process information, software vendors won't have to worry about getting huge amounts of bandwidth to do software distribution, researchers won't have to worry about buying time on the local supercomputer to do valuable research, and maybe we can finally stop polluting the planet creating needless amounts of computer hardware which become obsolete every six months.

    Napster didn't create distributed computing, it simply provided a way to do distributed file sharing and brought that technology to the masses. If Napster is allowed to evolve past a simple file sharing application, maybe the dream of a single global computer will become a reality.
  • Having been present at IDFs before: Intel used this control methodology with I2O (the stillborn I/O processor scheme) and is currently using it with the Infiniband development. This will not work for software technology!
  • Napster is a software program. It uses a company as a host for purposes of procreation. It uses the host to secure nutrients (i.e., cashflow) for itself and its progeny. But don't mistake Napster the software for Napster the company (host). Napster is a success whether or not the host body lives or dies.
  • Take a look back now and then to be sure it's still there.
  • This P2P stuff is the devil's work! Behold!

    vexorg has quit (Read error: 54 (Connection reset by peer))

    Don't say I didn't warn you!
  • I think the P2P community is pretty much behind the IETF (or IETF-like) approach to this work. It is great to see a group of "Geeks" stand up to the corporations trying to take over everything involving the Internet.

    Chuck Wegrzyn
  • Get what? That the first meeting of the P2P wg was more like a show of "power" by a few corporations that want to take control and direct the P2P community? That there was a pretty universal agreement that we wanted an open meeting and working group? What didn't you get?

    Chuck Wegrzyn
  • ZDNet has a script they run that puts a stock symbol link after the first reference of large well-known companies. Slashdot is a piece of a piece of a not-as-well-known company, so I suspect the name-to-ticker mapping isn't in their script's database. It appears that VA Linux isn't in there either, because ZDNet stories don't create the VA Linux stock quote [zdii.com] link, even when the story is about VA Linux [zdnet.com]!

    As for the lack of a link to the site, even an AOL user will know to try adding .com, which works. Like we need more of the great unwashed around here...
  • This post is buzzword compliant.

    P2P, e*, B2B, B2C, i*, ad nauseam

    - technik
  • Yeah, I found it interesting that they include 'server to server' and 'server to PC' in their 'temporary' definition of P2P. Um, isn't that all the Internet is? Server to server: SMTP and NNTP, to name a couple? And pretty much everything else is 'server to PC'. Yeah, let's include that in our definition of P2P and just have the whole of the Internet be P2P. Sheesh! Another idea: let's patent the word P2P!

    Cheers...
  • So, we have here a definition that makes servers talking to PC's a P2P relationship. . . hmm. . . don't ftp, and http, and nfs fall into this category? Of course they do.

    This does not bode well.

    "Washington (AP): Earlier today, Al Gore, creator of the Internet, launched a lawsuit against the P2P working group for copyright infringement..."

  • I don't think Intel even knows what 'peer' means. In any context where one entity is a peer, all entities are peers. So the phrase 'peer-to-peer' is redundant.

    Also, 'P2P' has been long used in networking to mean 'point to point'. Using it to mean 'peer to peer' will be a source of unnecessary confusion.

    So let's just be concise and call this very old, well known and useful concept 'peer computing'.

  • Amen, brother. I was at the 2000 Las Vegas Interop show when they announced the formation of the ASP organization, whatever they meant by that. I've spent several hours listening to their speeches. I've never heard or seen more confused presentations; no one had any clue what it was all about, but everyone was hyped to the max. One of the organizers was Jim Metzler, a rather miserable fellow with very little knowledge but lots of ambition. Oh well, I just don't want to think of it anymore.
  • I saw in the article a mention that some of the developers said they prefer an open forum similar to that of slashdot. Now, this is all well and good, but wouldn't something like sourceforge or usenet be more appropriate?

    I don't want my peer-to-peer network client randomly requesting pictures of Natalie Portman pouring hot grits down her pants.
  • The current bottleneck with Peer-to-Peer technology is bandwidth, not computing power.

    Peer-to-Peer (I refuse to use the acronym) moves the bandwidth consumption from all those big machines connected directly to the Internet backbone to a bunch of small machines, most of them connected via puny 56K (minus noise because of lousy telephone lines) connections.
    Where before you had

    Big pipe <--> 1000 Small pipes

    you now have

    1000 Small Pipes <--> 1000 Small pipes

    an average of 1 inbound and 1 outbound connection for each, meaning half the (already small) bandwidth.

    This will eat up your bandwidth, not your computing power (in fact it will free up your computer - you will have to wait twice as long for things, meaning more free cycles to do other useful things like encoding MP3s).

    Peer-to-Peer will more likely be a boon for high speed ISPs than for Intel.
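    The back-of-the-envelope arithmetic above can be sketched in a few lines. This is a minimal sketch assuming the comment's simplified symmetric-link model (a real 56K modem link is asymmetric, and the numbers and function names are illustrative only):

```python
# Sketch of the bandwidth argument: under client-server, a small
# pipe carries one inbound stream; under peer-to-peer, the same
# pipe is split between inbound and outbound transfers.

MODEM_KBPS = 56  # illustrative capacity of one "small pipe"

def client_server_download_kbps(link_kbps):
    """Client-server: the whole link serves the single download."""
    return link_kbps

def p2p_download_kbps(link_kbps, inbound=1, outbound=1):
    """Peer-to-peer: the link is shared between inbound and
    outbound streams, so each gets a fraction of the capacity."""
    return link_kbps / (inbound + outbound)

print(client_server_download_kbps(MODEM_KBPS))  # 56
print(p2p_download_kbps(MODEM_KBPS))            # 28.0
```

    Which is exactly the "half the (already small) bandwidth" the comment describes.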

  • Tim O'Reilly stood up at the meeting and said exactly the same thing, something along the lines of 'Does anyone else find it ironic that Intel is trying to handle the formation of standards for a highly decentralized technology in such a centralized manner'... (sorry if I didn't get the quote quite right, but that's the gist of it)

    You've also hit on a key issue with P2P computing -- defining just exactly what is and what is not "P2P". A large portion of what people are trying to bill as P2P is just distributed computing rewarmed...

    I think one of the final results of the P2P rush may be that people realize that most of the "new P2P" ideas are just old DCE ideas, and that very few are actually distinctly P2P. I don't know if P2P specific apps will be overwhelmingly useful or interesting enough to create a huge market, but if nothing else, the P2P craze will create a resurgence of interest in Distributed Computing apps.
  • Peer-to-peer computing is so new that no one is even attempting to define it. P2P could be servers talking to servers, servers talking to PCs, PCs talking to PCs, or WAP phones talking to all of them.

    Oh yes, peer to peer computing is so new that you can't define it. We've never seen a peer to peer based system [tuxedo.org] before. Nope, never. [ohio-state.edu] Who would do a thing like that [peer-to-peer.com]?

    All snideness aside, we all know that peers are equal -- this is the most important piece of information for understanding a peer-to-peer system. Whether it is a social system in which everyone is potentially equal, but various peers decide who will listen harder to whom, thereby defining the balance of power, or a networking protocol in which the same thing happens, the concept of a peer [dictionary.com] is the same. We see it in video games, in packet radio systems, and in our social lives.

    No central server? Is every client a server, and every server a client? Is there no central control? Must be peer to peer. (If it looks like a duck, and walks like a duck...)
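    The "every client a server, and every server a client" test above can be illustrated with a tiny sketch. This is a hypothetical toy, not any real P2P protocol: a single process that plays both roles at once, listening for a connection while also opening one, using Python's standard socket library.

```python
# Minimal sketch of a peer that is both client and server:
# it listens on a local port (server role) and connects to
# that port (client role) in the same process.
import socket
import threading

def serve_once(listener):
    """Server half: accept one connection and echo one message."""
    conn, _ = listener.accept()
    with conn:
        conn.sendall(conn.recv(1024))

def peer_demo():
    # Server role: listen on an ephemeral local port...
    listener = socket.socket()
    listener.bind(("127.0.0.1", 0))
    listener.listen(1)
    threading.Thread(target=serve_once, args=(listener,)).start()

    # ...client role: connect out to that same port.
    port = listener.getsockname()[1]
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall(b"hello peer")
        reply = c.recv(1024)
    listener.close()
    return reply

print(peer_demo())  # b'hello peer'
```

    A network of nodes that all run something like this, with no distinguished machine, is what the duck test in the comment is pointing at.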

  • Of course we all know it's total BS (i.e. one of the main benefits of P2P is that it is virtually free... how does anyone expect to make money off that?)

    Exactly! I mean, just look at how terrible the Linux companies are doing!
    They're practically starving that poor penguin!

    *ducks and runs*
  • With all due respect, you seem to have completely missed the point.

    The more standards there are, the more trouble and inconvenience is caused. For example, look at the plethora of different standards for graphics files, many of which have more-or-less identical functionality but are, nevertheless, incompatible. Annoying, isn't it?

    If we had the same situation with P2P then you would find that, eventually, a few common standards would emerge. This would be a natural result of everybody's desire to be P2P compatible with everybody else (otherwise, what's the point?), meaning that as soon as one standard is perceived to be more popular, EVERYBODY would start using it.

    This method of standard formation leads to inefficiency because:

    1) There is a good chance that the standard(s) that everybody adopts isn't the best one. (e.g. MP3)

    2) If you start off backing the wrong standard, then you've lost a lot of money. (Did you buy Betamax?)

    3) Because of the risks involved with uncertain standards, investment in the technology will get off to a slow start.

    Point 2 is the reason why the industry is busy trying to sort out the standards NOW, before we hit these problems. IIRC, Sony and Philips agreed to adopt common standards, for pretty much these reasons, when they realised that they were developing rival versions of the Compact Disc.

    Of course, there is another reason why, in this case, the industry wants to set the standards: they won't develop the best one, they'll develop the most profitable one. IMNSHO it's best left in the hands of the geeks.
  • Unless my browser is crapping out, you might want to close the tag in that puppy because I get a blank page in my browser and that seems a likely suspect. That and the .txt suffix, unless you want to pull the HTML out or use tags. But the source reads well.
  • The TITLE tag, that is.
  • The trouble with P2P is that its very nature precludes the establishment of all-encompassing standards. Any one peer can invent a standard, and that standard is only as successful as the peer who invents it is at convincing other peers to host software that can consume or provide information via that standard.

    Put simply, in the P2P network, for better or for worse, peers are not required to subscribe to a certain set of massively popular transmission standards (e.g. HTML) or even protocols (e.g. ftp, http), but can adopt the standard or protocol of the group or moment they want to belong to. The feasibility of an institutional attempt to standardize P2P around a single protocol or standard is strongly in doubt even in a niche data domain, though there are occasionally benefits to such standardization.

  • Now unless I have been living in an alternate universe for the past year, Napster is not successful. They've spent millions of dollars obtaining lots of eyeballs, but the one thing the dot-com massacre has shown us is that eyeballs do not necessarily transform into revenue

    Why does something have to be a money machine to be considered a success? Napster is MASSIVELY successful. It makes no money. Unfortunately the US is dominated by people who think selfish greed and the pursuit of money are the only measure of success. Your comments simply make me sick. Wake up and realize you are a citizen, not a consumer; a member of a community, and not just an employee. P2P filesharing will be very successful, and will make no one money. Good.

  • Are you having a conversation with yourself? Re-read my comment. Either you are incapable of seeing another paradigm, or you Don't Get It(TM).
  • ...much of the time the best protocol -doesn't- win.

    Sometimes you by Force overwhelmed are.
  • Non-disclosure agreements would be discouraged. Companies would have to prove their own copyrights and prepare to subject their intellectual property to the rules of standards bodies.

    "This is like an unincorporated trade association," Nied said. "We want something very simple."

    "Simple" to "Corporate-Think" is a contract that can be handled in an afternoon by two lawyers who went to school together. "Simple" to "Free-Think" is "stay the hell out of my living room, and me and my buddy will write something that'll knock off your socks".

    The culture gap in what is increasingly the United Corporations of America, is almost out of the Twilight Zone by now. With something as ... well, personal as "person to person", this attitude at Intel will fly just about as far as a dead pigeon with lead weights tied to its feet.

  • that standards for peer-to-peer computing might be set in a particularly non-peer-to-peer method? Or perhaps not ironic at all, if the big corporations consider peer-to-peer a threat.

    On the other hand, no one even knows what peer-to-peer computing means, so the idea of creating standards for an amorphous concept seems a bit tenuous. Is file transfer considered "computing"? And is the DNS system already peer-to-peer computing? How about the distributed stuff, such as the Seti@home?


    --meredith
  • Why are the girls sluts and the boys not bastards- when they are both doing this p2p GF thing?
  • The P2P community has realised that it's "now or never", and is making sure it gets what it wants. They have learned their lesson well from previous experience and will not allow the biggies to get away with the whole pie this time.
  • by Anonymous Coward
    The W3C's effort was a lost cause the day Andreessen ran off to the left coast with the Mosaic code, closed it, and started saying 'mine' with regard to the WWW.

    And it didn't help when Microsoft started doing the same thing awhile later.
  • Uh hummm ...

    Tom gave a rather interesting talk at a BioInformatics Open Source Conference (same mob that's into www.bio{perl|java|xml|python}.org and generic tools [bioinformatics.org] for hacking the genome) a month ago where he discussed some of the relevance of peer-peer. The essence of peer-peer is basically a lack of centralised control (something that quite rightly annoys corporations) and dynamic reconnectivity (create new services by adapting old). Since I was there, I've scribbled down a transcript of his talk [netscape.net] which may be of some interest (caveat ... it's released under OpenContent but Tom should be given right of first proof to make sure I didn't take down his words in vain :-) so treat it as rough working notes until then).

    Basically we had the old point-point connectivity (think 1-1, e.g. ftp) in the old days, then the client-server paradigm (think 1-n, e.g. http) currently. Now we have an arbitrary n-n connection pattern where the programming style is not as clear. Different services have different patterns of usage, and new protocols/frameworks are currently being explored, like BXXP [bxxp.org]. However, the value proposition is not gated communities (aka portals) but how many other groups find your services valuable (i.e. commons). You can't churn users through a limited set of data portals (cough*hotmail*cough) and influence/restrict their movement. Remember that the basics of commerce are built upon the premise of an economic good which is excludable and rivalrous, and peer-peer tweaks that model quite seriously (it's hard to stop another peer replicating your "stuff"). This becomes a little more interesting when you're trying to search a couple of hundred terabytes of gene annotations, ESTs, microarray data, etc., as you want to combine both completeness (to maximise success) and a minimal covering set (to save costs).

    Why are the big names interested? As ever, they want new drivers of growth (notice the PC market is becoming saturated). As for the buzzword du jour crowd, well that's what a cluebat is for :-).

    LL

  • hehe, good point...
    ----
  • I think it's great. It gives the suits some justification to throw money at technology, including geeks like you & me. Of course we all know it's total BS (i.e. one of the main benefits of P2P is that it is virtually free... how does anyone expect to make money off that?). But as long as the buzzwords attract venture capital and big IPOs, who cares?
    ----
  • The reason Intel, Cisco et al. are interested is that P2P adds value to their hardware. A large portion of Intel exists just to create neat software that requires more powerful server and client hardware. So Intel may indirectly make money off P2P, just like they made money off B2C, B2B, ASP, etc. - by providing the underlying infrastructure.
    ----
  • Why does something have to be a money machine to be considered a success?

    Napster is a company which has been given millions of dollars in venture capital in the hopes that it will reap profits for its investors. If Napster fails to give the investors a return on their investment then it has failed as a business; whether lots of people use it is immaterial - just ask the investors in CDNow [yahoo.com], DrKoop.com [yahoo.com], Pets.com [yahoo.com], etc.

    Second Law of Blissful Ignorance
  • I find it interesting... the article from Business 2.0 [business2.com] I had open on my desk while reading the linked O'Reilly article runs much along this vein. Alas, since it is published in the 10/24 print issue and not yet published on the Web, I can't link to it.

    It's more interesting to me to watch the consortia-standards group fight than it is to worry about any single fight over any standard. If there's anything that watchers of IT should know, it's that if technology standards stink, we'll ignore them. You would think that businesses would eventually kowtow to standards groups, as consortia-derived standards tend to have a shorter shelf life--driving costs up as a new standard must be developed. Independently-developed standards tend to have greater market acceptance and therefore a longer shelf-life--eventually driving down marginal cost and increasing profits.

    Why the companies that are out creating consortia can't realize this is beyond me...


    --
  • Of course they should structure their committee like the W3C.

    After all, look how good a job the W3C committee does at convincing MS and Netscape of the value of standards!

    Standards work well for technical concepts where it's really in the best interest of everyone to do it the same way, and everyone realizes it.

    But for new concepts where companies already have their own buzzword-compliant concepts to make them money at the expense of the general industry, standards efforts are doomed to fail!

  • If the corporations don't all get behind a standard, the standard languishes [i.e. FireWire and Memory Stick] or dies [i.e. Betamax]. Pay-to-play is the way big corporations think. It's the wrong way to do this, and I am glad to see a few intrepid and fearless souls like Tim stand up. But... if we don't get a protocol that 3com, Intel, and The Evil Empire [Microsoft] all support wholesale, it could die on the vine.

    Don't get me wrong: a protocol just needs support, and not necessarily corporate support. Corporate support would just make the protocol available in a more widespread, fait-acomplis sort of fashion. It's ultimately users that decide protocols.
  • I don't think they do. Say it with me, now -- "PEER TO PEER". It describes a network of coequals. The people to whom this notion appeals are not going to like an oligarchic governing structure. "We love the idea of a peer-to-peer network of equals, and that's why we have to rule you like a king to build it!"
    -----------
  • by JSBiff ( 87824 ) on Saturday October 14, 2000 @06:20AM (#706588) Journal
    That the internet has been Peer-to-Peer since its inception. The definition of a peer might change, but quoth the article:

    Peer-to-peer computing is so new that no one is even attempting to define it. P2P could be servers talking to servers, servers talking to PCs, PCs talking to PCs, or WAP phones talking to all of them.

    So, we have here a definition that makes servers talking to PC's a P2P relationship. . . hmm. . . don't ftp, and http, and nfs fall into this category? Of course they do. The truth of the matter is that the suits-and-ties-and-power-lunches-and-buzzword crowd are just now discovering the nature of the internet. . . That any computer can talk to any other computer (provided it has an assigned IP). So what all this P2P "buzz" is really about is just what high-level software protocol standards we'll use to facilitate peering different machines in a given situation. E.g. how will "my" e-commerce server talk to your credit-card validation service server to know that the customer can or can't purchase from me.

    I do agree with you though that this is a lot of people spouting hot-air right now.

  • by VAXman ( 96870 ) on Saturday October 14, 2000 @08:10AM (#706589)
    I know you understand what P2P is; perhaps you don't fully realize how it differs from the old-school internet model.

    The fundamental difference is that it obsoletes the "mainframe style" of computing on which Sun thrives (i.e. selling a few high-powered servers) and instead puts a lower-powered server in everyone's home.

    See why Intel loves this?

    Intel's fundamental business plan is two-fold: to get more people to use computers, and to get more people to use more powerful computers.

    Intel is active in bridging the digital divide, making computers easier to use, and more accessible, attempting to achieve the first goal. But they are also active in trying to get people to use more powerful computers, and they are constantly developing new software technologies in order to demonstrate the need for more powerful computers. P2P is a glorious example of this.

    I think Intel is imagining a world where every internet technology is peer-to-peer, such as e-mail, file sharing, and web technologies. It is fundamentally different from the current model because it gets rid of the powerful centralized server, and requires everyone to have a server.

    Many people don't realize this, but internally, Intel considers Sun to be much more of a competitor than AMD. P2P is the one technology which has potential to make Sun irrelevant, because it makes big, centralized servers irrelevant, and replaces them with Intel desktop machines.
  • by Mike1024 ( 184871 ) on Saturday October 14, 2000 @04:33AM (#706590)
    Aha, but when will Intel be developing a peer-to-peer girlfriend [bbspot.com]?

    Michael

    ...another comment from Michael Tandy.

  • by Anonymous Coward on Saturday October 14, 2000 @04:10AM (#706591)
    Peer-to-peer computing is so new that no one is even attempting to define it.

    Or so old and bloody obvious that no-one before Intel thought that they could get away with making up a fancy name and marketing it.

  • by Hobbex ( 41473 ) on Saturday October 14, 2000 @05:17AM (#706592)
    Both Tim O'Reilly and Intel are just blowing bullshit here. There is no such thing as "P2P", at least not in this new definition of it (that is, the definition that they aren't sure of yet), and everybody except the press, the bullshitters, and the venture capitalists knows it.

    I mean, what exactly are Napster and Gnutella that makes them something new and revolutionary?

    They are simply search engines (and rather bad ones compared to say Google) that can handle a high volatility of the hosts where the data is located. And the only reasons why this high volatility is present at all is that:

    a) ISPs are doing their best to keep people from having static addresses where they could put normal servers. This is partially because of the lack of IPv4 addresses, but mostly in order to maximise profits by making users that need servers buy more expensive accounts.
    b) The data they are being used to distribute is illegal, and so non-volatile sites carrying it are hunted down.

    So, a) is caused by corporate stupidity at the ISPs (many of whom, I'm sure, will say they are P2P supporters now that it will help the stock price), and b) is because of the legal stupidity also known as copyright law (which has its biggest supporters in companies like Sun, IBM, and Intel, all "P2P working group founders" according to the ZDnet story). There is no revolution here, just a clumsy workaround for a completely unnecessary problem. What these programs offer is simply the chance to download stuff that has been hunted off the web by lawyers working for the same corporate structure that now thinks "P2P" will be the next big money extorting machine for them...

    I am one of the three or four lead designers/coders of one of the networks mentioned in the article, and I honestly couldn't give a rat's ass how this working group is formed, or who controls it - at least until they can actually define what the fuck they are in control of, and what it is that makes this something new and different.

    (The project that I am a part of, Freenet, is trying to do something new, but what we are trying to achieve is only of the very surface connected to "P2P" as the money crowd sees it. And it will take years before we are ready (the Internet was given 25 years before the money crowd realized it was a prime rape target - why can't we have a tenth of that?))
  • by Carnage4Life ( 106069 ) on Saturday October 14, 2000 @04:53AM (#706593) Homepage Journal
    Is anybody else tired of all these Three Letter Acronym buzzword fads that, for the past year or two, come along every few months and are proclaimed to be the greatest invention since blah, destined to become a market worth billions of dollars, only for a few companies to IPO successfully just to tank a few months later?

    Frankly I can't see how P2P is any less of a meaningless, buzzword-filled, destined-to-fool-a-lot-of-investors, make-a-bunch-of-young-CEOs-rich, unproven, hype-filled technology. Every time I open an issue of Fortune or Forbes I see some fool going on about how P2P is the next big thing and how Napster being so successful confirms this. Now unless I have been living in an alternate universe for the past year, Napster is not successful. They've spent millions of dollars obtaining lots of eyeballs, but the one thing the dot-com massacre has shown us is that eyeballs do not necessarily transform into revenue. This reminds me of all the dot-com evangelists who used to claim that B2C was the way to go because Amazon was so successful (even though they have yet to turn a profit and spend $150 million a year just to service their debts).

    Second Law of Blissful Ignorance
