While Napster fights for survival in the US courts, its foes have not managed to quell the rising flood of similar file-sharing architectures. Gnutella, Aimster, AudioGalaxy and others continue where Napster left off, taking forward the concept behind Napster's technology: peer-to-peer (P2P) architecture.
Internet start-ups facilitating the public distribution of copyrighted material are not the only ones interested in P2P. Intel is one of the architecture's main proponents, having formed the Peer-to-Peer Working Group, a consortium tasked with creating standards for P2P computing.
Despite big backers, P2P is often misunderstood
P2P was a major theme at the Intel Developer Forum (IDF) held in San Jose, California, last year. "Peer-to-peer computing could be as important to the Internet's future as the Web browser was to its past," said Pat Gelsinger, VP and chief technology officer, Intel Architecture Group, at IDF Fall 2000.
"While the most visible impact of this model has been in consumer environments, P2P computing has the potential to play a major role in business computing as well. By adding P2P capabilities, corporations can tap into existing teraflops of performance and terabytes of storage to make today's applications more efficient and enable entirely new applications in the future."
Peer-to-peer computing could be as important to the Internet's future as the Web browser was to its past.
Pat Gelsinger, VP and chief technology officer, Intel Architecture Group
Others backing P2P include IBM, Oracle (as part of its ONE Net strategy), Hewlett-Packard, Network Associates' anti-virus subsidiary McAfee, and P2P specialists like Groove Networks, Roku Technologies and Endeavors Technology.
However, the excitement exhibited by US vendors over P2P has yet to hit SA`s shores.
"P2P computing is currently touted as the next 'killer application' for the Internet," states a Gartner Consulting paper on the subject, yet the local IT industry lacks familiarity with the architecture.
"To me, it's quite a new term," comments Vaughn Parkin, MD of software distributor Workgroup. "Two peers would be a server and PC, or a PC and PC with the means of delivering a product. Application service provision (ASP) is a peer-to-peer model."
So, are ASPs a form of P2P? Chris Hogg, e-business manager, Intel UK, says: "That's not really correct. I see the ASP as a way of outsourcing the IT department. P2P and ASPs have different objectives."
In fact, ASPs cannot fall under the P2P banner, because the ASP model is inherently client/server in nature, and clients and servers are not peers. In its simplest form, a P2P architecture involves communication between machines that fulfil similar roles: between two or more mainframes, two or more servers, or two or more clients.
Connecting data and users in distributed environments
In the Napster example, a server was involved in the process, but communication between it and the clients was not the primary function of the architecture; the server merely facilitated the first step in setting up communication between two clients. In technical P2P-speak, this is known as a data-centred or data-focused P2P application.
The data-centred architecture requires a dynamic index server to identify content on client machines as they connect to the network. This information is available to all other clients on the network.
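The mechanics of such a dynamic index can be sketched in a few lines of Python. This is a hypothetical illustration, not Napster's actual protocol: the `IndexServer` and `Peer` classes and all names are invented, and a real system would communicate over network sockets rather than through in-memory objects.

```python
# Hypothetical sketch of a data-centred P2P index. Peers register the files
# they hold with a central index; a searching peer asks the index *who* has
# a file, then fetches the file directly from that peer.

class IndexServer:
    """Dynamic index: tracks which peer currently holds which file."""
    def __init__(self):
        self.index = {}  # filename -> set of peer ids

    def register(self, peer_id, filenames):
        for name in filenames:
            self.index.setdefault(name, set()).add(peer_id)

    def unregister(self, peer_id):
        # Called when a peer disconnects from the network.
        for holders in self.index.values():
            holders.discard(peer_id)

    def search(self, filename):
        return sorted(self.index.get(filename, set()))

class Peer:
    def __init__(self, peer_id, files):
        self.peer_id = peer_id
        self.files = dict(files)  # filename -> content

    def join(self, server):
        server.register(self.peer_id, self.files)

    def fetch(self, server, filename, peers):
        # Step 1: ask the index which peers hold the file.
        holders = server.search(filename)
        if not holders:
            return None
        # Step 2: transfer directly from a holding peer (server not involved).
        return peers[holders[0]].files[filename]

server = IndexServer()
alice = Peer("alice", {"song.mp3": b"abc"})
bob = Peer("bob", {})
peers = {"alice": alice, "bob": bob}
alice.join(server)
bob.join(server)
print(bob.fetch(server, "song.mp3", peers))  # b'abc'
```

The key design point is that the index server never touches file content: it only answers "who has what", and the transfer itself is peer to peer.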
To me, it's quite a new term.
Vaughn Parkin, MD, Workgroup
The possibility of sharing information and pooling knowledge bases in an unstructured manner holds exciting possibilities for enterprises in the e-business space. Currently, harnessing data in distributed environments is nigh on impossible, and spreading a corporate's reach to other sources of information - such as e-business partners - is even more complex.
Data-centred P2P solves many problems facing IT departments today. Most of the knowledge in a company is typically trapped on various PCs scattered throughout the organisation; hard drive space is wasted through replication of documents, and man-hours are equally wasted through the repetition of tasks already completed. Data-centred P2P promises to alleviate all of these data-centric problems - and more.
For data-centred P2P to succeed in the workplace, security and access control to indexes must reside at both the client and server level. This is known as formal data-centred P2P, as opposed to Napster's informal P2P, which is not governed by a formal corporate policy, and access is open to all.
A similar P2P architecture, user-centred, works in much the same way, except that its goal is to connect people, not data. Microsoft's NetMeeting and instant messaging service ICQ are examples of user-centred P2P, and communication applications like IP telephony can be expected to deliver more user-centred solutions.
User-centred P2P is also split into formal and informal models.
Instead of contacting an indexing server to facilitate contact, a user-centred solution would typically use a directory server, such as Microsoft's Active Directory, Novell's directory service, or directories based on the Lightweight Directory Access Protocol (LDAP).
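The same pattern can be sketched for the user-centred case. This is an illustrative toy, not the protocol of any real messenger: the `DirectoryServer` stands in for Active Directory or an LDAP directory, and all names and addresses are invented.

```python
# Hypothetical sketch of user-centred P2P: a directory server resolves a
# user name to a network address, after which the conversation itself runs
# directly between the two peers.

class DirectoryServer:
    """Presence directory: maps signed-in users to their addresses."""
    def __init__(self):
        self.online = {}  # username -> address

    def sign_in(self, user, address):
        self.online[user] = address

    def resolve(self, user):
        return self.online.get(user)

class Messenger:
    def __init__(self, user, address, directory):
        self.user, self.inbox = user, []
        self.directory = directory
        directory.sign_in(user, address)  # announce presence on start-up

    def send(self, to_user, text, network):
        # The directory is consulted only to find the other peer...
        address = self.directory.resolve(to_user)
        if address is None:
            return False  # user is not online
        # ...the message itself goes directly peer to peer.
        network[address].inbox.append((self.user, text))
        return True

directory = DirectoryServer()
network = {}  # address -> Messenger, standing in for the IP network
ann = Messenger("ann", "10.0.0.1", directory)
ben = Messenger("ben", "10.0.0.2", directory)
network["10.0.0.1"], network["10.0.0.2"] = ann, ben
ann.send("ben", "hello", network)
print(ben.inbox)  # [('ann', 'hello')]
```

As in the data-centred case, the server's role ends once the lookup succeeds; it connects people rather than content.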
P2P heralds the rise of tomorrow`s Internet
Once people have understood and accepted the data- and user-centred P2P models, a new architecture based on P2P principles, Web Mk 2, is expected to emerge. Web Mk 2 will utilise the data- and user-centred models, as well as an architecture known as atomistic P2P, which is used predominantly for networked gaming today. Web Mk 2 will see the evolution of the browser into a user-configurable content manager, and link users to multiple directory and indexing servers simultaneously on an ad hoc basis.
Unlike atomistic, user- and data-centred, Web Mk 2 will dramatically change how users interface with the Internet.
P2P report, Gartner Consulting, GartnerGroup
Informal Web Mk 2 will turn browsers into personalised Internet portals. The "peers", in this case, will not be clients or servers, but rather "bots" - intelligent pieces of software that sit on the browsers and data sources and communicate with one another, collecting and sending content as required.
Informal Web Mk 2 is already seen as a contentious technology, as users will be able to rip only the content they want from the Web, essentially cutting many Web sites' revenue stream - banner advertising - out of their browsers. Copyright is also expected to become a hurdle, as content is removed from its original Web context.
Formal Web Mk 2 will allow corporate governance to limit and control content on a per-user basis, and will typically be deployed in an intranet environment. Formal Mk 2 will also allow for distributed applications within a company, with data and logic for certain tasks distributed across the environment, and controlled by the bots.
"Unlike atomistic, user- and data-centred, Web Mk 2 will dramatically change how users interface with the Internet," states Gartner Consulting's P2P report. "The inherent complexity of this model will require an entirely new approach to client software, pushing the current Web-browser designs beyond their capacity."
Gartner predicts that, by 2007, 20% of intranet deployments will include Web Mk 2 elements.
With millions of Pentiums on the Net, who needs mainframes?
P2P's biggest supporter, Intel, is not too interested in sharing documents across multiple desktops, although it does save up to $400 million a year through P2P storage solutions internally.
While user-centred, data-centred, atomistic and Web Mk 2 may serve the interests of the storage solution vendors, Intel is primarily interested in processors. The architecture governing the sharing of multiple processors in distributed environments on a P2P level is known as compute-centred P2P.
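The compute-centred idea - split a big job into work units and farm them out to idle machines - can be sketched as follows. This is an illustrative stand-in, not Intel's actual NetBatch system; the function names and the sum-of-squares workload are invented for the example.

```python
# Illustrative sketch of compute-centred P2P: a coordinator divides a job
# into work units, each unit is processed on a (notionally idle) peer, and
# the partial results are combined.

def split_job(data, n_units):
    """Divide the input into roughly equal work units."""
    size = max(1, len(data) // n_units)
    return [data[i:i + size] for i in range(0, len(data), size)]

def peer_compute(unit):
    """Work done on one peer: here, a stand-in CPU-bound task."""
    return sum(x * x for x in unit)

def run_distributed(data, n_peers):
    units = split_job(data, n_peers)
    # In a real deployment each unit would be sent to a different machine
    # and run in parallel; here they run sequentially for clarity.
    results = [peer_compute(u) for u in units]
    return sum(results)

data = list(range(1, 101))
print(run_distributed(data, 4))  # 338350, same answer as computing centrally
```

The point of the model is that the expensive step, `peer_compute`, runs on spare desktop cycles rather than on a central mainframe; the coordinator only splits work and merges results.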
Within two years of implementing [P2P], we eliminated new mainframe purchases, and mothballed several that we already had.
Pat Gelsinger, VP and chief technology officer, Intel Architecture Group
Intel has used formal P2P computing since 1990 in its NetBatch compute-centred solution. The system gives Intel engineers access to the processing power of 10 000 computers. "Within two years of implementing this, we eliminated new mainframe purchases," says Intel's Gelsinger, "and mothballed several that we already had."
Intel has also used P2P in an internal media training application. More than 2 000 participants were able to complete 60 training modules of 10MB to 20MB each across 50 sites at Intel without dropping any connections. Intel employees experienced a five- to six-fold improvement in file access times, as more than 80% of modules were delivered via P2P rather than fetched from a single server. Intel now aims to make this available to all 85 000 of its employees.
To meet shorter product development cycles, Ford has contracted a company called Oculus CO to help it connect design teams located at multiple sites using different operating systems and applications. According to a report in InfoWorld, the technology will save Ford between $5 million and $15 million per vehicle design programme.
Informal P2P sector is immensely popular
Two popular projects that have utilised informal compute-centred architectures are SETI@Home and, more recently, the Philanthropic Peer-to-Peer Program. The latter, hosted by Intel and developed by United Devices, the National Foundation for Cancer Research and the University of Oxford, is designed to find cancer cures. Intel projects that the computing power contributed to the project will be 10 times that of the world's most powerful supercomputer.
"Intel and the scientific community are using the PC and the power of P2P computing technology to dramatically change the way medical research is performed," says Craig Barrett, Intel's president and CEO. "By harnessing Internet-connected PCs, this project will enable what could be the largest biological computational capability in history to help solve some of the most difficult scientific problems."
SETI@Home exploits the unutilised processing power of its users to scan the skies for extraterrestrial life. It has harnessed 653 338.651 years of computing time from its 2 997 120 volunteers since October 1998.
So far, no aliens have been discovered.
'Unwasting' 75% of computing and storage capacity
Despite the applications already available and the heavyweights behind it, P2P is still in its infancy. It has been dogged by controversy with the Napster court case, and more controversy is expected when Web Mk 2 emerges.
The benefits, however, are clear. It is estimated that most companies use only 25% of their available computing and storage capacity. P2P could help them harness the remaining 75%.
"We see P2P computing as a big inflection point in computing capabilities," says Intel's Hogg. "We are going to see a huge marketplace for P2P, which has moved from the hype space to the reality space. It just takes some kind of catalyst - Napster - for us to talk about it."