By Jim Pinto, Automation.com, May 2003

Your desktop computer (like most others) is only about 5% utilized. Distributed computing uses the idle time, linking many machines together to perform mammoth tasks that previously only supercomputers could do. The whole area of distributed and grid computing is a hotbed of significant development that is expected to generate amazing advances in the next few years. And the first big applications are already here.
You probably know that your desktop computer (like most others) is only about 5% utilized - the CPU sits idle most of the time. Distributed computing uses this idle time, linking many machines together to perform mammoth tasks that previously only supercomputers could do.
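To make the idea concrete, here is a minimal sketch (my own illustration, not from the article) of the work-unit pattern that idle-cycle harvesting systems typically use: a coordinator splits one large job into independent chunks, each participating machine processes a chunk, and the partial results are combined. A local process pool stands in for the pool of idle desktop PCs; all names here are hypothetical.

```python
import multiprocessing

# Hypothetical "mammoth task": sum f(x) over a huge range.
# A coordinator splits the range into independent work units;
# each worker (standing in for an idle desktop PC) processes
# one unit and returns a partial result to be combined.

def f(x):
    return x * x  # stand-in for an expensive per-item computation

def process_work_unit(unit):
    start, end = unit
    return sum(f(x) for x in range(start, end))

def make_work_units(n, unit_size):
    return [(i, min(i + unit_size, n)) for i in range(0, n, unit_size)]

if __name__ == "__main__":
    units = make_work_units(n=1_000_000, unit_size=50_000)
    # A local process pool plays the role of the fleet of idle machines;
    # a real grid would ship the units over the network instead.
    with multiprocessing.Pool() as pool:
        partials = pool.map(process_work_unit, units)
    print("total =", sum(partials))
```

The essential property is that the work units are independent, so machines can come and go without coordinating with one another - which is exactly what makes fleets of part-time desktops usable.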
Peer-to-peer networks

The traditional client-server Internet model ("client" asks for and receives information from "server") is beginning to give some ground to peer-to-peer (P2P) networking, where all network participants are approximately equal. The primary advantage of P2P networks is that large numbers of people share the burden of providing computing resources (processor time and disk space), administration effort, creativity and legal liability. It's relatively easy to be anonymous in such an environment, and it's harder for opponents of a P2P service to bring it down.

Napster was one of the best-known examples to date, though it was not a completely P2P system, because users registered their file libraries with a central server when they logged on to the service. Because of this central link, it was possible for opponents to force Napster to stop facilitating free music downloads. This was done by legal action on behalf of the music industry, to stop copyright infringement. However, true P2P networks such as Kazaa and Morpheus now operate with impunity, because there is no central server. They are now generating traffic at many times the rate of Napster at its peak.

P2P has some limitations

P2P does have problems. The primary disadvantage is the tendency of computers at the edge of the network to fade in and out of availability. Accountability for the actions of network participants can also be a difficult problem. Several high-profile implementations have shown that architecture, security and systems-management issues are difficult to control. For these reasons, system managers often prefer to operate P2P systems as separate, isolated entities - but doing so is often impossible for practical applications.

The increasing reach of P2P beyond the desktop and into the distributed-computing fringe may further magnify these stumbling blocks. But that will not hold back the rising tide of P2P implementations. Dramatic cost savings and the ability to do parallel processing will continue to drive P2P use higher.

Distributed Computing

These days, there are more tools than ever to help companies harness unused computing power in the hundreds of PCs being used by their employees. The enormous power of P2P computing is increasingly being blended with its more complex cousin, distributed (or grid) computing. Traditionally, there have been three categories of "distributed" computing:
Grid Computing

Grid computing can be defined as applying resources from many computers in a network to a single problem, usually one that requires a large number of processing cycles or access to large amounts of data.

The computational power grid is analogous to the electric power grid. It allows the coupling of geographically distributed resources, offering consistent and inexpensive access to them irrespective of their physical location or access point. The Internet or dedicated networks can be used to interconnect a wide variety of distributed computational resources (such as supercomputers, computer clusters, storage systems and data sources) and present them as a single, unified resource.

At its core, grid computing enables devices - regardless of their operating characteristics - to be virtually shared, managed and accessed across an enterprise, industry or workgroup. This virtualization of resources places all of the necessary access, data and processing power at the fingertips of those who need to rapidly solve complex business problems, conduct compute-intensive research and data analysis, and operate in real time.

A company with slightly fewer than 2,000 desktop computers can harvest nearly 1 teraflop (one trillion floating-point operations per second) of computing capacity; the arithmetic is sketched after this section. Even better, the company can capture that power from computers it already owns - machines that sit idle at night and run at less than full capacity during the day.

Universities and research institutions have long used grid-computing technology, but recently it has also been making fast inroads into the business market. IBM has recently introduced operating standards that are expected to generate many new business applications and drive wide proliferation.

Internet Grid Computing

The Internet is evolving beyond e-mail, content and electronic commerce. It is becoming a true platform, combining the quality of service of enterprise computing with the ability to share distributed resources across the web - applications, data, storage, servers, and everything in between.
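That teraflop figure is easy to sanity-check. Assuming each desktop sustains on the order of 500 million floating-point operations per second - an assumption of mine typical of early-2000s PCs, not a number from the article - the aggregate works out as follows:

```python
# Back-of-the-envelope check of the "~2,000 desktops = ~1 teraflop" claim.
# The 500 MFLOPS-per-desktop figure is an assumption about early-2000s
# PCs, not a number given in the article.

desktops = 2_000
mflops_per_desktop = 500            # assumed sustained MFLOPS per machine

total_flops = desktops * mflops_per_desktop * 1_000_000
print(f"aggregate: {total_flops / 1e12:.1f} teraflops")  # -> 1.0 teraflops
```

In practice the harvestable fraction is lower, since machines are only available part of the time, but the order of magnitude holds.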
Applications

The whole area of distributed computing is a hotbed of significant development that is expected to generate amazing advances in the next few years. And the first big applications are already here.

United Technologies, the $28b manufacturing conglomerate, is equipping more than 100,000 Wintel computers with its own peer-to-peer software to do scientific calculations and solve complex modeling problems during off hours. The project is an expansion of what the company's Pratt & Whitney aircraft engine division has done to phase out a Cray supercomputer with 5,000 Sun Unix workstations that perform design simulations for aircraft parts. The result: 85 percent utilization for each workstation. With 20,000 PCs at Pratt & Whitney, the P2P project is expected to cut in half the time and money it takes to develop turbine engines and other aircraft parts, mainly by eliminating multimillion-dollar prototypes. Previously, it could take $1 billion and five years to develop and certify an engine.

Universities and research organizations have used similar peer-processing approaches to solve complex scientific problems. The most famous is a project called SETI@home, which uses the computers of volunteers across the globe to search radio-telescope data for signs of life on other planets. An Intel philanthropic P2P program helps to combat life-threatening illnesses by linking millions of PCs into what is predicted to be the largest and fastest computing resource in history. This "virtual supercomputer" uses P2P technology to make unprecedented amounts of processing power available to medical researchers, accelerating the development of improved treatments and drugs that could potentially cure diseases. A minimal sketch of the volunteer-client pattern behind such projects appears after this section.

If you feel P2P and grid computing are important, you might enjoy reading a recent book: "Peer-to-Peer: Harnessing the Power of Disruptive Technologies", a collection of chapters written by the people who are driving the state of the art in the P2P space. This book is NOT about how to build the next Morpheus or Kazaa system. Rather, it aims to get its readers thinking about what happens when information systems shift away from the client-server model toward the peer-to-peer model.
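For readers curious how volunteer projects of the SETI@home kind structure the client side, here is a minimal sketch of the fetch-compute-report loop. Everything in it is hypothetical - the "server" is a local stub so the sketch runs standalone - and real clients add checkpointing, result validation and idle detection on top of this skeleton.

```python
import time

# Minimal sketch of a volunteer-computing client loop, in the spirit of
# SETI@home-style projects. The "server" here is a pair of local stubs so
# the sketch runs standalone; a real client would fetch and report work
# units over the network.

WORK_QUEUE = [(i, i + 10_000) for i in range(0, 50_000, 10_000)]  # stub server state
RESULTS = {}

def fetch_work_unit():
    """Stub for 'ask the project server for the next chunk of data'."""
    return WORK_QUEUE.pop(0) if WORK_QUEUE else None

def report_result(unit, result):
    """Stub for 'upload the finished result to the project server'."""
    RESULTS[unit] = result

def analyze(unit):
    # Stand-in for the real analysis (e.g. scanning telescope data):
    # here, just sum the squares over the unit's range.
    start, end = unit
    return sum(x * x for x in range(start, end))

def run_client():
    while (unit := fetch_work_unit()) is not None:
        result = analyze(unit)       # done whenever the PC is idle
        report_result(unit, result)
        time.sleep(0)                # real clients yield to the user here

if __name__ == "__main__":
    run_client()
    print(f"completed {len(RESULTS)} work units")
```

Because each work unit is self-contained, a volunteer machine that disappears mid-computation costs the project nothing but a reissued unit - the property that lets such projects scale to millions of unreliable PCs.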