The earliest examples of P2P networks came with the first freestanding PCs in the 1980s, when computers no longer needed to be linked to a central system but were completely self-contained. While this meant more freedom for the user, it also made it much harder to share data or print a document without saving it to a floppy disk and physically carrying it to the user whose PC was connected to the printer. The development of peer-to-peer (P2P) networks allowed computers to be connected to each other directly and to share resources, such as printer access.

Each computer on a P2P network is called a ‘peer’ or ‘node’. Every node serves as both a client and a server: it makes use of the network’s resources while simultaneously providing resources to others. These might be files, access to a printer, storage, bandwidth or processing power, and each user can decide what is shared from their PC through access rights. Other networks, such as client/server networks, might need a separate server computer or server software – but these are unnecessary for P2P networks. In direct opposition to client/server networks, no specific peer in a P2P network has authority. This egalitarian arrangement means that each PC has the same rights regarding communication, finding and using resources, and validating other users.

A P2P network can be as simple as two computers linked by an ad-hoc USB connection, a small office wired together permanently with copper cabling, or a much larger network operating across the internet using special protocols and applications.
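
To make the dual client/server role concrete, here is a minimal Python sketch in which one peer serves files while also fetching them from other peers. Everything here is hypothetical – the port, the file names and the one-line request ‘protocol’ are made up for illustration, not taken from any real P2P system.

    # Minimal sketch of a peer acting as both server and client.
    # All names, ports and the request format are hypothetical.
    import socket
    import threading
    import time

    SHARED = {"notes.txt": b"hello from this peer"}  # files this node shares

    def serve(port: int) -> None:
        """Answer requests for files in SHARED -- the 'server' role."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind(("127.0.0.1", port))
            srv.listen()
            while True:
                conn, _ = srv.accept()
                with conn:
                    name = conn.recv(1024).decode().strip()
                    conn.sendall(SHARED.get(name, b""))

    def fetch(peer_port: int, name: str) -> bytes:
        """Request a file from another peer -- the 'client' role."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
            cli.connect(("127.0.0.1", peer_port))
            cli.sendall(name.encode())
            return cli.recv(65536)

    # Run the server role in the background; the same process then
    # acts as a client too (fetching from itself, purely for demo).
    threading.Thread(target=serve, args=(9001,), daemon=True).start()
    time.sleep(0.2)  # give the listener a moment to start
    print(fetch(9001, "notes.txt"))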

Types of Peer-to-Peer Networks

There are three different types of peer-to-peer networks. Each has slightly different characteristics and benefits and, more importantly, a different level of complexity. In terms of hardware, the link between peers can be virtual or physical – the minimum physical connection is a simple USB cable.

Unstructured Networks

These are the simplest type of P2P network: the nodes (PCs) are connected on an essentially random basis. They are simple to build, which makes them suitable for most situations, and their loose structure lends itself to localized optimizations. Shared resources, however, can be difficult to find. Searches work by ‘flooding’: each node forwards a request to its neighbours until the request expires. Files may be stored on more than one peer, so when a search request goes into the network, popular files are easy to find. A rarer file – one shared by only a few nodes – may never be reached before the flood dies out, making that content much harder to locate. Unstructured networks tend to use more CPU and memory because of this flooding, but they are less affected by the rate at which peers join or leave the network (the churn rate).
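
The flooding behaviour can be sketched in a few lines of Python. This is a toy model – the random overlay, file names and time-to-live (TTL) counter are all hypothetical – but it shows why widely replicated files are found quickly while rare ones often go unfound:

    # Toy flood search over an unstructured overlay (illustrative only).
    # Each node forwards a query to its neighbours until the TTL runs out.
    import random

    random.seed(1)
    N = 50
    neighbours = {n: random.sample([m for m in range(N) if m != n], 4)
                  for n in range(N)}
    holders = {"popular.mp3": set(random.sample(range(N), 15)),  # many copies
               "rare.flac": set(random.sample(range(N), 1))}     # one copy

    def flood_search(start: int, filename: str, ttl: int) -> set[int]:
        """Return the nodes found holding `filename` within `ttl` hops."""
        found, visited, frontier = set(), {start}, [start]
        for _ in range(ttl):
            nxt = []
            for node in frontier:
                for nb in neighbours[node]:
                    if nb in visited:
                        continue
                    visited.add(nb)
                    nxt.append(nb)
                    if nb in holders.get(filename, set()):
                        found.add(nb)
            frontier = nxt
        return found

    print(flood_search(0, "popular.mp3", ttl=2))  # typically several hits
    print(flood_search(0, "rare.flac", ttl=2))    # frequently empty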

Structured Networks

Unlike the unstructured network, the structured P2P network is organized around a distributed hash table (DHT). A DHT is an advanced lookup system that allows nodes to access data, such as files, through a key instead of having to copy the data onto every node. These keys are produced by hashing, whereby data of varying sizes is assigned a generated value of fixed size (for example, a mix of ten digits and letters). This contrasts with unstructured P2P networks, where whole files may be stored on more than one node. A DHT assigns ownership of a particular file to a specific peer using a variant of hashing called ‘consistent hashing’. With ordinary hashing, almost every key would need to be reassigned whenever a peer joined the network; consistent hashing is less power-intensive because only some keys need to move. Overall, a structured network is easier, and less power- and memory-intensive, to search than an unstructured one. However, this type of network copes worse with high churn rates: the routing of requests and information depends on each peer knowing what its neighbouring nodes offer, and this must be ‘relearned’ whenever peers leave or join the network and the neighbours change.
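
Here is a minimal sketch of consistent hashing as a hash ring, with hypothetical peer names. Each key maps to a point on the ring and is owned by the first node clockwise from it, so adding a node moves only the keys in that node’s arc – rather than nearly all of them, as a plain hash-modulo-node-count scheme would:

    # Minimal consistent-hashing sketch (hash ring), as used by DHTs.
    # Peer names and keys are hypothetical.
    import bisect
    import hashlib

    def ring_pos(name: str) -> int:
        """Hash a node id or key to a fixed-size value (a point on the ring)."""
        return int(hashlib.sha1(name.encode()).hexdigest(), 16)

    class Ring:
        def __init__(self, nodes):
            self.points = sorted((ring_pos(n), n) for n in nodes)

        def owner(self, key: str) -> str:
            """The first node clockwise from the key's position owns the key."""
            pos = ring_pos(key)
            i = bisect.bisect(self.points, (pos, "")) % len(self.points)
            return self.points[i][1]

    ring = Ring(["peer-a", "peer-b", "peer-c"])
    print(ring.owner("holiday-photos.zip"))

    # Adding a peer reassigns only the keys falling in its arc of the ring;
    # with plain `hash(key) % n_nodes`, nearly every key would move instead.
    ring2 = Ring(["peer-a", "peer-b", "peer-c", "peer-d"])
    print(ring2.owner("holiday-photos.zip"))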

Hybrid Structure

This is a structure that looks more like a traditional client/server network. A centralized peer is in control, performing server-like activities with full knowledge of where files are located and what resources can be shared. This is a more complicated network and can prove more intensive to set up. While it makes many routing requests faster, it puts more pressure on the centralized peer, which may need more power and CPU than the other nodes. It is a move much closer to having a dedicated server and an administrator.

Napster

Launched in 1999, Napster was the best-known early example of a hybrid P2P sharing network. The idea was that peers connected through the internet could find and download any song they wanted from several other users. Peers could also upload songs to Napster themselves, then share their files with others. However, Napster (and LimeWire, a similar service that was popular around the same time but is now discontinued) soon came under fire, as the songs made available for download infringed the copyright of record labels and musicians.
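
The hybrid idea can be sketched as a central index that only tracks who holds which files, while the actual transfers happen directly between peers. This is an illustrative model, not Napster’s real protocol; the class, method names and addresses are all made up:

    # Hypothetical sketch of a Napster-style hybrid network: a central
    # index maps filenames to the peers that hold them, but file transfer
    # itself happens peer-to-peer (the index never stores files).
    from collections import defaultdict

    class CentralIndex:
        def __init__(self):
            self._where = defaultdict(set)  # filename -> set of peer addresses

        def announce(self, peer: str, files: list[str]) -> None:
            """A peer registers the files it is willing to share."""
            for f in files:
                self._where[f].add(peer)

        def search(self, filename: str) -> set[str]:
            """Return peers claiming to hold the file; download is direct."""
            return self._where.get(filename, set())

    index = CentralIndex()
    index.announce("203.0.113.5:6699", ["song-one.mp3", "song-two.mp3"])
    index.announce("198.51.100.7:6699", ["song-one.mp3"])
    print(index.search("song-one.mp3"))  # the client then contacts a peer directly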

BitTorrent

Launched in 2001, BitTorrent was designed to make it easy to share large files: several peers each share smaller ‘bits’ (pieces) of the file, which are joined back together after download. Although BitTorrent needs specific software to reassemble the pieces downloaded from (potentially) thousands of sources, the process is straightforward and does not require a constant connection – after any interruption, the downloading and uploading can continue seamlessly. BitTorrent has been responsible for up to 72% of global peer-to-peer traffic – and is one of the main sources of illegal downloads and copyright infringement. Torrent trackers such as The Pirate Bay and Torrentz became popular places for users to find the torrents with the most peers available, and these trackers have been at the heart of court cases regarding piracy and illegal downloads. It is worth mentioning that legitimate companies also use BitTorrent technology, which makes it impractical for internet providers to block the protocol outright. These include big names in the gaming industry, like Blizzard, which has distributed updates over P2P for games such as World of Warcraft and Diablo III.
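
The core BitTorrent idea – splitting a file into hashed pieces that can be fetched from many peers and verified independently – can be sketched as follows. This is a simplified model of the technique, not the real protocol; piece size and file contents are hypothetical:

    # Sketch of the BitTorrent idea: split a file into fixed-size pieces,
    # record a hash for each piece so that pieces fetched from untrusted
    # peers can be verified individually, then reassemble.
    import hashlib

    PIECE_SIZE = 4  # tiny for demo; real torrents use e.g. 256 KiB pieces

    def split(data: bytes):
        pieces = [data[i:i + PIECE_SIZE] for i in range(0, len(data), PIECE_SIZE)]
        return pieces, [hashlib.sha1(p).hexdigest() for p in pieces]

    def reassemble(pieces, expected_hashes) -> bytes:
        """Verify each piece against its hash before joining them back up."""
        for p, h in zip(pieces, expected_hashes):
            assert hashlib.sha1(p).hexdigest() == h, "corrupt piece, re-fetch it"
        return b"".join(pieces)

    pieces, hashes = split(b"a large file, in many small bits")
    # Pieces can now arrive from different peers, in any order, with pauses;
    # each is checked independently, so an interrupted download just resumes.
    print(reassemble(pieces, hashes))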

Microsoft

In certain Windows iterations, updates can be downloaded from multiple sources using a P2P network, which is usually faster and more reliable – especially if your own internet connection is unreliable. Windows 10 includes this option, known as Delivery Optimization, which can fetch update files from other PCs on the local network or on the internet as well as from Microsoft’s servers. Microsoft has also created peer-to-peer networking protocols designed for businesses, so that employees can work remotely and still have access to the files and resources they need. This is considered more reliable and safer than a remote server – and costs much less to set up.

Skype

On a smaller scale, video-calling platforms like Skype were built on peer-to-peer networking through the Skype application. Both parties are peers, seamlessly sharing video and audio with each other through simultaneous uploads and downloads.

Final Thoughts

Peer-to-peer networks are a simple way for several computers to share resources, whether content or devices, without relying on a server. They are much more cost-effective than a client/server network for smaller businesses: even a permanent P2P option, such as copper-wire connections, is more manageable in terms of both initial and ongoing costs. Some P2P networks are designed with a particular use in mind – like BitTorrent, which shares large files as much smaller ‘bits’ that are reassembled at the end – while others are designed by companies like Microsoft to enable simple sharing in a home or business environment. As a cost-effective, simple way to create a network, P2P protocols are straightforward and useful – but it is always worth considering how to structure the network you create to get the best results.