What is Computer Networking?


Computer networking is the practice of linking workstations, output devices such as print servers, and storage units such as file servers over communications pathways so that they can access and share files and other resources.

Bandwidth and Latency

The term bandwidth in computer networking refers to the data rate supported by a network connection or interface. Bandwidth is most commonly expressed in bits per second (bps). The term comes from the field of electrical engineering, where bandwidth represents the difference between the highest and lowest frequencies of a signal on the communication channel (the band).

Network bandwidth is not the only factor that determines the "speed" of a network as perceived by the end user. The other key element of network performance, latency, also affects network applications in important ways.

What is Bandwidth?

Manufacturers of network hardware have done a great job of promoting the concept of bandwidth. Virtually everyone knows the bandwidth rating of their modem or their broadband Internet service. Essentially, bandwidth represents the capacity of the connection, and it's obvious that the greater the capacity, the more likely that greater performance will follow.

Bandwidth can refer to both actual and theoretical throughput, and it is important to distinguish between the two. For example, a V.90 modem supports 56 Kbps of peak bandwidth, but due to limitations of the telephone lines and other factors, it is impossible for a home dial-up connection to actually achieve this level. Likewise, a Fast Ethernet network theoretically supports 100 Mbps of bandwidth, but this level can never be achieved in practical use due to overhead in the hardware and in the computer's operating system.
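The gap between theoretical and effective bandwidth is easy to see with a little arithmetic. The sketch below uses hypothetical transfer figures (a 100 MB transfer completing in 10 seconds on a 100 Mbps Fast Ethernet link) purely for illustration:

```python
def effective_bandwidth_mbps(bytes_transferred: int, seconds: float) -> float:
    """Effective bandwidth in megabits per second (1 Mbps = 1,000,000 bits/s)."""
    return bytes_transferred * 8 / seconds / 1_000_000

# Hypothetical transfer: 100 MB moved over Fast Ethernet in 10 seconds.
transferred = 100 * 1_000_000  # bytes
elapsed = 10.0                 # seconds
print(effective_bandwidth_mbps(transferred, elapsed))  # 80.0, vs. 100 Mbps theoretical
```

The missing 20 Mbps in this made-up example stands in for the protocol, hardware, and operating-system overhead described above.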

High Bandwidth and Broadband

In the Internet realm, networkers sometimes use the term high bandwidth to distinguish higher-performing Internet connections from traditional dial-up access speeds. Definitions vary, but high bandwidth connections support data rates of at least 64 Kbps and usually 200 Kbps or 300 Kbps and higher. The definition of high bandwidth differs from the definition of broadband. Technically, the term broadband refers to the method of communication, while bandwidth refers to the amount of data that passes through the connection over time.

Measuring Network Bandwidth

A number of tools exist for measuring the bandwidth of network connections. On LANs, these tools include netperf and ttcp. On the Internet, numerous "bandwidth test" or "speed test" programs exist, many made available for interactive use through public Web pages. Anyone who uses these programs quickly learns that bandwidth is a highly variable quantity that is difficult to measure precisely. In a nutshell, typical network architectures involve multiple layers of hardware and software, as well as time sharing.
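The basic idea behind tools like ttcp, pushing a known amount of data and timing it, can be sketched with standard-library sockets. This minimal loopback example is not a replacement for real measurement tools, and it times only the sender's side (data may still sit in kernel buffers when sendall returns), but it shows the shape of the measurement:

```python
import socket
import threading
import time

PAYLOAD = b"x" * 65536          # 64 KB chunks
TOTAL_BYTES = 16 * 1024 * 1024  # send 16 MB in total

def receiver(server_sock):
    """Accept one connection and drain TOTAL_BYTES from it."""
    conn, _ = server_sock.accept()
    received = 0
    while received < TOTAL_BYTES:
        data = conn.recv(65536)
        if not data:
            break
        received += len(data)
    conn.close()

server = socket.socket()
server.bind(("127.0.0.1", 0))   # loopback, ephemeral port
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=receiver, args=(server,))
t.start()

client = socket.socket()
client.connect(("127.0.0.1", port))
start = time.perf_counter()
sent = 0
while sent < TOTAL_BYTES:
    client.sendall(PAYLOAD)
    sent += len(PAYLOAD)
elapsed = time.perf_counter() - start
client.close()
t.join()
server.close()

mbps = sent * 8 / elapsed / 1_000_000
print(f"Sent {sent} bytes in {elapsed:.3f}s -> {mbps:.1f} Mbps (loopback)")
```

Running this several times illustrates the point above: even on loopback, with no physical network at all, the measured figure varies from run to run because of time sharing in the operating system.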

Bandwidth vs. Latency

Bandwidth is just one element of what a person perceives as the speed of a network. Another element of speed, closely related to bandwidth, is latency. Latency refers generally to delays in processing network data, of which there are several kinds.

Whereas theoretical peak bandwidth is fixed, actual or effective bandwidth varies and can be affected by high latencies. Too much latency in too short a period of time can create a bottleneck that prevents data from "filling the pipe," thus decreasing effective bandwidth.
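One classic illustration of latency limiting effective bandwidth is TCP's sliding window: a sender can have at most one window of unacknowledged data in flight per round trip, so throughput is bounded by window size divided by round-trip time regardless of link capacity. The numbers below (a 64 KB window, a common TCP default) are illustrative:

```python
def max_throughput_mbps(window_bytes: int, rtt_seconds: float) -> float:
    """Upper bound on TCP throughput: one window of data per round trip."""
    return window_bytes * 8 / rtt_seconds / 1_000_000

window = 64 * 1024  # 64 KB window
print(max_throughput_mbps(window, 0.010))  # 10 ms RTT (LAN): ~52 Mbps
print(max_throughput_mbps(window, 0.500))  # 500 ms RTT (satellite): ~1 Mbps
```

With the high-latency round trip, the same window yields roughly one-fiftieth of the throughput, even if the underlying link offers far more capacity. This is precisely the "pipe" failing to fill.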

Latency and Satellite Internet Service
Satellite-based Internet access service perfectly illustrates the difference between latency and bandwidth. Satellite Internet connections possess both high bandwidth and high latency. When loading a Web page, for example, most satellite users can observe a short but noticeable delay from the time they enter a Web address to the time the page begins loading. This high latency is due primarily to propagation delay as the request message travels at the speed of light to the distant satellite station and back to the home network. Once the messages arrive at home, however, the page appears extremely quickly, often seeming to explode onto the screen thanks to the high bandwidth of the incoming data.
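The satellite propagation delay can be estimated from physical constants alone. Assuming a geostationary satellite at roughly 35,786 km altitude and a request path of home, up to the satellite, down to the ground station, and back again (four hops total):

```python
ALTITUDE_KM = 35_786      # approximate geostationary orbit altitude
LIGHT_KM_PER_S = 299_792  # speed of light in vacuum

one_hop_ms = ALTITUDE_KM / LIGHT_KM_PER_S * 1000
round_trip_ms = 4 * one_hop_ms  # up and down, in each direction
print(f"one hop: {one_hop_ms:.0f} ms, round trip: {round_trip_ms:.0f} ms")
```

The result, roughly half a second of round-trip time before any data flows, is why satellite users notice the pause before a page starts loading, no matter how much bandwidth the link offers.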

Besides propagation delays, latency may also involve transmission delays (properties of the physical medium) and processing delays (such as passing through proxy servers or making "hops" on the Internet).

Measuring Network Latency

Network tools like ping and traceroute measure latency by determining the time it takes a given network packet to travel from source to destination and back, the so-called round-trip time. Round-trip time is not the only way to specify latency, but it is the most common.
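The round-trip idea is simple to sketch in code. Real ping uses ICMP, which generally requires elevated privileges, so a common unprivileged approximation is to time a TCP handshake instead; the result is not identical to an ICMP ping but tracks the same round trip:

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 80, timeout: float = 3.0) -> float:
    """Approximate round-trip time (ms) by timing a TCP handshake."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # the connection completing means one round trip has occurred
    return (time.perf_counter() - start) * 1000

# Example (requires network access; host name is illustrative):
# print(f"{tcp_rtt_ms('example.com'):.1f} ms")
```

Repeated calls return varying values, which is why ping reports minimum, average, and maximum times rather than a single number.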

Conclusion

Two key elements of network performance are bandwidth and latency. The average network user is probably more familiar with the concept of bandwidth, but latency matters too. Businesses use the term Quality of Service (QoS) to refer to measuring and maintaining consistent performance on a network by managing both bandwidth and latency.
