How Does Serial Communication Work?

You know that most PCs have both serial and parallel ports. You also know that an electrical signal travels down a wire at essentially a fixed speed, so you cannot make the bits themselves move faster. What you can do is compress the data so that fewer bits are needed and the message spends less time on the wire, or transmit several bits simultaneously. Computers use relatively short parallel connections between internal components, but convert those signals onto a serial bus for most external communications.
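
To put rough numbers on the "fewer bits, less time on the wire" idea, here is a small Python calculation; the link speed and message sizes are arbitrary example values, not figures from any real interface:

    # Simple arithmetic behind "fewer bits means less time on the wire"
    # (link speed and payload sizes below are arbitrary examples).

    link_bps = 115_200                      # bits per second the wire can carry
    payload_bits = 80_000                   # original message size
    compressed_bits = 32_000                # same message after compression

    print(payload_bits / link_bps)          # ~0.694 s uncompressed
    print(compressed_bits / link_bps)       # ~0.278 s compressed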

Let's compare serial and parallel communications.

With a serial connection, information is sent across one wire, one data bit at a time. The 9-pin serial connector on most PCs uses two loops of wire, one in each direction, for data communication, plus additional wires to control the flow of information. In any given direction, data is still flowing over a single wire.
A parallel connection sends the bits over more wires simultaneously. In the case of the 25-pin parallel port on your PC, there are eight data-carrying wires, so 8 bits travel at once. With eight wires carrying data, the parallel link theoretically transfers data eight times faster than a serial connection; in theory, it sends a whole byte in the time a serial connection sends a single bit.
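
To make the contrast concrete, here is a rough Python sketch; it is not real driver code, and the function names and the clock-tick model are purely illustrative:

    def serial_ticks(byte: int):
        """Serial: one data line, one bit per clock tick, LSB first (UART-style)."""
        for tick in range(8):
            yield tick, (byte >> tick) & 1            # (tick, bit on the single wire)

    def parallel_ticks(byte: int):
        """Parallel: eight data lines, the whole byte in a single clock tick."""
        yield 0, [(byte >> i) & 1 for i in range(8)]  # (tick, bits on wires 0..7)

    data = 0b01000001                                 # the ASCII letter 'A'
    print(list(serial_ticks(data)))     # 8 ticks on one wire
    print(list(parallel_ticks(data)))   # 1 tick across eight wires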

This explanation raises some questions. What does theoretically faster mean? And if parallel is faster than serial, is parallel more suitable for connecting to a WAN? In practice, serial links can often be clocked considerably faster than parallel links, and so achieve a higher data rate, because of two factors that limit parallel communications: clock skew and crosstalk interference.
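
A quick back-of-envelope comparison illustrates how a faster clock overcomes the narrower width; the clock rates below are made-up figures chosen only to show the arithmetic, not values from any particular port:

    # Raw throughput = clock rate x bits transferred per clock.
    # Clock rates here are illustrative, not from any specification.

    parallel_clock_hz = 2_000_000      # parallel port held to a modest clock
    parallel_width = 8                 # bits moved per clock

    serial_clock_hz = 25_000_000       # serial link clocked much faster
    serial_width = 1

    print(parallel_clock_hz * parallel_width)   # 16,000,000 bits/s
    print(serial_clock_hz * serial_width)       # 25,000,000 bits/s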

In a parallel connection, it is wrong to assume that the 8 bits leaving the sender at the same time arrive at the receiver at the same time. Rather, some of the bits arrive later than others. This is known as clock skew. Overcoming clock skew is not trivial: the receiving end must synchronize itself with the transmitter and then wait until all the bits have arrived. The process of reading, waiting, latching, waiting for the clock signal, and then passing on the 8 bits adds time to the transmission. (In parallel communications, a latch is the storage element that holds the bits in a sequential logic circuit.) The more wires used and the farther the connection reaches, the worse the problem becomes and the more delay is added. The need for this clocking slows parallel transmission well below theoretical expectations.
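
The following Python sketch models that waiting; the bit period, skew figure, and function name are invented for illustration, not measured from real hardware:

    # Rough model of clock skew: the 8 bits of a parallel transfer leave
    # together but settle at slightly different times, so the receiver
    # cannot latch the byte until the slowest bit has arrived.

    import random

    bit_period_ns = 10.0                  # nominal time per parallel clock cycle
    max_skew_ns = 4.0                     # worst-case spread between the wires

    def byte_transfer_time_ns() -> float:
        """Time until all 8 bits have settled and the latch can capture the byte."""
        arrivals = [bit_period_ns + random.uniform(0, max_skew_ns) for _ in range(8)]
        return max(arrivals)              # the receiver waits for the slowest bit

    samples = [byte_transfer_time_ns() for _ in range(10_000)]
    print(f"average wait per byte: {sum(samples) / len(samples):.2f} ns "
          f"(vs. {bit_period_ns:.2f} ns with no skew)")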

Clock skew is not a factor with serial links, because there is only a single data wire in each direction, so there are no parallel bits to keep aligned. Serial connections also require fewer wires and cables; they occupy less space and can be better isolated from interference from other wires and cables.

Parallel wires are physically bundled in a parallel cable, and signals can imprint themselves on each other. The possibility of crosstalk across the wires requires more processing, especially at higher frequencies. The serial buses on computers, including routers, compensate for crosstalk before transmitting the bits. Since serial cables have fewer wires, there is less crosstalk, and network devices transmit serial communications at higher, more efficient frequencies.

In most cases, serial communications are considerably cheaper to implement. Serial communications use fewer wires, cheaper cables, and fewer connector pins.