propagation delay - Computer Definition

The time required for a signal to travel from one point to another, generally from a transmitter through a medium to a receiver. Propagation delay depends on the nature of the electromagnetic signal, as not all signals travel at the same speed through a medium. Propagation delay is also influenced by the distance between the two points, the density of the medium, and the presence of passive devices such as loading coils that can increase the impedance of the medium. See also impedance, loading coil, medium, and velocity of propagation (Vp).

The time it takes a signal to travel from one place to another. Propagation delay depends solely on the distance and the speed at which the signal propagates; signals traveling through a wire or optical fiber generally move at about two-thirds the speed of light in a vacuum. Contrast with nodal processing delay.
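
As a rough illustration of the relationship described above, the following sketch computes propagation delay as distance divided by signal speed. The two-thirds velocity factor reflects the definition's figure for wire or fiber; the 100 km link length and the function name are assumptions chosen for demonstration, not values from the definitions.

    # Sketch: propagation delay = distance / propagation speed.
    # The ~2/3 velocity factor matches the definition above for wire
    # or fiber; the 100 km example link length is hypothetical.

    SPEED_OF_LIGHT = 299_792_458.0  # meters per second, in a vacuum

    def propagation_delay(distance_m: float, velocity_factor: float = 2 / 3) -> float:
        """Return one-way propagation delay in seconds over a given distance."""
        signal_speed = velocity_factor * SPEED_OF_LIGHT  # speed in the medium
        return distance_m / signal_speed

    # Example: a 100 km fiber run takes roughly half a millisecond one way.
    delay_s = propagation_delay(100_000)
    print(f"One-way delay over 100 km: {delay_s * 1e3:.3f} ms")  # ~0.500 ms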