Question (medium)

Consider a 100 Mbps link between an earth station (sender) and a satellite (receiver) at an altitude of 2100 km. The signal propagates at a speed of \(3 \times 10^8 \ \text{m/s}\). The time taken (in milliseconds, rounded off to two decimal places) for the receiver to completely receive a packet of 1000 bytes transmitted by the sender is

Hint: The total time for packet reception includes both propagation delay and transmission time.
Updated On: Jan 30, 2026

Correct Answer: 7.08

Solution and Explanation

Given Parameters:

  • Distance ($d$) = 2100 km = $2.1 \times 10^6$ m
  • Propagation Speed ($s$) = $3 \times 10^8$ m/s
  • Packet Size ($L$) = 1000 bytes = 8000 bits
  • Bandwidth ($B$) = 100 Mbps = $10^8$ bits/sec

Step 1: Calculate Propagation Delay ($T_p$)

This is the time required for a single bit to travel from the sender to the receiver over the physical distance:

$T_p = \frac{d}{s} = \frac{2.1 \times 10^6 \text{ m}}{3 \times 10^8 \text{ m/s}} = 0.007 \text{ s} = \mathbf{7.0 \text{ ms}}$
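As a quick check, the propagation delay can be computed directly (variable names here are illustrative, not from the question):

```python
# Propagation delay: time for a single bit to cross the link.
distance_m = 2.1e6   # 2100 km expressed in metres
speed_mps = 3e8      # signal propagation speed in m/s

t_prop_ms = distance_m / speed_mps * 1000  # convert seconds to ms
print(t_prop_ms)  # 7.0
```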


Step 2: Calculate Transmission Delay ($T_t$)

This is the time it takes for the sender to "emit" the entire 1000-byte packet into the link:

$T_t = \frac{L}{B} = \frac{8000 \text{ bits}}{100 \times 10^6 \text{ bits/sec}} = 0.00008 \text{ s} = \mathbf{0.08 \text{ ms}}$
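The same arithmetic in a short sketch (note the byte-to-bit conversion, which is the usual slip in this step):

```python
# Transmission delay: time to push all packet bits onto the link.
packet_bits = 1000 * 8    # 1000 bytes -> 8000 bits
bandwidth_bps = 100e6     # 100 Mbps

t_trans_ms = packet_bits / bandwidth_bps * 1000  # convert seconds to ms
print(t_trans_ms)  # 0.08
```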


Step 3: Sum the Total Latency

The packet is considered "completely received" only after the last bit has finished both the transmission and propagation phases:

$\text{Total Delay} = T_p + T_t = 7.0 \text{ ms} + 0.08 \text{ ms} = \mathbf{7.08 \text{ ms}}$
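Putting the two steps together, a minimal helper (the function name is an assumption for illustration) reproduces the final figure:

```python
def total_delay_ms(distance_m, speed_mps, packet_bytes, bandwidth_bps):
    """Total time to completely receive a packet: propagation + transmission."""
    t_prop = distance_m / speed_mps            # seconds
    t_trans = packet_bytes * 8 / bandwidth_bps # seconds
    return (t_prop + t_trans) * 1000           # milliseconds

# Values from the question: 2100 km, 3e8 m/s, 1000 bytes, 100 Mbps.
print(round(total_delay_ms(2.1e6, 3e8, 1000, 100e6), 2))  # 7.08
```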


Final Answer:

The time taken to completely receive the packet is: 7.08 ms
