Shannon’s noisy channel coding theorem tells us that, for memoryless channels, the capacity is the maximum rate at which information can be transmitted reliably over a noisy channel. In general, computing the capacity is a difficult problem, so there has been extensive work on its asymptotics.
In the case of the additive Gaussian noise channel, the capacity is well known. Even so, it is useful to characterize its asymptotic behavior (as the SNR tends to zero or infinity), both for use in proofs and to provide simple design guidelines for real-world communication systems.
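As a concrete reference point, here is a minimal sketch of the exact AWGN capacity alongside its standard low- and high-SNR asymptotes. The function names are my own; the formulas are the usual ones, with SNR in linear units and capacity in bits per channel use.

```python
import math

def capacity(snr):
    """Exact AWGN capacity: C = (1/2) log2(1 + SNR) bits per channel use."""
    return 0.5 * math.log2(1 + snr)

def capacity_low_snr(snr):
    """Low-SNR asymptote: C ~ SNR / (2 ln 2) as SNR -> 0."""
    return snr / (2 * math.log(2))

def capacity_high_snr(snr):
    """High-SNR asymptote: C ~ (1/2) log2(SNR) as SNR -> inf."""
    return 0.5 * math.log2(snr)

# The asymptotes are tight at the extremes:
print(capacity(0.01), capacity_low_snr(0.01))   # nearly equal
print(capacity(1000), capacity_high_snr(1000))  # nearly equal
```

Both approximations avoid the `log2(1 + snr)` term, which is what makes them convenient in proofs and back-of-the-envelope link budgets.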
Despite this work on asymptotics, the behavior at medium SNR is harder to characterize without resorting to the exact capacity expression.
In this post, I look at the medium-SNR behavior of the Gaussian noise channel. It turns out that an SNR of 0 decibels is particularly special, and it suggests a way of obtaining simple capacity approximations at medium SNR for general classes of additive noise channels.
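To see why the medium-SNR regime needs separate treatment, the following sketch evaluates both asymptotes at an SNR of 1 (0 dB), which I take here as a representative medium-SNR point; the choice of 0 dB is an assumption from the surrounding context, not a result from the post.

```python
import math

snr = 1.0  # 0 dB in linear units (assumed representative medium-SNR point)

exact = 0.5 * math.log2(1 + snr)   # exact AWGN capacity: 0.5 bits/channel use
low = snr / (2 * math.log(2))      # low-SNR asymptote, overestimates here
high = 0.5 * math.log2(snr)        # high-SNR asymptote, collapses to 0 here

print(f"exact={exact:.3f}, low-SNR approx={low:.3f}, high-SNR approx={high:.3f}")
```

At 0 dB the low-SNR asymptote overshoots the true 0.5 bits by more than 40%, while the high-SNR asymptote gives 0 bits, so neither extreme is usable, which is precisely the gap a medium-SNR approximation would fill.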