Gaussian noise channels at medium SNR

Shannon’s noisy channel coding theorem tells us that the capacity of a memoryless channel is the maximum rate at which we can transmit information reliably over it. In general, computing the capacity is a difficult problem, and so there has been extensive work on its asymptotics.

In the case of the additive Gaussian noise channel, the capacity is well known. Even so, it is still interesting to characterize its asymptotic behavior (as the SNR tends to zero or infinity), whether for use in proofs or to provide simple design guidelines for real-world communication systems.

Despite this work on asymptotics, it is more difficult to characterize the behavior at medium SNR without resorting to the exact expression for the capacity.

In this post, I look at the behavior of the Gaussian noise channel at medium SNR. It turns out that an SNR of 0 decibels is particularly special, which suggests a way of obtaining simple capacity approximations at medium SNR for general classes of additive noise channels.

More precisely, I will look at the role of snr = 1 in the additive Gaussian noise channel Y = \sqrt{snr}X + N, where N is a standard Gaussian random variable (zero mean, unit variance) and snr is the signal-to-noise ratio (SNR). It is well known that in this case, the capacity (subject to the power constraint E[X^2] \leq 1) is given by

C = \frac{1}{2}\log_2(1 + snr) bits.
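
As a quick sanity check, this formula is easy to evaluate numerically. Here is a minimal Python sketch (the function name capacity_bits is my own):

    import numpy as np

    def capacity_bits(snr):
        # Capacity of the additive Gaussian noise channel in bits per channel use.
        return 0.5 * np.log2(1.0 + snr)

    print(capacity_bits(1.0))  # 0.5: at snr = 1, the capacity is exactly half a bit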

In information and communication theory, it is common to work in decibels (dB), so we can write snr = 10^{\rho/10}, where \rho is the SNR in decibels. Here are three situations where having snr = 1, i.e., \rho = 10\log_{10}(1) = 0 dB, is particularly special.

(1) The intercept of the high-SNR asymptote of the capacity C is at \rho = 0 dB.

As the SNR in dB \rho \rightarrow \infty, we have 1 + snr \approx snr = 10^{\rho/10}, so we can write

C^{\infty} = \frac{1}{2}\log_2(10^{\rho/10}) = \frac{\rho}{20}\log_2(10),

which is the linear asymptote in \rho. Observe that setting C^{\infty} = 0 gives the intercept \rho_{int} = 0 dB, as claimed.
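
This is easy to verify numerically: the gap between the capacity and the asymptote shrinks as \rho grows, and the asymptote itself passes through zero at \rho = 0 dB. A small sketch (function and variable names are mine):

    import numpy as np

    def capacity_bits(rho_db):
        # Exact capacity in bits as a function of the SNR in dB.
        return 0.5 * np.log2(1.0 + 10.0 ** (rho_db / 10.0))

    def asymptote_bits(rho_db):
        # High-SNR linear asymptote C_infty = (rho/20) log2(10).
        return (rho_db / 20.0) * np.log2(10.0)

    for rho in [10.0, 30.0, 60.0]:
        print(rho, capacity_bits(rho) - asymptote_bits(rho))  # gap tends to 0
    print(asymptote_bits(0.0))  # 0.0: the asymptote crosses zero at rho = 0 dB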

(2) The bend point of the capacity C occurs at \rho = 0 dB.

It is common to study the capacity as either the SNR tends to infinity or the SNR tends to zero. This is because, in general, it is difficult to obtain a simple characterization of the behavior at medium levels of SNR.

One approach to characterize the behavior of the capacity at medium SNR is to evaluate the bend point.

Definition: The bend point, \rho_{bend}, of the capacity is the SNR \rho such that the second derivative of the capacity C(10^{\rho/10}) is maximized.

This might seem an odd definition, but there is a lot of intuition here. In particular, observe that the second derivative provides information about how the rate of change of the capacity is varying. By finding where the second derivative is maximized, we find where the curve is the most bent.

A figure helps to illustrate the situation. Observe in Fig. 1 that as \rho \rightarrow -\infty (i.e., snr \rightarrow 0), the slope of the capacity curve tends to zero. As the SNR is increased, the slope increases until the curve reaches the high-SNR asymptote, where the slope is once again constant. As such, the second derivative is zero as both \rho \rightarrow -\infty and \rho \rightarrow \infty. In between, the second derivative is positive, and the bend point is where it attains its maximum.


Fig. 1: Plot of the Gaussian capacity and asymptotic capacity curves.

To find the bend point, it is not hard to show that the second derivative of the capacity has a unique maximum. As such, we can compute the third derivative and find where it is zero. After changing our units to nats (i.e., the logarithm is base e), we need to solve

\frac{1}{2}\left(\frac{\log 10}{10}\right)^3 \frac{10^{\rho/10} - 10^{2\rho/10}}{\left(1 + 10^{\rho/10}\right)^3} = 0.

The numerator vanishes only when 10^{\rho/10} = 10^{2\rho/10}, i.e., when 10^{\rho/10} = 1. The solution to this equation is therefore \rho_{bend} = 0 dB, as claimed.
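
As a numerical double check, one can skip the calculus and maximize the second derivative directly, e.g., with central finite differences (the grid and step size below are arbitrary choices of mine):

    import numpy as np

    def capacity_nats(rho_db):
        # Capacity in nats as a function of the SNR in dB.
        return 0.5 * np.log(1.0 + 10.0 ** (rho_db / 10.0))

    rho = np.linspace(-30.0, 30.0, 60001)  # dB grid centred on 0
    h = rho[1] - rho[0]
    # Central finite-difference approximation of the second derivative.
    d2 = (capacity_nats(rho + h) - 2.0 * capacity_nats(rho) + capacity_nats(rho - h)) / h**2

    print(rho[np.argmax(d2)])  # approximately 0: the bend point is at 0 dB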

(3) The cross-over point between the capacity curve and the MMSE curve is at \rho = 0 dB.

Suppose we want to estimate the Gaussian input X from the observation Y = \sqrt{snr}X + N, where as usual N is standard Gaussian noise. The error of an estimate f(Y) can be measured in the mean-square sense

E[(X - f(Y))^2].

The mean-square error is minimized by the conditional mean estimator

\hat{X}(Y;snr) = E[X|Y;snr] = \frac{\sqrt{snr}}{1 + snr}Y.

The minimum mean-square error (MMSE) is then given by

mmse(snr) = mmse(X|\sqrt{snr}X + N) = \frac{1}{1 + snr}.
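
Both the estimator and the MMSE formula can be checked by simulation: draw many samples of X and N, apply the estimator, and compare the empirical mean-square error with 1/(1 + snr). A minimal sketch (the sample size and seed are my choices):

    import numpy as np

    rng = np.random.default_rng(0)
    snr, n = 1.0, 10**6

    x = rng.standard_normal(n)                       # standard Gaussian input X
    y = np.sqrt(snr) * x + rng.standard_normal(n)    # observation Y = sqrt(snr) X + N
    x_hat = np.sqrt(snr) / (1.0 + snr) * y           # conditional mean estimate of X

    print(np.mean((x - x_hat) ** 2))  # empirical MSE, close to...
    print(1.0 / (1.0 + snr))          # ...the theoretical mmse(1) = 0.5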

To find the cross-over point (illustrated in Fig. 2), we then need to solve

\frac{1}{2}\log_2(1 + snr) = \frac{1}{1 + snr},

which has the solution snr = 1 (both sides equal 1/2), equivalent to \rho = 0 dB.
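
The cross-over can also be found numerically by locating the root of the difference of the two curves, e.g., with a bracketing root finder (the bracketing interval is my choice):

    import numpy as np
    from scipy.optimize import brentq

    def gap(snr):
        # Capacity (in bits) minus the MMSE; zero at the cross-over point.
        return 0.5 * np.log2(1.0 + snr) - 1.0 / (1.0 + snr)

    snr_cross = brentq(gap, 0.1, 10.0)
    print(snr_cross)                  # 1.0
    print(10 * np.log10(snr_cross))   # 0.0 dB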


Fig. 2: Plot of the capacity and MMSE curves.

Conclusions

We have seen that a signal-to-noise ratio of snr = 1, or \rho = 0 dB, is special in the case of additive Gaussian noise channels. Although each of these observations may appear trivial on its own, that all three hold together makes a compelling suggestion: there are medium-SNR features of the capacity connected both to its asymptotic behavior and to the behavior of the MMSE.

Further Reading

For more on the bend point and an application to multiuser MIMO, see

  • Egan, M., “Low-high SNR transition in multiuser MIMO,” Electronics Letters, vol. 51, no. 3, pp. 296-298, 2015.

For more on connections between the mutual information and MMSE in the Gaussian case, see

  • Guo, D., Shamai, S. and Verdú, S., “Mutual information and minimum mean-square error in Gaussian channels,” IEEE Transactions on Information Theory, vol. 51, no. 4, pp. 1261-1282, 2005.
  • Guo, D., Wu, Y., Shamai, S. and Verdú, S., “Estimation in Gaussian noise: properties of the minimum mean-square error,” IEEE Transactions on Information Theory, vol. 57, no. 4, pp. 2371-2385, 2011.

For connections in non-Gaussian noise, see

  • Guo, D., Shamai, S. and Verdú, S., “Additive non-Gaussian noise channels: mutual information and conditional mean estimation,” in Proc. of the IEEE International Symposium on Information Theory, 2005.