
In digital communication or data transmission, $E_b/N_0$ (energy per bit to noise power spectral density ratio) is a normalized signal-to-noise ratio (SNR) measure, also known as the "SNR per bit". It is especially useful when comparing the bit error rate (BER) performance of different digital modulation schemes without taking bandwidth into account.
As the description implies, $E_b$ is the signal energy associated with each user data bit; it is equal to the signal power divided by the user bit rate (not the channel symbol rate). If signal power is in watts and bit rate is in bits per second, $E_b$ is in units of joules (watt-seconds). $N_0$ is the noise spectral density, the noise power in a 1 Hz bandwidth, measured in watts per hertz or joules.
These are the same units as $E_b$, so the ratio $E_b/N_0$ is dimensionless; it is frequently expressed in decibels. $E_b/N_0$ directly indicates the power efficiency of the system without regard to modulation type, error correction coding or signal bandwidth (including any use of spread spectrum). This also avoids any confusion as to which of several definitions of "bandwidth" to apply to the signal.
But when the signal bandwidth is well defined, $E_b/N_0$ is also equal to the signal-to-noise ratio (SNR) in that bandwidth divided by the "gross" link spectral efficiency in (bit/s)/Hz, where the bits in this context again refer to user data bits, irrespective of error correction information and modulation type.[1]
$E_b/N_0$ must be used with care on interference-limited channels since additive white noise (with constant noise density $N_0$) is assumed, and interference is not always noise-like. In spread spectrum systems (e.g., CDMA), the interference is sufficiently noise-like that it can be represented as $I_0$ and added to the thermal noise $N_0$ to produce the overall ratio $E_b/(N_0 + I_0)$.
$E_b/N_0$ is closely related to the carrier-to-noise ratio (CNR or $C/N$), i.e. the signal-to-noise ratio (SNR) of the received signal, after the receiver filter but before detection:

$$\frac{C}{N} = \frac{E_b}{N_0} \cdot \frac{f_b}{B}$$

where

$f_b$ is the channel data rate (net bit rate) and
B is the channel bandwidth.

The equivalent expression in logarithmic form (dB):

$$\left(\frac{C}{N}\right)_{\mathrm{dB}} = \left(\frac{E_b}{N_0}\right)_{\mathrm{dB}} + 10\log_{10}\left(\frac{f_b}{B}\right)$$
Caution: Sometimes, the noise power is denoted by $N_0/2$ when negative frequencies and complex-valued equivalent baseband signals are considered rather than passband signals, and in that case, there will be a 3 dB difference.
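The conversion between $C/N$ and $E_b/N_0$ in the logarithmic form above can be sketched numerically; the function name and example figures below are illustrative, not from the source:

```python
import math

def ebn0_db_from_cnr(cnr_db: float, bit_rate: float, bandwidth: float) -> float:
    """Eb/N0 (dB) from C/N (dB): Eb/N0 = C/N - 10*log10(fb/B)."""
    return cnr_db - 10 * math.log10(bit_rate / bandwidth)

# Example: C/N = 10 dB carrying 2 Mbit/s in a 1 MHz channel.
# Packing 2 bits per second per hertz costs 10*log10(2) ≈ 3 dB per bit.
print(ebn0_db_from_cnr(10.0, 2e6, 1e6))  # ≈ 6.99 dB
```

When the bit rate equals the bandwidth ($f_b = B$), $E_b/N_0$ and $C/N$ coincide.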
$E_b/N_0$ can be seen as a normalized measure of the energy per symbol to noise power spectral density ($E_s/N_0$):

$$\frac{E_b}{N_0} = \frac{E_s}{\rho N_0}$$

where $E_s$ is the energy per symbol in joules and ρ is the nominal spectral efficiency in (bits/s)/Hz.[2] $E_s/N_0$ is also commonly used in the analysis of digital modulation schemes. The two quotients are related to each other according to the following:

$$\frac{E_s}{N_0} = \frac{E_b}{N_0}\log_2 M,$$

where M is the number of alternative modulation symbols, e.g. $M = 4$ for QPSK and $M = 8$ for 8PSK.
This is the energy per bit, not the energy per information bit.
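The relation $E_s/N_0 = (E_b/N_0)\log_2 M$ can be sketched as follows; the function name is an illustrative assumption:

```python
import math

def esn0_db_from_ebn0(ebn0_db: float, M: int) -> float:
    """Es/N0 (dB) from Eb/N0 (dB) for an M-ary constellation:
    each symbol carries log2(M) bits, so Es/N0 = Eb/N0 + 10*log10(log2 M)."""
    return ebn0_db + 10 * math.log10(math.log2(M))

# QPSK (M=4) carries 2 bits/symbol, so Es/N0 sits 10*log10(2) ≈ 3 dB above Eb/N0.
print(esn0_db_from_ebn0(6.0, 4))  # ≈ 9.01 dB
```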
$E_s/N_0$ can further be expressed as:

$$\frac{E_s}{N_0} = \frac{C}{N} \cdot \frac{B}{f_s}$$

where

$C/N$ is the carrier-to-noise ratio or signal-to-noise ratio,
B is the channel bandwidth in hertz, and
$f_s$ is the symbol rate in baud or symbols per second.
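The expression above can be sketched the same way; the function name and figures are illustrative:

```python
import math

def esn0_db_from_cnr(cnr_db: float, bandwidth: float, symbol_rate: float) -> float:
    """Es/N0 (dB) from C/N (dB): Es/N0 = C/N + 10*log10(B/fs)."""
    return cnr_db + 10 * math.log10(bandwidth / symbol_rate)

# At Nyquist-rate signalling (fs = B) the correction term vanishes,
# so Es/N0 equals C/N.
print(esn0_db_from_cnr(12.0, 1e6, 1e6))  # 12.0 dB
```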
The Shannon–Hartley theorem says that the limit of reliable information rate (data rate exclusive of error-correcting codes) of a channel depends on bandwidth and signal-to-noise ratio according to:

$$I < B \log_2\left(1 + \frac{S}{N}\right)$$

where

I is the information rate in bits per second excluding error-correcting codes,
B is the bandwidth of the channel in hertz,
S is the total signal power (equivalent to the carrier power C), and
N is the total noise power in the bandwidth.
This equation can be used to establish a bound on $E_b/N_0$ for any system that achieves reliable communication, by considering a gross bit rate R equal to the net bit rate I and therefore an average energy per bit of $E_b = S/R$, with noise spectral density of $N_0 = N/B$. For this calculation, it is conventional to define a normalized rate $R_l = R/(2B)$, a bandwidth utilization parameter of bits per second per half hertz, or bits per dimension (a signal of bandwidth B can be encoded with $2B$ dimensions, according to the Nyquist–Shannon sampling theorem). Making appropriate substitutions, the Shannon limit is:

$$R_l < \frac{1}{2}\log_2\left(1 + \frac{2 R_l E_b}{N_0}\right),$$

which can be solved to get the Shannon-limit bound on $E_b/N_0$:

$$\frac{E_b}{N_0} > \frac{2^{2R_l} - 1}{2R_l}.$$

When the data rate is small compared to the bandwidth, so that $R_l$ is near zero, the bound, sometimes called the ultimate Shannon limit,[3] is:

$$\frac{E_b}{N_0} > \ln 2,$$

which corresponds to −1.59 dB.
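The bound $(2^{2R_l} - 1)/(2R_l)$ and its low-rate limit can be checked numerically; the function name is an illustrative assumption:

```python
import math

def shannon_ebn0_limit_db(rate_per_dim: float) -> float:
    """Minimum Eb/N0 (dB) for reliable communication at normalized rate
    R_l bits per dimension: Eb/N0 > (2**(2*R_l) - 1) / (2*R_l)."""
    linear = (2 ** (2 * rate_per_dim) - 1) / (2 * rate_per_dim)
    return 10 * math.log10(linear)

for rl in (1.0, 0.5, 0.1, 0.01):
    print(rl, shannon_ebn0_limit_db(rl))

# As R_l -> 0 the bound approaches 10*log10(ln 2):
print(10 * math.log10(math.log(2)))  # ≈ -1.59 dB
```

Note that the bound is monotonically increasing in $R_l$: at $R_l = 0.5$ it is 0 dB, and it only approaches −1.59 dB as the rate per dimension goes to zero.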
This often-quoted limit of −1.59 dB applies only to the theoretical case of infinite bandwidth. The Shannon limit for finite-bandwidth signals is always higher.
For any given system of coding and decoding, there exists what is known as a cutoff rate $R_0$, typically corresponding to an $E_b/N_0$ about 2 dB above the Shannon capacity limit.[citation needed] The cutoff rate used to be thought of as the limit on practical error correction codes without an unbounded increase in processing complexity, but has been rendered largely obsolete by the more recent discovery of turbo codes, low-density parity-check (LDPC) and polar codes.