The Foundations of the Digital Wireless World
University of Cyprus
Andrew J. Viterbi
Viterbi Group, LLC & University of Southern California
March 5, 2010

Pre-Digital Wireless History 1870-1948
  Maxwell's Equations predicting electromagnetic propagation
  Hertz: experimental verification of propagation
  Marconi: wireless telegraph to ships at sea
  Broadcast Radio
  Military uses in WW I and WW II: Radar
  Broadcast Television

Information Theory, Satellites, and Moore's Law 1948-1990

Information Theory and Its Precedents
  Statistical Precedents: C.R. Rao; H. Cramér
  Statistical Communications: N. Wiener; S.O. Rice
  Information Theory: Claude Shannon, "Mathematical Theory of Communication," Bell System Technical Journal (1948)
    Source Coding Theorem
    Channel Coding Theorem

Space and Satellites
  Soviet Sputnik: October 1957
  U.S. Explorer I: January 1958
  Initially for telemetry at very low rates. Why? Very low received signal power from 40,000 km, corrupted by noise: Signal-to-Noise S/N << 1
  Within 20 years, transmission of several megabits per second from the same orbit. How?

Solid-State Circuit Integration
  Transistor at Bell Laboratories 1947: Bardeen, Brattain, Shockley
  Integration: multiple devices on a chip: R. Noyce, G. Moore
  Moore's Law (1965): integration doubles every 18 months, with proportional power decrease, speed increase, and especially decreased cost

Increasing Satellite Communication Rates
  Increase Transmitted Signal Power: increases launch weight
  Increase Receiving Antenna Diameter: beyond 20 meters?
  Reduce Receiver Noise Temperature: cryogenically
  Reduce the Required S/N: how? By Information Theory methods

Why Satellite Communication, not Terrestrial? Low received power and a perfectly modeled channel

Shannon's Two Rate Bounds
  Minimum number of bits/second to accurately represent an information source (Source Coding Theorem)
  Maximum number of bits/second which may be transmitted error-free over a perturbed medium (Channel Coding Theorem)
Source Compression
  Source Coding (Rate-Distortion) Theorem
  For data, very effective even without prior statistics (universal coding); see the sketch below
  For voice and images, it fails to account for psychoacoustic and psychovisual effects
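A minimal illustration of the universal-coding point, using Python's standard zlib (an LZ77-plus-Huffman universal code); the sample data here is invented:

```python
import os
import zlib

# A universal code (zlib's LZ77 + Huffman) is handed no source statistics;
# it adapts to whatever redundancy the data actually contains.
samples = {
    "repetitive text": b"the quick brown fox jumps over the lazy dog " * 100,
    "random bytes": os.urandom(4096),   # nothing for the coder to exploit
}
for name, data in samples.items():
    packed = zlib.compress(data, 9)
    print(f"{name}: {len(data)} -> {len(packed)} bytes "
          f"({len(data) / len(packed):.1f}:1)")
```

The redundant text shrinks by an order of magnitude; the random bytes do not compress at all, exactly as the rate bound predicts.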
Compressed Voice
  Voice mostly within 4 kHz bandwidth
  Nyquist rate: 8K samples/sec
  With 8-bit quantization: 64 kbits/sec
  CELP compression to 8 kbits/sec (8:1)

CELP Voice Compression
  Model vocal tract and vocal cords by a digital filter driven by a small set of excitations contained in a codebook
  Input sample sequence from codebook -> digital filter (a shift register with tap multipliers) -> output matching voice
  Linear Predictive Coder with Codebook Excitation (CELP)
  Transmit only the filter parameters and the index of the codebook sample (a toy sketch follows)
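A toy analysis-by-synthesis sketch of the CELP idea; the 2-tap filter and the random 64-entry codebook are invented for illustration and are not the actual CELP standard:

```python
import numpy as np

rng = np.random.default_rng(0)

def synth(excitation, taps):
    """All-pole synthesis filter: a shift register with tap multipliers."""
    out = np.zeros(len(excitation))
    for n in range(len(excitation)):
        out[n] = excitation[n] + sum(
            a * out[n - k - 1] for k, a in enumerate(taps) if n > k)
    return out

# Invented model: 2 filter taps (the vocal tract) and a 64-entry random
# codebook of 40-sample excitation vectors (the vocal-cord pulses).
taps = np.array([1.2, -0.6])                 # filter parameters (stable)
codebook = rng.standard_normal((64, 40))     # known to coder and decoder

target = synth(codebook[37], taps)           # one frame of "voice"

# Analysis by synthesis: run every codebook entry through the filter and
# keep the index whose output best matches the target frame.
errors = [np.sum((synth(cw, taps) - target) ** 2) for cw in codebook]
index = int(np.argmin(errors))
print("transmit only: 2 filter taps + codebook index", index)  # 37
```

Only the taps and the index cross the channel; the decoder, holding the same codebook, reruns the filter to reconstruct the frame.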
Digital Images
  Analogue TV samples horizontally (approximately 450 lines per frame)
  Digital images (cameras and TV) sample the entire frame: 1M to 8M picture elements (pixels) in 3 primary colors
  High Definition TV: 1M pixels/frame at 60 frames/sec in 3 colors gives 180M samples/sec; with 8-bit quantization, 1.44 Gbits/sec (arithmetic checked below)
  With MPEG compression, 30 Mbits/sec (48:1)
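A two-line check of the slide's arithmetic:

```python
# Checking the HDTV numbers on the slide.
pixels, fps, colors, bits = 1_000_000, 60, 3, 8
raw = pixels * fps * colors * bits          # samples/sec times bits/sample
print(raw / 1e9, "Gbit/s raw")              # 1.44
print(raw / 48 / 1e6, "Mbit/s at 48:1")     # 30.0
```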
Image Compression (JPEG/MPEG)
  Divide the total pixel grid into 16 x 16 sub-grids
  Perform a spatial frequency transform (Discrete Cosine Transform, DCT)
  Quantize low-frequency components finely, high-frequency components coarsely (8:1)
  Utilize correlation among colors (3:1)
  For TV, utilize correlation between frames (2:1)
  (A sketch of the transform-and-quantize step follows.)
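A sketch of the transform-and-quantize step alone, with an invented smooth test block and invented step sizes; the color and interframe stages are omitted:

```python
import numpy as np

N = 16  # the slide's 16x16 sub-grid (baseline JPEG itself uses 8x8)

def dct_matrix(n):
    """Orthonormal DCT-II basis, so coeffs = D @ block @ D.T."""
    k, i = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    d = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    d[0] /= np.sqrt(2.0)
    return d

D = dct_matrix(N)
x, y = np.meshgrid(np.arange(N), np.arange(N))
block = 128 + 60 * np.sin(x / 5) + 40 * np.cos(y / 7)  # smooth test "image"

coeffs = D @ block @ D.T                    # spatial-frequency transform

# Quantize low frequencies finely, high frequencies coarsely:
# step sizes run from 2 up to 16, the slide's 8:1 spread.
step = 2.0 + 14.0 * (x + y) / (2 * (N - 1))
quantized = np.round(coeffs / step)

recon = D.T @ (quantized * step) @ D        # what the decoder rebuilds
print("nonzero coefficients:", np.count_nonzero(quantized), "of", N * N)
print("max pixel error:", round(float(np.abs(recon - block).max()), 2))
```

Most high-frequency coefficients of the smooth block quantize to zero, which is where the compression comes from; the reconstruction error stays a few levels out of 256.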
Channel Coding for Gaussian Noise
  Shannon Channel Coding Theorem, when the perturbation is additive Gaussian noise: R < W log2(1 + S/N), with rate R bits/sec and bandwidth W Hz

Minimum Bit Energy/Noise Density
  R < W log2(1 + S/N) and S/N = (Eb R)/(N0 W)
  Thus R/W < log2[1 + (Eb/N0)(R/W)]
  And Eb/N0 > (W/R)(2^(R/W) - 1), checked numerically below
  As W/R grows, the bound falls toward ln 2, i.e. -1.6 dB

[Figure: minimum Eb/N0 (dB) versus W/R (bandwidth/rate), falling from 0 dB at W/R = 1 toward -1.6 dB]
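A quick numerical check of the bound, together with the uncoded figure quoted on the next slide:

```python
import math

def min_ebno_db(w_over_r):
    """Shannon bound: Eb/N0 > (W/R) * (2**(R/W) - 1)."""
    return 10 * math.log10(w_over_r * (2 ** (1 / w_over_r) - 1))

for wr in (1, 2, 5, 10, 1000):
    print(f"W/R = {wr:4}: min Eb/N0 = {min_ebno_db(wr):5.2f} dB")
print(f"limit: 10*log10(ln 2) = {10 * math.log10(math.log(2)):.2f} dB")

# Uncoded BPSK needs about 10.5 dB for a 1e-6 bit error rate:
# BER = Q(sqrt(2 Eb/N0)) = erfc(sqrt(Eb/N0)) / 2.
ebno = 10 ** (10.5 / 10)
print(f"uncoded BER at 10.5 dB: {0.5 * math.erfc(math.sqrt(ebno)):.1e}")
```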
Potential Coding Gain
  To keep the error rate below 10^-6 (one in a million), uncoded digital communication requires Eb/N0 = 10.5 dB
  From the graph, with coding, the minimum Eb/N0 is 0 dB at W/R = 1, falling to -1.6 dB as W/R grows
  Thus the potential coding gain: 10 to 12 dB
  Early attempts (block codes) achieved 3 dB gain
  Convolutional codes achieved 6 dB gain
  Iterative decoding achieved over 9 dB gain (8:1 in power)

Channel Coding and Decoding: Half-Century Quest to Approach Shannon Limit
  [Block diagram: Coder -> Modulator/Transmitter -> Noisy Channel -> Receiver/Demodulator -> Decoder, with hard or soft decisions passed to the decoder]
  Chronology:
    Algebraic Block Codes (hard decisions)
    Convolutional Codes (soft decisions in)
    Iterative Decoding (Soft In-Soft Out, SISO): Turbo (convolutional) codes; Low-Density Parity-Check (block) codes (LDPC)
Convolutional Codes (Markov State Model)
  [Figure: encoder with an L-stage shift register, linear logic, and signal selector feeding a channel with likelihoods p(y|x); state diagram for L = 2 with states 00, 01, 10, 11 and branch labels x0 ... x7]
  Decoder problem: given the likelihood functions (soft inputs), find the most likely path traversed through the diagram
  Solution: a simple algorithm, 2 adders/comparators per state, followed by traceback (sketched below)
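A from-scratch sketch of the algorithm for a 4-state, rate-1/2 code; the (7,5) octal generators are a standard textbook choice assumed here, and full per-state path registers stand in for the traceback memory:

```python
import random

# Rate-1/2 convolutional code with 2 memory stages (4 states).
G = (0b111, 0b101)   # assumed generators, (7,5) in octal

def encode(bits, state=0):
    symbols = []
    for b in bits:
        reg = (b << 2) | state                    # input bit + 2-stage register
        symbols.append([bin(reg & g).count("1") & 1 for g in G])
        state = reg >> 1
    return symbols

def viterbi(received):
    """Add-compare-select over the 4-state trellis with soft (Euclidean) metrics."""
    INF = float("inf")
    metric = [0.0, INF, INF, INF]                 # encoder starts in state 0
    paths = [[] for _ in range(4)]
    for r0, r1 in received:
        new_metric, new_paths = [INF] * 4, [None] * 4
        for state in range(4):
            if metric[state] == INF:
                continue
            for b in (0, 1):                      # two branches leave each state
                reg = (b << 2) | state
                nxt = reg >> 1
                c = [1 - 2 * (bin(reg & g).count("1") & 1) for g in G]
                m = metric[state] + (r0 - c[0]) ** 2 + (r1 - c[1]) ** 2
                if m < new_metric[nxt]:           # compare/select the survivor
                    new_metric[nxt] = m
                    new_paths[nxt] = paths[state] + [b]
        metric, paths = new_metric, new_paths
    return paths[min(range(4), key=lambda s: metric[s])]

random.seed(1)
msg = [random.randint(0, 1) for _ in range(20)]
sent = [[1 - 2 * bit for bit in pair] for pair in encode(msg)]   # map to +/-1
noisy = [[s + random.gauss(0, 0.6) for s in pair] for pair in sent]
print("bit errors:", sum(a != b for a, b in zip(msg, viterbi(noisy))))
```

Each trellis step is exactly the slide's recipe: per state, two additions of branch metrics and one comparison to keep the survivor.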
Convolutional Codes (Soft Input Only) get only part way to the Shannon limit
But much broader applications of the Markov model concept have evolved, e.g.:
  Speech Recognition
  Magnetic Recording
  DNA Sequence Analysis
Hidden Markov Model
  [Figure: four-state hidden Markov model with states S0-S3 and transitions x00 ... x33]

Parting the Clouds
  [Figure: the same four-state hidden Markov model]
  Examples of HMMs:
    Speech Recognition
    DNA Sequence Alignment
  (A minimal worked example follows.)
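A minimal worked HMM example; the two-state model and all of its probabilities are invented for illustration. The recursion is the same add-compare-select as in convolutional decoding, here in log probabilities:

```python
import math

# Toy hidden Markov model: which of two hidden states ("voiced"/"unvoiced",
# or "CpG island"/"background") most likely produced each observation?
states = ("A", "B")
start = {"A": 0.5, "B": 0.5}
trans = {"A": {"A": 0.9, "B": 0.1}, "B": {"A": 0.1, "B": 0.9}}
emit  = {"A": {"x": 0.8, "y": 0.2}, "B": {"x": 0.3, "y": 0.7}}

def viterbi_path(obs):
    """Most likely hidden-state path, by add-compare-select plus traceback."""
    score = {s: math.log(start[s] * emit[s][obs[0]]) for s in states}
    back = []
    for o in obs[1:]:
        prev = score
        score, choices = {}, {}
        for s in states:
            best = max(states, key=lambda p: prev[p] + math.log(trans[p][s]))
            score[s] = prev[best] + math.log(trans[best][s] * emit[s][o])
            choices[s] = best
        back.append(choices)
    s = max(states, key=score.get)       # trace back from the best final state
    path = [s]
    for choices in reversed(back):
        s = choices[s]
        path.append(s)
    return "".join(reversed(path))

print(viterbi_path("xxxyyyyxxx"))   # likely AAABBBBAAA
```

The sticky transition probabilities (0.9 to stay) make the decoder prefer long runs, so a burst of "y" observations flips the path to state B and back.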
Decoder Technology Evolution
  1960s: rack of equipment
  1970s: single drawer (some integration)
  1980s: silicon chip (full integration)
  1990s+: fraction of a chip

Digital Wireless Evolution
  Theoretical Foundations: Information Theory
  Application: Satellite Communication (commercial and direct broadcast)
  Enabling Technology: Solid-State Integration
  Primary Beneficiary: Personal Mobile Communication