Why It Matters in Communication Systems
- High-Speed Data Transfer: SerDes enables gigabit-per-second data rates, which are essential for applications like Ethernet, PCIe, USB, and optical fiber communication.
- Bandwidth Efficiency: By serializing data, SerDes allows more information to be sent over fewer channels, maximizing the use of available bandwidth.
- Signal Integrity: Serial transmission reduces issues like crosstalk and electromagnetic interference (EMI) that are common in parallel buses, especially over long distances.
- Scalability: As systems scale (e.g., data centers, 5G networks), SerDes allows for denser interconnects without increasing complexity or cost.
- Power and Space Savings: Fewer wires and connectors mean lower power consumption and smaller form factors, which are critical in mobile and embedded systems.
SerDes is essential in environments where bandwidth is critical and space is limited, such as:
- On-chip interconnects (e.g., between processor cores)
- Chip-to-chip communication (e.g., between ASICs or FPGAs)
- Board-to-board and system-level links (e.g., in servers and switches)
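The core idea behind a SerDes, parallel data shifted out one bit at a time and reassembled at the receiver, can be sketched in a few lines of Python. This is a toy 8-bit model for illustration only; a real SerDes adds line encoding, embedded clocking, and equalization:

```python
def serialize(word: int, width: int = 8) -> list[int]:
    """Parallel-to-serial: shift out one bit per 'clock', MSB first."""
    return [(word >> (width - 1 - i)) & 1 for i in range(width)]

def deserialize(bits: list[int]) -> int:
    """Serial-to-parallel: reassemble the word at the receiver."""
    word = 0
    for b in bits:
        word = (word << 1) | b
    return word

# A byte crosses the "link" as a single-wire bitstream and arrives intact
assert deserialize(serialize(0xA5)) == 0xA5
```

The round trip over a single conceptual wire is what lets SerDes replace a wide parallel bus with one differential pair.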
Historical Trends in SerDes Technology
Over the past two decades, SerDes technology has advanced dramatically to keep pace with the explosion of data and compute demands. Here's a breakdown of key milestones:
- 10G SerDes (Early 2000s): Enabled 10 Gigabit Ethernet and early high-speed backplanes.
- 28G SerDes (Early 2010s): Supported 100G Ethernet, PCIe Gen3, and early optical modules.
- 56G SerDes (Mid-2010s): Powered 400G Ethernet, PCIe Gen4, and high-density switch fabrics.
- 112G SerDes (2020s): Enabled 800G Ethernet, PCIe Gen5/6, and AI/ML accelerator interconnects.
- 224G SerDes (Emerging): The latest generation, targeting 1.6T Ethernet, PCIe Gen7, and ultra-high-speed chiplet communication.
Each generation has required innovations in signal processing, packaging, and materials to overcome challenges like signal loss, crosstalk, and power efficiency.
Where is SerDes Used?
SerDes is a cornerstone of modern digital infrastructure. Its applications span across:
- Data Centers: High-speed links between servers, switches, and storage systems.
- Networking Equipment: Routers, switches, and optical transceivers rely on SerDes for fast, reliable data movement.
- AI/ML Accelerators: GPUs and custom silicon (e.g., TPUs) use SerDes to move massive datasets between compute nodes.
- Storage Systems: Interfaces like NVMe and SAS use SerDes to achieve high throughput and low latency.
- Consumer Electronics: HDMI, DisplayPort, and USB use SerDes to transmit video, audio, and data over compact cables.
Why Are SerDes Speeds Increasing?
The demand for faster SerDes is driven by several converging trends:
- AI and Machine Learning: Training large models requires moving petabytes of data quickly between compute elements.
- Cloud and Edge Computing: More users, more devices, and more data mean higher throughput requirements.
- Power and Area Efficiency: Higher-speed SerDes reduce the number of lanes needed, saving board space and lowering power per bit.
- Next-Gen Standards: Technologies like 1.6T Ethernet, PCIe Gen6/7, and CXL require ultra-fast, low-latency interconnects.
As workloads scale, SerDes must evolve to deliver more bandwidth without compromising reliability or efficiency.
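The lane-count saving mentioned above is simple arithmetic. Assuming nominal payload rates of roughly 50G and 100G per lane for 56G- and 112G-class SerDes (line rates run higher due to FEC and encoding overhead), a quick sketch:

```python
import math

def lanes_needed(total_gbps: float, per_lane_gbps: float) -> int:
    """Number of SerDes lanes required for an aggregate bandwidth."""
    return math.ceil(total_gbps / per_lane_gbps)

# An 800G port needs 16 lanes at ~50G payload each, but only 8 at ~100G
print(lanes_needed(800, 50), lanes_needed(800, 100))  # 16 8
```

Halving the lane count halves the connector pins, package bumps, and board routing a port consumes, which is the practical payoff of each SerDes generation.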
Modulation Techniques in 56G/112G/224G SerDes
To achieve rates up to 224 Gbps per lane, SerDes systems use advanced modulation schemes that increase the number of bits transmitted per symbol:
- PAM4 (Pulse Amplitude Modulation with 4 levels): Transmits 2 bits per symbol. Widely used in 56G and 112G SerDes.
- PAM6/PAM8: Experimental schemes that transmit log2(6) ≈ 2.58 or 3 bits per symbol, respectively. These are being explored for 448G to improve bandwidth efficiency.
- Coherent Modulation: In optical links, coherent techniques (e.g., QPSK, QAM) may be used for long-reach, high-capacity transmission.
These schemes improve data rates but also increase sensitivity to noise, jitter, and signal distortion, requiring more sophisticated equalization and error correction.
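The trade-off between bit rate and symbol rate follows directly from bits per symbol = log2(levels). A short illustration (the Gbps/GBd figures are nominal line rates, ignoring coding overhead):

```python
import math

def symbol_rate_gbd(bit_rate_gbps: float, pam_levels: int) -> float:
    """Baud rate required on the wire for a given bit rate and PAM order."""
    bits_per_symbol = math.log2(pam_levels)
    return bit_rate_gbps / bits_per_symbol

# 224 Gbps with PAM4 (2 bits/symbol) needs only a 112 GBd symbol rate
print(symbol_rate_gbd(224, 4))  # 112.0
# PAM6 carries log2(6) ~ 2.58 bits/symbol, trimming the baud rate further
print(round(symbol_rate_gbd(224, 6), 1))
```

This is why PAM4 is attractive: it doubles the data rate of NRZ at the same symbol rate, at the cost of a smaller eye opening per level.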
Why is Reference Clock Jitter So Important?
At ultra-high speeds like 56G/112G/224G, reference clock jitter becomes a critical performance limiter. Here's why:
- Signal Integrity: Jitter introduces timing uncertainty, which can cause sampling errors and degrade the eye diagram.
- Receiver Performance: High jitter can overwhelm the receiver’s clock and data recovery (CDR) circuits, leading to bit errors.
- Standards Compliance: Protocols like IEEE 802.3 and OIF-CEI define strict jitter budgets to ensure interoperability across vendors.
Minimizing jitter is essential for maintaining a wide, open eye diagram and ensuring reliable, high-speed communication. Techniques like low-jitter PLLs, clean power supplies, and careful PCB layout are crucial in SerDes design.
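A common way to budget jitter is the dual-Dirac model, where total jitter at a target bit-error rate combines peak-to-peak deterministic jitter (DJ) with random jitter (RJ) scaled by the Gaussian Q-factor: TJ = DJ_pp + 2·Q(BER)·RJ_rms. A minimal sketch with hypothetical numbers:

```python
from statistics import NormalDist

def total_jitter_pp(dj_pp_ps: float, rj_rms_ps: float,
                    ber: float = 1e-12) -> float:
    """Dual-Dirac total jitter (ps, peak-to-peak) at a target BER."""
    q = -NormalDist().inv_cdf(ber)  # Q ~ 7.03 for BER = 1e-12
    return dj_pp_ps + 2 * q * rj_rms_ps

# Hypothetical budget: 0.10 ps-pp deterministic, 0.02 ps-rms random jitter
tj = total_jitter_pp(0.10, 0.02)
```

Because RJ is unbounded, it is multiplied by roughly 14 at a 1e-12 BER, which is why datasheet femtoseconds of RMS reference-clock jitter matter so much against a unit interval of only a few picoseconds at 112 GBd.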
Recommended Skyworks Clocks
The following Skyworks clocks are recommended based on reference clock jitter requirements: