Abstract: Standards bodies such as the IEEE 802.3 Ethernet Working Group are developing standards for 800G and 1.6T transmission to satisfy the ever-increasing demand for bandwidth. To produce cost-effective and technically feasible transceivers, the assumptions underlying the channel model need to be reevaluated. For lower speed channels, the chromatic dispersion of single mode fibers was calculated using the extreme zero-dispersion wavelength and the maximum slope allowed by ITU-T standards. The dispersion penalty from these extreme fibers is very high for wavelength division multiplexed signals operating at 200 Gbaud.
This paper evaluates the chromatic dispersion of deployed single mode fibers by studying reported zero-dispersion wavelengths and slopes. A data set of more than 2,500,000 fibers from multiple manufacturers on multiple continents over the last decade is used to determine realistic chromatic dispersion parameters. These parameters are then used to calculate the expected maximum and minimum chromatic dispersion in 2 km and 10 km links. These results will show that a new single mode fiber standard is not needed for 800G and 1.6T transceivers, that these new transceivers can support the same reaches as earlier generations, and that transceiver manufacturers can design and test their transceivers against lower amounts of chromatic dispersion while maintaining high reliability. These results will contribute to economically feasible transceivers that enable the high-speed networks of the future.
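As a rough illustration of how such worst-case dispersion figures are derived, the sketch below evaluates the standard ITU-T G.652 chromatic dispersion approximation, D(λ) = (S0/4)·(λ − λ0⁴/λ³), at a worst-case corner of the specified parameter ranges. The corner values shown (λ0 = 1300 nm, S0 = 0.092 ps/(nm²·km)) and the function name are illustrative assumptions, not the paper's measured data.

```python
def chromatic_dispersion(wavelength_nm: float,
                         lambda0_nm: float,
                         s0_ps_nm2_km: float) -> float:
    """Dispersion coefficient D in ps/(nm*km), per the ITU-T G.652
    approximation D(lambda) = (S0/4) * (lambda - lambda0**4 / lambda**3)."""
    return (s0_ps_nm2_km / 4.0) * (
        wavelength_nm - lambda0_nm**4 / wavelength_nm**3
    )

# Illustrative worst-case corner: zero-dispersion wavelength at the low
# end of the allowed range and maximum slope (assumed values).
d = chromatic_dispersion(1310.0, 1300.0, 0.092)
link_km = 10.0
print(f"D = {d:.2f} ps/(nm*km); total over {link_km:.0f} km = {d * link_km:.1f} ps/nm")
```

Multiplying the coefficient by the link length gives the accumulated dispersion a transceiver must tolerate, which is why the choice of extreme versus realistic fiber parameters directly sets the design margin.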
About the Presenter: Dr. Earl Parsons is the Director of Data Center Architecture Evolution at CommScope. He joined CommScope in 2014 as a Principal Optical Engineer. Prior to joining CommScope, Earl received an MS and a PhD in optical sciences from the University of Arizona and was a Senior Member of Technical Staff at TE SubCom. Dr. Parsons also served as an editor of the IEEE 802.3db-2022 standard. His interests include multimode and single mode fiber optic systems that enable artificial intelligence data centers.


