To stay ahead of demand for high-bandwidth capabilities and to deliver on the promise of emerging services and applications, multiple system operators (MSOs) are evolving their hybrid fiber/coax (HFC) networks. Architectures like Remote PHY and DOCSIS 3.1 depend on fiber-based bandwidth deep into the CATV network. That’s why many operators are pursuing “fiber deep” initiatives to create Node+0 architectures.
So how can operators bring fiber closer to their subscribers while optimizing the installed optical base? In many instances, a multiwavelength architecture like coarse wavelength division multiplexing (CWDM) and dense wavelength division multiplexing (DWDM) can deliver the best value – along with changes and challenges to the cabling infrastructure. This article, which provides background information on multiwavelength technology and why the move to fiber deep is taking place now, is the first in a series that describes several aspects of the extension of fiber deeper into cable networks. Other articles in the series cover WDM component technology and specification parameters, multiwavelength transmission choices and the impairment factors that should be considered, and network architecture and cabling strategies.
Background on today’s fiber deep migration
CATV operators have been deploying fiber since the early to mid-1990s. HFC networks were traditionally designed using one or two wavelengths (1310 and 1550 nm), with fiber delivered (or “fed”) to a node and coax then used to “distribute” the RF signal to the customer. See these “feeder” and “distribution” sections in Figure 1.
In this network configuration, the feeder fibers service the nodes, and businesses are given a dedicated fiber – a winning approach until operators began running out of fibers. To counter this fiber depletion, operators began deploying CWDM to deliver wavelength services to businesses. This network was overlaid on the existing infrastructure, as shown in Figure 2.

Today, operators are driving fiber deep for Node+0 architectures. Fibers in the feeder portion of the network have again been depleted, but this time operators have turned to DWDM to deliver fiber deeper – closer to the customer. The DWDM wavelengths are delivered to a “parent” node, then distributed deeper into the network to a “child” node. In many cases this network is overlaid on top of the existing infrastructure, driven by the desire to reuse as much feeder fiber as possible while adding new cable to the child nodes (see Figure 3).
Moving from diplexers/triplexers (1-2 wavelengths) and CWDM (16-18 wavelengths) to DWDM (40+ wavelengths) will bring changes to network cabling and components. To better understand what these changes may mean for the network, let’s dive into the CWDM and DWDM standards.
CWDM and DWDM standards and definitions
The International Telecommunication Union (ITU) is the standards body that provides guidance for CWDM and DWDM. One of its three divisions, the ITU Telecommunication Standardization Sector (ITU-T), coordinates standards for telecommunications. See Figure 4 for a look across some of the standards the ITU-T covers for an HFC network.
The first areas that the ITU-T defines are the operational bands where CWDM and DWDM are deployed. The bands are defined as O-band (original), E-band (extended), S-band (short), C-band (conventional) and L-band (long).
Single-mode fiber transmission began in the O-band and was developed to take advantage of the transmission performance of the glass fiber at 1310 nm.
To take advantage of the lower loss at 1550 nm, fiber was developed for the C-band. As links became longer and fiber amplifiers began being used instead of optical-to-electronic-to-optical repeaters, the C-band became more important. With the adoption of DWDM systems, use of this band was expanded. Development of new fiber amplifiers continued to expand DWDM upward to the L-band.
Figure 5 illustrates these bands. The complete electromagnetic (EM) spectrum is used as a reference.
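To make the band layout concrete, the boundaries can be expressed in a few lines of code. This is an illustrative sketch, not part of any standard's text; the band edges below follow the commonly cited ITU-T definitions, and the helper function name is our own:

```python
# Sketch: classifying a wavelength into its ITU-T operating band.
# Band edges use the commonly cited definitions:
#   O: 1260-1360 nm, E: 1360-1460, S: 1460-1530, C: 1530-1565, L: 1565-1625
BANDS = [
    ("O", 1260, 1360),  # original
    ("E", 1360, 1460),  # extended
    ("S", 1460, 1530),  # short
    ("C", 1530, 1565),  # conventional
    ("L", 1565, 1625),  # long
]

def band_of(wavelength_nm):
    """Return the band letter for a wavelength in nm, or None if out of range."""
    for name, lo, hi in BANDS:
        if lo <= wavelength_nm < hi:
            return name
    return None

print(band_of(1310))  # O -- the legacy HFC wavelength
print(band_of(1550))  # C -- the low-loss window used by DWDM
```

Note how the three wavelengths traditionally used in HFC networks (1310, 1490 and 1550 nm) land in three different bands.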
ITU-T G.694.1 and G.694.2 are the standards that cover DWDM and CWDM. ITU-T G.694.1 (DWDM) provides information including:
- Definitions of frequency grid: slot and width
- Center frequencies, including the nominal central frequencies within the C-band and L-band
- Frequency grid, including recommended support for a variety of fixed channel spacings ranging from 12.5 to 100 GHz.
Figure 6 shows an example of a frequency grid.
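As a quick sketch of how the grid works (the function names here are ours, not the standard's): each G.694.1 channel sits at the 193.1 THz anchor plus an integer multiple of the chosen spacing, and the corresponding wavelength follows from λ = c/f:

```python
C_M_PER_S = 299_792_458  # speed of light in vacuum, m/s

def dwdm_center_thz(n, spacing_ghz=100.0):
    """Nominal center frequency (THz) of grid channel n per ITU-T G.694.1:
    193.1 THz + n x spacing (spacing may be 12.5, 25, 50, or 100 GHz)."""
    return 193.1 + n * spacing_ghz / 1000.0

def thz_to_nm(f_thz):
    """Convert a center frequency in THz to its wavelength in nm."""
    return C_M_PER_S / (f_thz * 1e12) * 1e9

# Example: a 40-channel, 100 GHz C-band lineup around the 193.1 THz anchor
for n in range(-20, 20):
    f = dwdm_center_thz(n)
    print(f"n={n:+3d}: {f:.2f} THz  ({thz_to_nm(f):.2f} nm)")
```

The anchor channel at 193.1 THz corresponds to roughly 1552.52 nm, squarely in the C-band.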
ITU-T G.694.2 (CWDM) defines key information on:
- Definitions of wavelength grid: spacing and width
- Nominal central wavelengths; the grid wavelengths are within the range 1271 to 1611 nm
- Central wavelength spacing and wavelength variation; effective CWDM with uncooled lasers and wide passband filters requires a nominal central wavelength spacing of not less than 20 nm.
Figure 7 shows an example of a wavelength grid.
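The CWDM grid is simple enough to enumerate directly. A minimal sketch, assuming the 1271-1611 nm range and 20 nm spacing described above:

```python
# ITU-T G.694.2 nominal center wavelengths: 1271 nm through 1611 nm
# in 20 nm steps, for 18 channels in total.
cwdm_grid_nm = list(range(1271, 1612, 20))

print(len(cwdm_grid_nm))   # number of channels
print(cwdm_grid_nm[:4])    # the first few O-band channels
```

The wide 20 nm spacing is what lets CWDM systems use inexpensive uncooled lasers, whose center wavelengths drift with temperature.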
Applying ITU-T G.694.1 and G.694.2 grids across the bands is shown in Figure 8.
The CWDM channel lineup runs across all bands (O through L), while DWDM remains in the C-band. MSOs have used multiple bands and still use 1310, 1550 and 1490 nm for HFC networks today. Fiber deep architectures like Node+0 will use more of the C-band to deliver 40 or more wavelengths.
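A quick back-of-the-envelope check shows why 40-plus wavelengths fit in the C-band alone. This sketch assumes the commonly cited C-band edges of 1530-1565 nm and 100 GHz channel spacing:

```python
C_M_PER_S = 299_792_458  # speed of light in vacuum, m/s

def nm_to_thz(wl_nm):
    """Convert a wavelength in nm to frequency in THz."""
    return C_M_PER_S / (wl_nm * 1e-9) / 1e12

# C-band edges (commonly cited): 1530 to 1565 nm
width_thz = nm_to_thz(1530) - nm_to_thz(1565)
channels = int(width_thz * 1000 // 100)  # how many 100 GHz slots fit
print(f"{width_thz:.2f} THz of C-band -> {channels} channels at 100 GHz")
```

Tighter spacings (50, 25 or 12.5 GHz) multiply that count accordingly, which is how high-channel-count DWDM systems are built.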
The advantages of using more DWDM include:
- Maximum-capacity systems; up to 160 channels across the C+L bands
- Maximum distance with optical amplifiers; extended distances can ease headend consolidation
- Amplification instead of regeneration; no optical-to-electrical-to-optical signal processing

The challenges include:
- Optically amplified systems can introduce nonlinear transmission effects
- Higher-performance lasers and optical components are required
- More power is needed per wavelength.
With this background information in hand, you’re now ready to consider WDM component technology and specification parameters, multiwavelength transmission choices and impairment factors, and network architecture and cabling strategies.
David Kozischek is an applications marketing manager at Corning Optical Communications.