Keep your Cool in the Headend

May 16, 2017
There are lessons to be learned about headend cooling from experiences and practices already in place in data centers. Corning Optical Communications presented a case study at the recent Energy 2020 plenary session hosted by the SCTE/ISBE.

Corning outlined steps that can be taken to reduce power consumption in the headend, which accounts for 3-7% of an operator's total energy use. The first step is to examine the efficiency of the servers in use, since they account for much of the equipment's power draw. While a data center might have 10,000+ servers, a headend is more on the scale of 800-1,000.

"With older models … it is like changing out the heating and air conditioning at home. If you have an older, less efficient unit, you want to replace it with a more efficient unit," said Jason Morris, North American marketing specialist, Corning Optical.

Management support is crucial. "You are not going to be able to do energy conservation and save money on power consumption without the support of high-end leadership," Morris said.

The focus of Energy 2020 highlights the similarities between data centers and headends as both move closer to the customer. "Management techniques that have been applied well in the data center are starting to migrate toward customers and headends. This is the nature of progression and the sweet spot of 2020: to rally operators to focus on moving energy savings to the customer," said Derek DiGiacomo, the SCTE's senior director, information systems and energy management program.

Fiber is Corning Optical's wheelhouse, and the company has been working to help data centers reduce heat by encouraging them to swap out heat-producing copper for fiber, a change that also saves physical space, Morris said. A single MPO connector carries 12 fibers in one plug.

"Copper cables require power to run them, but fiber only requires light. Another effect of reducing the footprint is better airflow," Morris said. "(Another) problem the data centers we support had outside of the physical footprint not being large enough to support their (business) is the limitation of power they can receive from the utility company."

The cost of removing heat is a direct part of the cost of operating a data center or headend. For every unit of electric power consumed, roughly another unit must be spent on cooling. This means for every dollar you use to run the equipment, you are spending another dollar to remove the heat it produces.

"It is a very easy way to do a quick analysis," Morris said. "If you are not paying for power to run copper, it reduces the operating cost on a one-to-one ratio."
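That rule of thumb can be sketched as a quick back-of-the-envelope calculation. The function name, the ratio parameter, and the dollar figures below are illustrative assumptions, not figures from Corning or the SCTE:

```python
def total_operating_cost(it_power_cost, cooling_ratio=1.0):
    """Estimate total operating cost from equipment power cost.

    cooling_ratio=1.0 models the one-to-one rule of thumb quoted
    above: one unit of cooling cost for every unit of electric
    power consumed by the equipment.
    """
    return it_power_cost * (1 + cooling_ratio)

# Illustrative example: if equipment power costs $100,000 a year,
# the rule of thumb puts roughly another $100,000 toward cooling.
annual_it_cost = 100_000
print(total_operating_cost(annual_it_cost))  # 200000.0
```

The same sketch shows why cutting copper out of the plant pays twice: every dollar of equipment power removed also removes its matching cooling dollar, the one-to-one reduction Morris describes.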

While many data centers were built with cooling in mind, headends were not. Hot and cold aisles are another way to reduce associated costs. "Cost savings over time will pay for the move of getting the layout more like a data center," Morris said.

Corning also made clear that it wants to build more relationships with MSO customers. "(We want to) make sure we are communicating what the industry needs, not what we think it needs. We want to partner with SCTE and anyone else who wants to reduce energy," Morris said.