by TIP News

Background

Optical networks need to be operated with sufficient Signal Quality Margin to enable reliable data transport during the lifetime of the equipment. A detailed discussion of margins can be found in (Design of low-margin optical networks, Jan. 2017), from which Figure 1 is taken:

Figure 1  Types of margins and their evolution.

The example describes how margins get squeezed over time, from a scarcely used greenfield Beginning-of-Life network to a heavily utilized mature network at its End-of-Life. In a properly working network, when reading the SNR from transponders we would expect its value to be at or above the top-line margin. However, it is often difficult to decide whether a given measurement value is accurate and whether sufficient margin is still present to continue operation until the planned EOL. For example: if a new transponder pair is installed towards the end of the system lifetime but shows only an actual margin of 9 dB (see Figure 1), it may be considered insufficient, because the transponder end-of-life margin cannot be kept. If, however, an aged transponder pair is installed in the same system with the same 9 dB margin, it could be considered workable, since it has already used up its end-of-life margin.
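As a simple illustration of this bookkeeping, the sketch below compares a measured margin against the allocations that still have to be reserved for future degradation; all dB figures are hypothetical placeholders, not values taken from the cited paper.

# Minimal margin-bookkeeping sketch. All dB figures are hypothetical
# placeholders, not values from the cited paper.

def remaining_margin(measured_snr_db: float,
                     required_snr_db: float,
                     future_allocations_db: float) -> float:
    """Margin left after reserving allocations that are still to be consumed
    (e.g. transponder end-of-life aging that has not happened yet)."""
    actual_margin = measured_snr_db - required_snr_db
    return actual_margin - future_allocations_db


# New transponder pair near system EOL: its own aging margin is still unused.
print(remaining_margin(measured_snr_db=22.0, required_snr_db=13.0,
                       future_allocations_db=1.5))   # 7.5 dB of the 9 dB usable

# Aged transponder pair in the same system: aging margin already consumed.
print(remaining_margin(measured_snr_db=22.0, required_snr_db=13.0,
                       future_allocations_db=0.0))   # full 9 dB usable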

 

Optical Network Observer

We define an Optical Network Observer (ONO) as an application that uses available information about the optical network to calculate a target GSNR for each wavelength service present in the network. The difficulty lies in the availability and accuracy of that information.

Figure 2  Parameter Quality (example)

Figure 2 describes the situation in a typical optical network. Some fiber parameters, e.g. dispersion, may be available from specifications or from measurements recorded during installation. Other parameters, such as attenuation, can change over time, and measurements are rarely up to date.

Where specifications or pre-installation sample measurements are available, it is possible to obtain basic information. However, specifications by nature define the worst case and must include margins that are not disclosed. Sample measurements, in contrast, provide only a snapshot of a single entity and can predict neither sample-to-sample variations nor actual EOL margins. Other parameters, such as topology and resource usage, can be retrieved on demand from the live optical network. Compiling these parameters of vastly different quality into a single model to predict a target Quality of Transmission (QoT) is challenging.
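One possible way to handle this heterogeneity is to tag every input parameter with its origin and uncertainty before it enters the QoT model. The sketch below only illustrates such a data structure; the class names, source categories and numbers are assumptions, not part of any defined ONO interface.

# Sketch of how parameters of very different quality could be tagged before
# being fed into a QoT model. Names and categories are assumptions.
from dataclasses import dataclass
from enum import Enum


class Source(Enum):
    SPECIFICATION = "specification"        # worst case, undisclosed margins
    SAMPLE_MEASUREMENT = "sample"          # snapshot of a single entity
    LIVE_NETWORK = "live"                  # retrieved on demand from the network


@dataclass
class Parameter:
    name: str
    value: float
    unit: str
    source: Source
    uncertainty: float                     # +/- range in the same unit


span_loss = Parameter("span_loss", 21.0, "dB", Source.SAMPLE_MEASUREMENT, 1.5)
dispersion = Parameter("dispersion", 16.7, "ps/nm/km", Source.SPECIFICATION, 0.5)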

Today, operators want to understand whether their optical network is performing according to expectation. This is a non-trivial problem, since QoT depends on spectral occupation and amplifier settings, both of which are very different from those of a full-load scenario. Operators therefore have no reference point against which to compare an actually measured QoT.

 

The Optical Network Observer

  • The Observer is an application add-on to open, disaggregated optical systems that visualizes the QoT reserves of each wavelength.
  • The purpose of an ONO is to calculate QoT references for all provisioned wavelengths based on the actual spectral occupation of the optical network.
  • The ONO is an application that can either be triggered manually or run in the background, reacting to changes in the spectral occupation as a trigger for updating QoT references (see the sketch after this list).
  • The ONO shall obtain actual parameter information either directly from the network nodes (TXP, ROADM, AMP, online OTDR) or via a deployed optical network controller and available planning data.
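A minimal sketch of such a trigger loop is shown below, with hypothetical stub functions standing in for the real node/controller integrations.

# Minimal sketch of an ONO trigger loop; the data-access functions are
# trivial stubs standing in for node/controller integrations (hypothetical).
import time


def fetch_spectral_occupation() -> dict:
    """Stub: would query TXP/ROADM/AMP nodes or the optical network controller."""
    return {"channels": []}


def compute_qot_references(occupation: dict) -> dict:
    """Stub: would run the QoT model for the given spectral occupation."""
    return {ch: None for ch in occupation["channels"]}


def run_observer(poll_interval_s: float = 60.0, max_cycles: int = 3) -> None:
    """React to changes in spectral occupation as the trigger for updating
    QoT references; bounded to a few cycles here so the sketch terminates."""
    last_occupation = None
    for _ in range(max_cycles):
        occupation = fetch_spectral_occupation()
        if occupation != last_occupation:          # occupation change = trigger
            references = compute_qot_references(occupation)
            print("updated QoT references:", references)
            last_occupation = occupation
        time.sleep(poll_interval_s)


if __name__ == "__main__":
    run_observer(poll_interval_s=0.1)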

 

 

Use Cases

 

QoT reference calculation

The ONO shall use data extracted from the network to calculate QoT estimates for every provisioned service. The availability and quality of the input data are expected to vary, and with them the accuracy of the QoT reference. A confidence interval shall therefore be associated with each QoT reference.
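As an illustration of how such a confidence interval could be attached, the sketch below widens the interval with the combined uncertainty of the input data; the combination rule and all numbers are assumptions for illustration only.

# Sketch: attach a confidence interval to each QoT reference, widening it
# with the uncertainty of the underlying input data (illustrative rule only).
import math


def qot_reference(gsnr_estimate_db: float,
                  input_uncertainties_db: list[float]) -> tuple[float, float, float]:
    """Return (reference, lower bound, upper bound) in dB, combining the
    individual input uncertainties as if they were independent."""
    half_width = math.sqrt(sum(u ** 2 for u in input_uncertainties_db))
    return gsnr_estimate_db, gsnr_estimate_db - half_width, gsnr_estimate_db + half_width


ref, lo, hi = qot_reference(18.2, [0.4, 0.6, 0.3])   # hypothetical inputs
print(f"QoT reference {ref:.1f} dB, confidence interval [{lo:.1f}, {hi:.1f}] dB")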

 

QoT troubleshooting

If the estimated QoT deviates from the measured QoT, it is often hard to narrow down the root cause. The QoT observer shall provide means to create what-if scenarios to understand how changes in individual parameters would influence QoT targets. By varying these parameters and comparing the QoT results with actual data from the network, a plausible root cause can be constructed, keeping in mind that correlation is not causality.
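A minimal what-if sketch along these lines is shown below; estimate_qot is a hypothetical toy stand-in for a full model run (e.g. with GNPy), and the candidate point losses are arbitrary.

# What-if sketch: vary one uncertain input at a time and rank scenarios by
# how well the predicted QoT matches the measurement.
def estimate_qot(extra_span_loss_db: float) -> float:
    """Stub: toy model where each extra dB of span loss costs ~0.8 dB GSNR."""
    return 20.0 - 0.8 * extra_span_loss_db


def rank_scenarios(measured_qot_db: float, candidate_losses_db: list[float]):
    scenarios = [(abs(estimate_qot(loss) - measured_qot_db), loss)
                 for loss in candidate_losses_db]
    return sorted(scenarios)               # best-matching scenario first


# A 2 dB unexpected point loss best explains the measured 18.4 dB here,
# keeping in mind that such a correlation is not yet a proven cause.
print(rank_scenarios(measured_qot_db=18.4, candidate_losses_db=[0.0, 1.0, 2.0, 3.0]))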

 

Visibility of Margins

Given the actual measurement and a reference QoT, the Observer should be able to visualize margins per connection for both actual QoT and EoL QoT. The assumptions can initially be based on (Design of low-margin optical networks, Jan. 2017) but need to be refined over time based on additional findings during ONO operation.
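A per-connection margin view could be as simple as the sketch below; the connection names and QoT values are illustrative assumptions only.

# Sketch of a per-connection margin view; all values are illustrative.
connections = [
    # (name, measured QoT dB, reference QoT dB, EoL reference QoT dB)
    ("LON-PAR_ch13", 19.4, 18.0, 16.5),
    ("PAR-FRA_ch27", 16.1, 16.0, 14.8),
]

for name, measured, ref, eol_ref in connections:
    print(f"{name}: margin vs reference {measured - ref:+.1f} dB, "
          f"margin vs EoL {measured - eol_ref:+.1f} dB")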

 

Dynamic QoT margins

Optical networks “breathe” and are constantly optimized in control loops. It is therefore normal for the measured QoT to fluctuate over time. Even a service that precisely matches its QoT target would therefore experience fluctuations. This natural fluctuation should be included in the confidence interval calculations to allow filtering out temporary variations.
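One simple way to filter such breathing is to flag a deviation only when it leaves the fluctuation band estimated from recent history, as in the sketch below; the history window and the sigma factor are arbitrary illustrative choices.

# Sketch: treat a deviation as relevant only if it exceeds the natural
# fluctuation band estimated from recent history.
from statistics import mean, stdev


def is_sustained_deviation(history_db: list[float], new_sample_db: float,
                           sigma_factor: float = 3.0) -> bool:
    band = sigma_factor * stdev(history_db)
    return abs(new_sample_db - mean(history_db)) > band


history = [18.1, 18.3, 18.2, 18.0, 18.2, 18.1]        # measured QoT, dB
print(is_sustained_deviation(history, 18.25))          # False: normal breathing
print(is_sustained_deviation(history, 17.2))           # True: investigate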

 

Calibration of Network data

This use case requires varying the model inputs under the assumption that GNPy gives correct predictions if the input data is correct. Varying the input data can help to better understand, e.g., the fiber type, point losses, etc.

We also note that an OTDR cannot measure the first point loss in a fiber, although this is the most important one for QoT prediction. While the use of OTDR certainly helps to improve the quality of the prediction, it cannot eliminate the inaccuracy completely.
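A calibration along these lines could look like the sketch below, which varies a fiber-type assumption and an unresolved point loss and keeps the combination that best reproduces the measured QoT; predict_qot is a toy stand-in, not the GNPy API, and all numbers are illustrative.

# Calibration sketch: vary uncertain inputs (fiber type, unresolved point
# loss near the span start) and keep the combination that best reproduces
# the measured QoT, assuming the model is correct for correct inputs.
from itertools import product

FIBER_PENALTY_DB = {"SSMF": 0.0, "LEAF": 0.6}           # toy per-type penalty


def predict_qot(fiber: str, point_loss_db: float) -> float:
    return 20.0 - FIBER_PENALTY_DB[fiber] - 0.8 * point_loss_db


def calibrate(measured_qot_db: float):
    candidates = product(FIBER_PENALTY_DB, [i * 0.5 for i in range(7)])  # 0..3 dB
    return min(candidates,
               key=lambda c: abs(predict_qot(*c) - measured_qot_db))


print(calibrate(measured_qot_db=18.6))                   # e.g. ('LEAF', 1.0)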

 

Works Cited

Pointurier, Y., “Design of low-margin optical networks,” Journal of Optical Communications and Networking, vol. 9, no. 1, pp. A9-A17, Jan. 2017. doi: 10.1364/JOCN.9.0000A9.