From an operator's point of view, taking optical measurements before connecting up networks is standard practice. Another standard practice is to attenuate the signal so that the receive level sits midway between the high-power alarm threshold and the receiver sensitivity. This way the interconnects are all set up to the same standard across the network, so checking incoming optical power should be a must for all connections.
The most likely minimum attenuation at 1310 nm would be about 0.34 dB/km. Thus, if the overload spec requires (Tx max − Rx overload) >= 6.3 dB of loss, then 18.5 km of fiber is guaranteed to provide at least enough attenuation to prevent overload.
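The arithmetic behind that 18.5 km figure can be sketched as follows; the 0.34 dB/km and 6.3 dB values are the ones quoted above, and the function name is just illustrative.

```python
# Shortest link length that guarantees a required minimum attenuation,
# using the figures quoted in this message (a sketch, not a spec value).
MIN_ATTEN_DB_PER_KM = 0.34   # likely minimum fiber attenuation at 1310 nm
REQUIRED_LOSS_DB = 6.3       # assumed (Tx max - Rx overload) loss requirement

def min_length_km(required_loss_db: float,
                  atten_db_per_km: float = MIN_ATTEN_DB_PER_KM) -> float:
    """Link length at which even the lowest-loss fiber still provides
    at least required_loss_db of attenuation."""
    return required_loss_db / atten_db_per_km

print(round(min_length_km(REQUIRED_LOSS_DB), 1))  # prints 18.5
```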
One possible practice then is to say:
If the link is shorter than 18.5 km, then measure the optical power at the receiver and insert an appropriate attenuator as needed.
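The "measure, then attenuate to midpoint" practice described earlier in this message could be sketched like this; the overload and sensitivity values here are hypothetical placeholders, since real numbers come from the module datasheet.

```python
# Sketch of attenuator selection: place the receive level midway between
# the receiver's overload level and its sensitivity. The two limits below
# are HYPOTHETICAL examples, not values from any 40G spec.
RX_OVERLOAD_DBM = -4.5       # hypothetical maximum average receive power
RX_SENSITIVITY_DBM = -21.2   # hypothetical receiver sensitivity

def attenuator_needed_db(measured_rx_dbm: float) -> float:
    """Attenuation (dB) that moves the measured receive power to the
    midpoint of the receiver's operating window; 0 if already at or
    below the midpoint."""
    target_dbm = (RX_OVERLOAD_DBM + RX_SENSITIVITY_DBM) / 2.0
    return max(0.0, measured_rx_dbm - target_dbm)

# Example: a measured -5.0 dBm receive level against a -12.85 dBm target
print(round(attenuator_needed_db(-5.0), 2))  # prints 7.85
```

In practice one would round up to the nearest standard fixed attenuator value rather than use the exact figure.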
We may also need to address the case of overload damage; obviously it is best if the receiver's overload damage level is >= the maximum possible transmitter power.
While this is not necessarily how 40G is handled, it was a recommendation in non-Ethernet cases.
During the ad hoc last week, I presented a strawman for 40 km with a minimum channel insertion loss of 10 dB and a maximum of 18 dB. One suggestion, which I think came from Mike Dudek, was to change the minimum loss from 10 dB to 6.3 dB so that there would be a continuous range of solutions, with the 10 km part covering losses of 0 to 6.3 dB and the 40 km part covering 6.3 to 18 dB.
Without actual values for Tx maximum launch power and Rx overload power, I can't say yet whether 10 dB or 6.3 dB are feasible numbers. However, I wanted to put forth the idea that the 40 km module with an APD receiver will have a maximum Rx input power tolerance that is lower than the maximum Tx output power. Therefore, there will always be a need to make sure that the Rx input is attenuated sufficiently before plugging the fiber into the receiver. This is particularly true for lab testing, where it is common practice to use an optical loopback from Tx output to Rx input in order to check out the Rx output electrical signal.
Are there use cases for normal deployment where it would be necessary to measure the power level in the incoming fiber, or can we always rely on the channel having at least some number of dB of loss?
I was referred to the case of 40GBASE-ER4, which has a maximum average receive power of -4.5 dBm but a maximum average transmitter launch power of +4.5 dBm. I assume this means we could need up to 9 dB of loss for proper operation. Can someone provide examples of how this is dealt with in practical deployments of 40GBASE-ER4 modules?
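For concreteness, the 9 dB figure above is just the difference between the two quoted ER4 limits:

```python
# Worked check of the 40GBASE-ER4 numbers quoted above: the channel must
# absorb the gap between maximum launch power and maximum receive power.
TX_MAX_LAUNCH_DBM = 4.5   # maximum average transmitter launch power
RX_MAX_AVG_DBM = -4.5     # maximum average receive power

required_loss_db = TX_MAX_LAUNCH_DBM - RX_MAX_AVG_DBM
print(required_loss_db)   # prints 9.0
```

So a worst-case transmitter into a worst-case receiver needs at least 9 dB of channel loss (or an attenuator making up the shortfall) to stay within the receive power limit.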