During the ad hoc last week, I presented a strawman for 40km with a minimum channel insertion loss of 10 dB and a maximum of 18 dB. One suggestion, which I think came from Mike Dudek, was to change the minimum loss from 10 dB to 6.3 dB so that there would be a continuous range of solutions, with the 10km part covering losses of 0 to 6.3 dB and the 40km part covering 6.3 to 18 dB.
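To make the coverage rule concrete, here is a minimal Python sketch of how a deployment might pick a part from a measured channel loss, using the 6.3 dB boundary under discussion (the function name and range checks are mine, not spec text):

# Hypothetical selector for the continuous-coverage proposal:
# the 10km part covers 0 to 6.3 dB, the 40km part 6.3 to 18 dB.
def select_pmd(channel_loss_db: float) -> str:
    if 0.0 <= channel_loss_db <= 6.3:
        return "10km part"
    if 6.3 < channel_loss_db <= 18.0:
        return "40km part"
    return "out of range"

print(select_pmd(5.0))   # -> 10km part
print(select_pmd(12.0))  # -> 40km part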
Without actual values for the Tx maximum launch power and the Rx overload power, I can’t yet say whether 10 dB or 6.3 dB is feasible. However, I wanted to put forth the idea that the 40km module with an APD receiver will have a maximum Rx input power tolerance that is lower than the maximum Tx output power. Therefore, there will always be a need to make sure that the Rx input is attenuated sufficiently before plugging the fiber into the receiver. This is particularly true for lab testing, where it is common practice to use an optical loopback from the Tx output to the Rx input in order to check out the Rx output electrical signal.
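To put numbers on the overload concern, here is a minimal sketch of the attenuation arithmetic for a direct Tx-to-Rx loopback; the values in the example are placeholders, since the actual Tx maximum launch power and APD overload level are not yet settled:

# Minimum attenuation needed so that a worst-case Tx cannot
# overload the Rx when looped directly back on itself.
def min_loopback_attenuation_db(tx_max_launch_dbm: float,
                                rx_max_input_dbm: float) -> float:
    # Worst-case Tx power minus the highest power the Rx tolerates.
    return max(0.0, tx_max_launch_dbm - rx_max_input_dbm)

# Hypothetical example: Tx up to +5.0 dBm, APD overload at -5.0 dBm.
print(min_loopback_attenuation_db(5.0, -5.0))  # -> 10.0 dB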
Are there use cases for normal deployment where it would be necessary to measure the power level in the incoming fiber, or can we always rely on the channel having at least some number of dB of loss?
I was referred to the case of 40GBASE-ER4, which has a maximum average receive power of -4.5 dBm but a maximum average transmitter launch power of +4.5 dBm. I assume this means we could need up to 9 dB of loss for proper operation. Can someone provide examples of how this is dealt with in practical deployments of 40GBASE-ER4 modules?
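(For reference, the 9 dB figure above is just the subtraction of the two limits quoted from the 40GBASE-ER4 spec; a worked sketch, not spec text.)

tx_max_dbm = 4.5    # maximum average transmitter launch power
rx_max_dbm = -4.5   # maximum average receive power
print(tx_max_dbm - rx_max_dbm)  # -> 9.0 dB needed between Tx and Rx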