
Re: [802.3_25GSMF] Minimum channel insertion loss for 25 Gb/s 40km



David,

The minimum allowable loss approach has worked in my experience. It does not depend on the channel length, only on the measured attenuation. The loss minimum can be reached with a lot of connector loss, fiber attenuation, or a combination of both.

 

While Tom’s suggestion of a minimum cable length provides a handy “rule of thumb”, it would leave a gap between the 10km LR reach and the minimum ER cable length, at least in perception.

 

In practice, if the loss is below the minimum, an in-line attenuator chosen to bring the loss into compliance is applied at a patch panel.
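
For illustration, here is a minimal Python sketch of that attenuator choice. The 6.3 dB minimum, 18 dB maximum, and the catalogue of fixed attenuator values are assumptions for illustration only, not settled numbers for this project:

```python
# Hypothetical sketch of in-line attenuator selection at a patch panel.
# The loss limits and attenuator catalogue below are illustrative
# assumptions, not values from the draft.
MIN_CHANNEL_LOSS_DB = 6.3                          # assumed spec minimum insertion loss
MAX_CHANNEL_LOSS_DB = 18.0                         # assumed spec maximum insertion loss
ATTENUATOR_STEPS_DB = [1.0, 2.0, 3.0, 5.0, 10.0]   # assumed fixed-attenuator catalogue

def pick_attenuator(measured_loss_db: float) -> float | None:
    """Smallest catalogued attenuator that brings the channel into compliance."""
    need = MIN_CHANNEL_LOSS_DB - measured_loss_db
    if need <= 0.0:
        return None  # channel already meets the minimum; no attenuator needed
    for step in ATTENUATOR_STEPS_DB:
        # Stay within the assumed maximum loss after padding.
        if step >= need and measured_loss_db + step <= MAX_CHANNEL_LOSS_DB:
            return step
    raise ValueError("no single attenuator fits; cascade two or re-measure")

print(pick_attenuator(2.0))  # 2.0 dB channel -> needs >= 4.3 dB more -> 5.0 dB pad
```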

 

Regards,

Paul

 

From: Mcdermott, Thomas [mailto:tom.mcdermott@xxxxxxxxxxxxxx]
Sent: Monday, February 29, 2016 5:23 PM
To: STDS-802-3-25GSMF@xxxxxxxxxxxxxxxxx
Subject: Re: [802.3_25GSMF] Minimum channel insertion loss for 25 Gb/s 40km

 

Hi Mike,

 

The most likely minimum attenuation at 1310 nm would be about 0.34 dB/km.  Thus, if the overload spec requires (Tx − Rx) >= 6.3 dB of loss, then a link of 18.5 km (6.3 dB / 0.34 dB/km) is guaranteed to provide at least enough attenuation to prevent overload.
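
The same arithmetic as a small Python sketch (connector losses ignored, which is the conservative direction for the overload question):

```python
# Worked arithmetic behind the 18.5 km figure, using the best-case
# fibre attenuation quoted above.  Connector losses are ignored here,
# which is conservative for overload.
MIN_FIBER_ATTEN_DB_PER_KM = 0.34   # likely minimum attenuation at 1310 nm
REQUIRED_LOSS_DB = 6.3             # loss needed to stay below receiver overload

min_safe_length_km = REQUIRED_LOSS_DB / MIN_FIBER_ATTEN_DB_PER_KM
print(f"{min_safe_length_km:.1f} km")   # -> 18.5 km
```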

 

One possible practice, then, is to say:

 

If the link is shorter than 18.5 km, then measure the optical power at the receiver and insert an appropriate attenuator if/as needed.
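
A sketch of that short-link procedure, with an assumed −4.5 dBm overload limit and a 0.5 dB margin (both placeholders, not proposed spec values):

```python
# Sketch of the short-link check described above: measure the receive
# power and pad it down if it exceeds the overload limit.  The overload
# level and margin are illustrative assumptions.
RX_OVERLOAD_DBM = -4.5   # assumed maximum average receive power

def attenuation_needed_db(measured_rx_dbm: float, margin_db: float = 0.5) -> float:
    """Attenuation (dB) needed to pad the receiver below overload, with margin."""
    excess = measured_rx_dbm - (RX_OVERLOAD_DBM - margin_db)
    return max(0.0, excess)

print(attenuation_needed_db(1.0))   # +1.0 dBm measured -> pad by 6.0 dB
```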

 

We may also need to address the case of overload damage; obviously it is best if the receiver overload damage level is >= the maximum possible transmitter power.

 

While it’s not necessarily how 40G is handled, it was a recommendation in non-Ethernet cases.

 

-- Tom

 

 

From: David Lewis [mailto:David.Lewis@xxxxxxxxxxxx]
Sent: Monday, February 29, 2016 1:13 PM
To: STDS-802-3-25GSMF@xxxxxxxxxxxxxxxxx
Subject: [802.3_25GSMF] Minimum channel insertion loss for 25 Gb/s 40km

 

All,

 

During the adhoc last week, I presented a strawman for 40km with a minimum channel insertion loss of 10 dB and a maximum of 18 dB.  One suggestion, which I think came from Mike Dudek, was to change the minimum loss from 10 dB to 6.3 dB so that there would be a continuous range of solutions, with the 10km part covering losses of 0 to 6.3 dB and the 40km part covering 6.3 to 18 dB.
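
To make the continuous-coverage idea concrete, a toy Python sketch of which reach variant covers a given channel loss under those strawman numbers (they are under discussion, not adopted):

```python
# Toy illustration of the "continuous range of solutions" suggestion:
# the 10 km part covers 0-6.3 dB and the 40 km part covers 6.3-18 dB.
# These are the strawman numbers under discussion, not adopted values.
def pick_variant(channel_loss_db: float) -> str:
    if 0.0 <= channel_loss_db <= 6.3:
        return "10 km part"
    if 6.3 < channel_loss_db <= 18.0:
        return "40 km part"
    return "out of range: attenuate or re-engineer the channel"

print(pick_variant(4.0))    # -> 10 km part
print(pick_variant(12.0))   # -> 40 km part
```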

 

Without actual values for Tx maximum launch power and Rx overload power, I can’t say yet whether 10 dB or 6.3 dB are feasible numbers.  However, I wanted to put forth the idea that the 40km module with an APD receiver will have a maximum Rx input power tolerance that is lower than the maximum Tx output power.  Therefore, there will always be a need to make sure that the Rx input is attenuated sufficiently before plugging the fiber into the receiver.  This is particularly true for lab testing, where it is common practice to use an optical loopback from Tx output to Rx input in order to check out the Rx output electrical signal.

 

Are there use cases for normal deployment where it would be necessary to measure the power level in the incoming fiber, or can we always rely on the channel having at least some number of dB of loss?

 

I was referred to the case of 40GBASE-ER4, which has a maximum average receive power of -4.5 dBm but a maximum average transmitter launch power of +4.5 dBm.  I assume this means we could need up to 9 dB loss for proper operation.  Can someone provide examples of how this is dealt with in practical deployments of 40GBASE-ER4 modules?
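
For reference, the arithmetic behind that 9 dB figure is just the difference of the two limits quoted above:

```python
# Arithmetic behind the "up to 9 dB" estimate for 40GBASE-ER4,
# using the limits quoted above.
MAX_TX_LAUNCH_DBM = 4.5    # maximum average transmitter launch power
MAX_RX_INPUT_DBM = -4.5    # maximum average receive power

worst_case_required_loss_db = MAX_TX_LAUNCH_DBM - MAX_RX_INPUT_DBM
print(worst_case_required_loss_db)   # -> 9.0 dB
```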

 

Thanks,

David Lewis