
Re: XAUI jitter tolerance




Hi Allan


Allan Liu wrote:
> 
> Hi Mike,
> 
> I agree with the general premise of your first point.  The jitter
> tolerance mask is a minimum spec, not a maximum spec.  So yes, a
> manufacturer can indeed set the receiver bandwidth as high as desired
> without violating the spec.  Many people have indicated during the
> Austin meeting as well as on the reflector that today's receivers are
> operating with bandwidths in the 3-5MHz range or higher.  So if that's

During the Austin meeting, the majority of people in the jitter subgroup
stated that they do not want to make the change.  I don't disagree with
your statement that the majority of receivers have a bandwidth higher
than the MJS baud rate/1667.  The baud rate/1667 corner is entrenched
today: InfiniBand has already defined it, and some SerDes will operate
at dual speed.  Changing the bandwidth this late will impact some
SerDes.

Historically, some FC/GbE SerDes had high-bandwidth receivers, often
to overcome their high jitter peaking.  I am not sure that is the
best approach anyway.


> the case, why not change the spec to reflect what people are doing? If
> today's receivers are able to track all that low frequency jitter, why
> do we want to unnecessarily force the transmitter designers to
> over-design their circuits? The bottom line is that if receivers are able to
> track low frequency jitter, the transmitters should be able to transmit
> the low frequency jitter without affecting system performance.  Agilent
> is not proposing anything drastic or exclusionary.  Given that a
> receiver with a higher bandwidth performs better than one with a lower
> bandwidth, people will naturally try to implement receivers with as high
> a bandwidth as feasible.  And from our experience as well as from what
> other people in the industry are saying, a reasonable bandwidth for
> receivers today is 3-5MHz, which is what Agilent is proposing to move
> the "break" frequency to.  This will not exclude anybody's design and it
> will certainly ease the design of transmitters, which is good for
> everybody, old players as well as young startups.  In addition, it'll
> increase the level of integration possible, which again is good for the
> industry.  I really do not see any downside to this proposal,
> technically or economically.

As a designer, you can still design your SerDes for whatever bandwidth
you wish.  Increasing the break frequency only relaxes jittery transmitters
and effectively reduces user margin, assuming most receivers have higher
bandwidth.  I don't see how the level of integration can be increased
substantially by doubling the frequency.
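
To put a number on the "relaxes jittery transmitters" point, here is a
small Python sketch of the sinusoidal jitter tolerance mask at the two
corner frequencies under discussion.  It is illustrative only; the
0.10 UI floor and the 20 dB/decade slope are the ones described in the
MJS derivation quoted below, and the 3.75 MHz corner simply stands in
for the proposed 3-5 MHz range.

    # SJ tolerance mask: flat at the 0.10 UI floor above the corner
    # frequency, rising at 20 dB/decade (i.e., as 1/f) below it.
    def sj_mask_ui(f_hz, corner_hz, floor_ui=0.10):
        return floor_ui if f_hz >= corner_hz else floor_ui * corner_hz / f_hz

    for corner_hz in (1.875e6, 3.75e6):   # baud/1667 today vs. a doubled corner
        print(f"corner {corner_hz/1e6:.3f} MHz -> "
              f"{sj_mask_ui(100e3, corner_hz):.3f} UI of SJ at 100 kHz")
    # Moving the corner up means a compliant receiver must track
    # proportionally more low-frequency jitter at any given frequency
    # below the corner, and a transmitter may correspondingly emit more.
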
> 
> I disagree with your second point.  If you look at Annex G of the Fibre
> Channel MJS document, you'll find the derivation of Fc/1667.  Sonet
> defines the upper corner frequency to be baud rate divided by 2500.  It
> also specifies that the amplitude of the sinusoidal jitter to be applied
> above this frequency is .15UI.  And below this frequency, the applied
> jitter shall increase at 20dB/decade.  Fibre Channel specifies that for
> high frequencies, the amplitude of the sinusoidal jitter shall be
> .10UI.  If you extend the line with the slope 20dB/decade until you hit
> the .10UI horizontal line, you'll get Fc/1667.  This is more clearly
> documented in the MJS document (Rev 10, 6/9/1999), which can be obtained at
> www.t11.org.  A similar calculation to the one you showed is also
> provided in the MJS document.  However, that calculation is a check of,
> not a basis of, the break frequency numbers.  If I recall correctly, the
> annex showed that with the "break" frequencies as chosen, the 100ppm
> constraint on the RefClk will not violate them.  And if you look at
> Agilent's presentation from the Austin meeting, making the change will
> not violate the 100ppm constraint on RefClk either.
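
As a quick numerical check of the extrapolation described above (a
sketch only; the .15 UI, .10 UI and baud/2500 numbers are the ones
quoted above, and 3.125 Gbps is the XAUI baud rate used elsewhere in
this thread):

    # 20 dB/decade means amplitude ~ 1/f, so A1*f1 = A2*f2 along the slope.
    baud = 3.125e9                    # XAUI baud rate
    f_sonet = baud / 2500             # Sonet corner: .15 UI applies above this
    f_fc = f_sonet * (0.15 / 0.10)    # where the extended slope meets .10 UI
    print(baud / f_fc)                # ~1666.7, i.e. the familiar baud/1667
    print(f_fc / 1e6, "MHz")          # ~1.875 MHz at 3.125 Gbps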

About 3 years ago I also proposed to FC MJS to increase the bandwidth,
for the same reason you are proposing it today.  The MJS group did not
accept my proposal, and I went back and fixed my jittery transmitter.
If we start making changes for comfort, I am not sure where we will end up.

Is the specification broken?

Is IEEE D2 accepted?

Should we make a change now?

Thanks,

Ali Ghiasi

Broadcom

> 
> Regards,
> 
> -Allan
> 
> Mike Jenkins wrote:
> >
> > Allan,
> >
> > Nothing in the presently proposed jitter spec (1.8 MHz) prohibits
> > any manufacturer from setting the receiver bandwidth as high as
> > he wants.  It only prohibits bandwidths below that value.  We don't
> > need to increase the "break" frequency.  The only thing that would
> > do is legalize transmitters with worse low frequency jitter.
> >
> > FWIW, here's the rationale used in Fibre Channel for the Fc/1667
> > "break" frequency spec:  It's based on a ref clock's low frequency
> > jitter and "wander".  This is effectively limited by the frequency
> > tolerance spec (say 100 ppm).  The worst case is that the frequency
> > slams back and forth between these extremes:
> >
> >         F = Fref * (1 +/- 0.0001)               cycles/sec
> >
> > The resulting phase error (the integral of frequency error) is:
> >
> >         (Tm/2) * 0.0001 * Fref          cycles of the REFCLK
> >
> > where Tm = 1/Fm, and Fm is the rate at which the REFCLK frequency
> > slams back and forth between the 100 ppm limits.
> >
> > If the REFCLK is 1/N of the bit rate (N is generally 10 or 20),
> > then this peak-peak jitter, as a fraction of the bit period, is
> >
> >         (Tm/2) * 0.0001 * Fref * N      cycles of the bit-rate clock
> >
> > For example, at a modulation rate that is 1/1667 of the bit rate
> > (1.875 MHz for 3.125 Gbps), the jitter is 0.083 of the 320 ps baud.
> > At higher modulation rates, the jitter is proportionally smaller,
> > and at lower modulation rates, the jitter is proportionally larger.
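
(Plugging the numbers from the example above into that last expression,
as a quick sketch only; the 3.125 Gbps rate, the 100 ppm tolerance and
N come from the quoted text, and the 3.75 MHz case is just a doubled
corner for comparison:)

    # Peak-to-peak wander, in UI, for a REFCLK slamming between +/-100 ppm
    # at modulation rate Fm:  (Tm/2) * 0.0001 * Fref * N
    bit_rate = 3.125e9            # XAUI baud rate (from the example above)
    ppm = 100e-6                  # REFCLK frequency tolerance
    n = 20                        # REFCLK = bit rate / N (10 or 20; it cancels)
    fref = bit_rate / n

    for fm in (1.875e6, 3.75e6):  # baud/1667 and a doubled corner
        tm = 1.0 / fm
        jitter_ui = (tm / 2) * ppm * fref * n
        print(f"Fm = {fm/1e6:.3f} MHz -> {jitter_ui:.3f} UI peak-to-peak")
    # -> 0.083 UI at 1.875 MHz and half of that at 3.75 MHz, as stated above.
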
> >
> > Regards,
> > Mike
> >
> > Allan Liu wrote:
> > >
> > > Hi,
> > >
> > > A jitter specification has been proposed by Ali Ghiasi of Broadcom for
> > > XAUI.  At the December Jitter Meeting in Austin, Texas, Agilent made a
> > > proposal for an improvement.  Those presentations, as well as a summary
> > > of the meeting by Anthony Sanders (facilitator of the XAUI Jitter Ad-Hoc)
> > > are available at:
> > > http://www.ieee802.org/3/ae/public/adhoc/serial_pmd/documents/index.html
> > >
> > > Our proposal is to increase the "break" frequency of the jitter
> > > tolerance mask.  Currently, the proposal puts that "break" frequency at
> > > 1.8MHz.  We are proposing to push it higher, to 3-5MHz.
> > > There are four reasons why we believe the time is ripe for making this
> > > change:
> > >
> > > 1) The current "break" frequency was derived from Fibre Channel which
> > > derived it from Sonet; it's equal to the fundamental frequency divided
> > > by 1667.  I have looked long and hard and I have not found any
> > > documentation as to why this number was picked.  If anybody has an
> > > explanation, there is a massive group of people waiting to hear the
> > > answer, including myself.  In the meanwhile, let me just repeat what I
> > > have heard from different people; this is in line with what Larry Devito
> > > of Analog Devices has posted to the reflector before.  Current wisdom
> > > has it that this number had to do with old SAW filter technology that
> > > was used at the time Sonet was created.  In addition, Sonet had to
> > > contend with the inherent problems with using regenerators in the system
> > > and thus had to make their jitter specs more stringent.  And to be
> > > compatible with older systems, today's Sonet systems are designed to the
> > > same old spec.  Fibre Channel comes along and copies this spec from
> > > Sonet.  Infiniband comes along and copies it from Fibre Channel.  And
> > > now, XAUI comes along and also wants to copy it from Fibre Channel.  And
> > > nobody knows why! XAUI is brand new and does not carry any old baggage.
> > > We have a chance to do it right and to write the specification to
> > > reflect current technologies and current implementations.
> > >
> > > 2) Today's Fibre Channel systems use receivers with a much higher
> > > bandwidth.  My measurements show that they are in the 3-5MHz range.
> > > During the December meeting, Jeff Cain of Cisco said their 1G Ethernet
> > > systems are using SerDes with bandwidths in the same range.  And all
> > > these systems are working perfectly.  So why do we continue to limit
> > > ourselves to a legacy specification that everybody exceeds? We should
> > > write the XAUI spec to reflect what people are implementing and what
> > > makes sense.  And from my understanding of how receivers work, a higher
> > > bandwidth equals better performance.
> > >
> > > 3) Other technologies do not necessarily have the same aggressive cost
> > > structure that Ethernet historically has had.  XAUI is supposed to be a low
> > > cost interface to connect the optics to the MAC.  At 3.125G, it is not
> > > easy to build a functional system.  Increasing the "break" frequency
> > > means that the receiver is able to track more jitter.  This will make it
> > > easier for everybody to meet the specification and produce a functional
> > > system.  And of course, this will drive down the cost of each port.  But
> > > the crucial point is not cost, but that increasing the break frequency
> > > will NOT impact system performance in any negative way.
> > >
> > > 4) Increasing the "break" frequency will also make it easier and cheaper
> > > for the integration of XAUI into bigger chips, like MACs.  Integration
> > > can mean 1 channel of XAUI or 100 channels of XAUI.  Obviously the
> > > current generation of XAUI will be discrete, but from the days of HARI,
> > > integration has been the goal.  We must not forget this goal as we move
> > > forward with the specification.  And again, this goes back to the point
> > > about XAUI being a low cost interface.  Increasing the bandwidth means
> > > the ability to use a smaller filtering cap in the PLL, which means a
> > > higher level of integration can be achieved.
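
To make the filtering-cap argument concrete, here is a rough first-order
sketch for a conventional second-order charge-pump PLL.  All component
values, the divide ratio and the helper itself are illustrative
assumptions on my part, not numbers from any spec or from the text above.

    import math

    # Very rough sizing of the series loop-filter cap in a second-order
    # charge-pump PLL: with crossover wc = Icp*Kvco*R/N (Kvco in Hz/V) and
    # the zero 1/(R*C) placed a factor gamma below wc for phase margin,
    # eliminating R gives C = gamma*Icp*Kvco/(N*wc^2), i.e. C scales as 1/wc^2.
    def loop_cap_farads(bw_hz, icp_a=100e-6, kvco_hz_per_v=1e9, n=20, gamma=4.0):
        wc = 2 * math.pi * bw_hz
        return gamma * icp_a * kvco_hz_per_v / (n * wc ** 2)

    for bw in (1.875e6, 3.75e6):
        print(f"{bw/1e6:.3f} MHz loop bandwidth -> {loop_cap_farads(bw)*1e12:.0f} pF")
    # Doubling the loop bandwidth cuts the cap roughly 4x; the absolute
    # values depend entirely on the assumed Icp, Kvco and N above.
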
> > >
> > > I believe the result from the straw poll during the December meeting is
> > > short-sighted.  There is nothing scary or dangerous about this change.
> > > It is certainly a logical and much needed change to reflect what our
> > > technology can do and what the market is producing.  There is no need to
> > > copy other standards, especially if those standards have no technical
> > > basis for their jitter tolerance numbers.  However, there are clear
> > > reasons, as listed above, as to why these numbers need to be changed.
> > > Any claims that XAUI can leverage from this standard and that standard
> > > are unfounded.  There are no standards out there that have the same
> > > goals as XAUI or the same fundamental frequency as XAUI.  And
> > > why should XAUI continue to carry the old baggage as the other
> > > standards do? XAUI was supposed to be easy.  After all, 3.125G is only
> > > 625MHz from 2.5G.  Wrong! XAUI chips have yet to be abundantly
> > > available.  This suggests that they are much harder to implement than
> > > people previously thought.  Any thoughts of XAUI being easy because it's
> > > "similar" to other standards are simply preposterous.
> > >
> > > This message is to open up this discussion to a wider audience and to
> > > get people thinking about this issue as we approach the next meeting.
> > > Any and all input on this matter is appreciated.
> > >
> > > Regards,
> > >
> > > -Allan
> >
> > --
> > ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> >  Mike Jenkins               Phone: 408.433.7901            _____
> >  LSI Logic Corp, ms/G715      Fax: 408.433.7461        LSI|LOGIC| (R)
> >  1525 McCarthy Blvd.       mailto:Jenkins@LSIL.com        |     |
> >  Milpitas, CA  95035         http://www.lsilogic.com      |_____|
> > ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~