
Re: [10GMMF] Notes from Aug 3rd Meeting on TP3 Definition



Hi - after some conversation with Mike, here is my view of TP3 (& 2) testing and some suggestions. Much of this has already been expressed by others.
 
Goals
I have 3 goals that are not mutually exclusive:
1. The tests should reasonably assure interoperability;
2. The tests used should be common and consistent among all users;
3. The tests should be as simple as possible.
Are there more?
 
Questions to ask:
Can we agree on a set of goals?
Can we get some agreement on what is meant by "reasonably"? There is no such thing as a perfect test, but how rigorous do we want to be?
 
An important point - there is a difference between rigor and complexity. I generally tend towards rigor, but I also prefer simplicity. Due to schedule pressures, 802.3ae did not find the right combination, and in fact it could be argued that we did not do particularly well in either respect.
 
Willingness to compromise?
I believe the philosophical positions in our case approach the goals from different starting points. As long as the group that wants to start with simple tests is willing to add components as justified by rigor, and as long as the group that wants to start with complex tests is willing to remove test components as justified by simplicity, we should be able to get through this. So I want to stand in the middle as long as I sense a genuine willingness to compromise from each direction.
 
An approach
For me to progress, I need more information. Perhaps we can approach this apparent impasse by examining each test component that is at issue:
1. Magnitude or significance (in dB, I suppose) and how variable the magnitude is with product implementation.
2. Test implementation (setup, method - with a view on complexity, cost, time to test, etc.); Lew has suggested this already.
3. For simplification, can rigor still be assured if the test is replaced by a fixed OMA penalty term? Would the penalty be fair to all product implementations?
 
If we do this for the test components that are at issue (dynamic penalty, RIN - are there others?), hopefully we'll be able to gather enough information for some of us to make informed decisions. Should a smaller group be formed for each test component being discussed?
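To make the penalty-substitution idea in item 3 concrete, here is a trivial sketch. All of the numbers are hypothetical and are not a proposal; the point is only that extracting an impairment from the physical conformance signal would be compensated by testing at a correspondingly lower OMA, so the overall rigor is (nominally) preserved:

```python
import math

def substitute_penalty(stressed_oma_dbm, extracted_penalty_db):
    """If an impairment worth `extracted_penalty_db` is removed from the
    physical conformance signal, test at a correspondingly lower OMA."""
    return stressed_oma_dbm - extracted_penalty_db

def dbm_to_mw(p_dbm):
    """Convert power in dBm to mW."""
    return 10 ** (p_dbm / 10)

# hypothetical figures: -12.0 dBm stressed sensitivity (OMA), with
# 1.5 dB allocated to an impairment the vendor extracts from the test
adjusted = substitute_penalty(-12.0, 1.5)
print(f"{adjusted:.1f} dBm OMA = {dbm_to_mw(adjusted) * 1000:.1f} uW")
```

Whether such a fixed allocation is fair across product implementations is exactly the open question in item 3.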
 
Finally, even if we do wind up with a complex test, I would like to see the document informatively (Annex?) recommend/document ways to simplify tests with extracted components (assuming it can be justified). This would encourage the goal of common tests.
 
Tom Lindsay
ClariPhy
tlindsay@ieee.org
phone: (425) 775-7013
cell: (206) 790-3240
 
 
----- Original Message -----
Sent: Thursday, August 05, 2004 12:31 PM
Subject: [10GMMF] Notes from Aug 3rd Meeting on TP3 Definition

Hi Everyone,

Here are the notes from today's first meeting of the TP3 Definition Group

For attendees please see attached spreadsheet. If I missed anyone please let me know and I'll add them to the list.

Notes on Agenda Items
=====================
1. List attendees

Abbot, John
Aronson, Lew
Balasubramonian, Venu
Dawe, Piers
Ewen, John
Fiedler, Jens
Gomatam, Badri
Jaeger, John
Latchman, Ryan
Lawton, Mike
Lindsay, Tom
Pepeljogski, Petar
Popescu, Petre
Rommel, Albrecht
Sauer, Mike
Shanbhag, Abhijit
Sun, Yi
Swenson, Norm
Thon, Lars
Traverso, Matt
Weiner, Nick
Witt, Kevin

2. Discuss Goals/Scope
======================

Some discussion here. Action for Mike to re-word the draft version. This now reads as:-

        o Present a proposal for TP3 specification parameters and associated conformance testing at the September Meeting

In support of this we need to:-

        1. Agree on our Overall Test Philosophy (spelling out the normative and informative tests required)
        2. Determine and validate specific metrics which will be used to simplify test requirements
        3. Provide detailed proposals regarding each individual test

These goals are my effort to capture the conversation and will be reviewed at the next meeting.

3. Agree Modus Operandi
=======================
        - will meet weekly on Tuesday
        Dial in info:

International Direct Dial (650) 599-0374
Meeting ID:                     801803

        - Smaller sub groups may form as the work takes shape and can be divided up.
        - Post meeting notes and invites on the 10GMMF website

4. Agree Overall Testing Philosophy
===================================

Agreement was not reached. The discussion focused on the stressed eye test, with two viewpoints discussed. The
debate centered on whether the stressed eye test should accurately represent all known impairments (which could subsequently be reduced in testing by the vendor, depending on their design choices) or whether the test should be made as simple as could be defended, in order to keep it practical and repeatable.

I think there was agreement within the group to ultimately provide a simple informative test, but there was not agreement on the complexity vs. completeness of the normative test.


Additional agenda items for subsequent meetings are as follows:-
===============================================================

5. Develop a detailed work list (& interested parties) for more detailed analysis of individual tests:-

        The following points have been taken from Lew's presentation (link above):-

                - Normative Stressed Rx Sensitivity Test
                        o Is a 2 ray model sufficient as an ISI generator?
                        o What is the best choice for relative peak height and separation?
                        o Is a certain maximum bandwidth important?
                        o Define measurement for OMA
                        o Determine S/N requirements of compliance signal
                        o Define test procedures for calibrating conformance test signal

                - Dynamic Adaptation Test
                        o Determine rate of adaptation required
                        o Determine channel simulator and time varying component
                        o Define exact test method
                        o Choose appropriate allocation for Dynamic Adaptation Penalty

                - Informative Sensitivity Test
                        o Goals for test
                        o Use simple Gaussian ISI or 2 ray model for channel
                        o Detailed parameters of that channel emulator
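As background to the 2 ray model questions above, here is a minimal sketch of a two-ray channel as an ISI generator. This is not the task force's model: the 70/30 split, the one-bit delay, the rectangular NRZ pulses, and the simple vertical eye metric are all illustrative assumptions chosen to show how the relative peak height and separation control the eye closure.

```python
import numpy as np

def two_ray_channel(bits, spb, split, delay):
    """Pass a rectangular NRZ waveform (spb samples per bit) through a
    two-ray channel: h[n] = split*delta[n] + (1-split)*delta[n-delay]."""
    wave = np.repeat(bits, spb).astype(float)
    h = np.zeros(delay + 1)
    h[0], h[-1] = split, 1.0 - split
    return np.convolve(wave, h)[: wave.size]

def eye_opening(wave, spb):
    """Worst-case vertical eye opening at the bit centres, normalised
    to the full 0..1 swing (edge bits skipped)."""
    centres = wave[spb // 2 :: spb][1:-1]
    return centres[centres >= 0.5].min() - centres[centres < 0.5].max()

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 4000)
spb = 16

# a 70/30 split with the second ray delayed by one bit period
rx = two_ray_channel(bits, spb, split=0.7, delay=spb)
print(f"normalised eye opening: {eye_opening(rx, spb):.2f}")
```

Sweeping `split` and `delay` in a sketch like this is one way to explore the "relative peak height and separation" question, though a real compliance signal would also need the bandwidth and S/N items addressed.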

6. Discuss the impact of having the ISI block prior to the E-to-O conversion.
                        - Effect of cross products and non-linear ISI
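A toy numerical check of why the ordering in item 6 matters: a linear ISI filter followed by a memoryless E/O nonlinearity is not equivalent to the nonlinearity followed by the filter, because the nonlinearity generates cross products between the direct and delayed rays in the first case only. The quadratic nonlinearity and the filter taps below are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.integers(0, 2, 1000).astype(float)

# hypothetical two-tap ISI filter and a mild quadratic E/O nonlinearity
h = np.array([0.7, 0.3])
eo = lambda v: v + 0.2 * v**2

isi_then_eo = eo(np.convolve(x, h)[: x.size])   # ISI applied electrically
eo_then_isi = np.convolve(eo(x), h)[: x.size]   # ISI applied optically

# nonzero wherever the filter mixes a 0 and a 1 (i.e. at transitions)
print(f"max difference: {np.max(np.abs(isi_then_eo - eo_then_isi)):.3f}")
```

Even this crude model shows the two orderings diverge at bit transitions, which is where a test built on an electrical ISI block could misrepresent an optically dispersed signal.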

Please let me know if you have any comments/corrections.

Best Regards

Mike

+44 1473 465200