Suggested simulations for LVD timing meeting

Robert Elliott relliott at hobbit.eng.hou.compaq.com
Fri May 21 15:15:52 PDT 1999


* From the T10 Reflector (t10 at symbios.com), posted by:
* relliott at hobbit.eng.hou.compaq.com (Robert Elliott)
*
To: all the pad designers planning to attend the LVD timing meeting 
next week in Colorado Springs


To explore the issues, you may want to run SPICE (or equivalent) on 
your SCSI LVD input receiver and measure the input delays for 
these cases.


1. The best case signal meeting the rules.  

Start level at +175 mV, then change almost instantly to -175 mV.  
This should cause the fastest switching possible in the receiver.  

Repeat for a positive edge transition.
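
For concreteness, here is a minimal Python sketch that emits a SPICE 
PWL stimulus for this case.  The node names (inp, inn), the 10 ps 
stand-in for "almost instantly," and the 5 ns settle times are 
assumptions; adapt them to your own deck.

    def pwl_source(name, nodes, points):
        # Format a SPICE piecewise-linear source; times in ns, levels in mV.
        pts = " ".join(f"{t:g}n {v:g}m" for t, v in points)
        return f"V{name} {nodes} PWL({pts})"

    # Case 1, negative edge: settle at +175 mV, then fall to -175 mV in ~10 ps.
    print(pwl_source("in", "inp inn",
                     [(0, 175), (5, 175), (5.01, -175), (10, -175)]))
    # prints: Vin inp inn PWL(0n 175m 5n 175m 5.01n -175m 10n -175m)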

2. Worst case signals meeting the rules.  
Total time from 0 mV to -175 mV is 3 ns.  

2a.  Start level at +175 mV, then drop to -60 mV instantly, then 
stay at that level until 3 ns and drop to -175 mV instantly.

2b.  Start level at +175 mV, then drop to -60 mV (taking almost 
3 ns from 0 mV to -60 mV), then drop to -175 mV instantly.

Repeat for positive edge transitions. 
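
The same kind of sketch covers 2a and 2b.  Again the node names and 
the 10 ps "instant" edges are assumptions; the 3 ns figure is 
measured from the 0 mV crossing to -175 mV, as stated above.

    def pwl_source(name, nodes, points):  # same helper as in the case 1 sketch
        pts = " ".join(f"{t:g}n {v:g}m" for t, v in points)
        return f"V{name} {nodes} PWL({pts})"

    # 2a: instant drop to -60 mV at 5 ns, dwell 3 ns, instant drop to -175 mV.
    print(pwl_source("in2a", "inp inn",
                     [(0, 175), (5, 175), (5.01, -60),
                      (8, -60), (8.01, -175), (12, -175)]))
    # 2b: instant drop to 0 mV, ~3 ns ramp to -60 mV, instant drop to -175 mV.
    print(pwl_source("in2b", "inp inn",
                     [(0, 175), (5, 175), (5.01, 0),
                      (8, -60), (8.01, -175), (12, -175)]))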


3.  General characterization.

Start the signal at +175 mV.  Switch to these voltage levels:
    +30 mV
    0 mV
    -30 mV
    -60 mV
    -90 mV
    -120 mV
    -150 mV
    -175 mV
with these edge rates:
    -50 mV/ns
    -100 mV/ns
    -200 mV/ns
    -300 mV/ns
    -400 mV/ns
    -500 mV/ns
    -1000 mV/ns

and measure how long the receiver takes to switch.

Repeat for positive edge transitions from -175 mV.
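
If it helps, the whole matrix in 3) can be generated mechanically.  
A sketch, with the same assumed node names; each level/rate pair 
would normally be its own simulation run:

    levels = [30, 0, -30, -60, -90, -120, -150, -175]  # target levels, mV
    rates = [50, 100, 200, 300, 400, 500, 1000]        # edge rates, mV/ns
    start = 175.0                                      # starting level, mV
    for level in levels:
        for rate in rates:
            t_edge = (start - level) / rate            # ramp duration, ns
            pts = [(0, start), (5, start),
                   (5 + t_edge, level), (10 + t_edge, level)]
            pwl = " ".join(f"{t:g}n {v:g}m" for t, v in pts)
            print(f"* target {level} mV at -{rate} mV/ns")
            print(f"Vin inp inn PWL({pwl})")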


Consider the results from 1) and 2).  What if the clock signal has
the best case waveform and the data signal has the worst case 
waveform?  Assume the worst combination of rising/falling edges
(in DT, they all matter).  Do the receiver input delay differences 
eat up all of the 1.25 ns ASIC setup and hold time?  When the 
ASIC designers do timing analysis, what numbers are they using for 
LVD pad skews?  If using SPICE, are they running best/worst case 
waveforms or assuming some similarity?
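
The arithmetic behind that question is easy to check once the 
simulations are done.  A sketch with made-up delay numbers, purely 
for illustration; substitute your own SPICE results:

    t_clk_best = 0.40   # ns, receiver delay on the case 1 (fastest) clock edge
    t_dat_worst = 1.90  # ns, receiver delay on the case 2 (slowest) data edge
    budget = 1.25       # ns, ASIC setup and hold time quoted above

    skew = t_dat_worst - t_clk_best
    if skew > budget:
        print(f"pad skew {skew:.2f} ns exceeds the {budget} ns budget")
    else:
        print(f"pad skew {skew:.2f} ns leaves {budget - skew:.2f} ns of margin")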


Other things to consider:
When "shall detect" is 60 mV, there's probably too much variation
in receiver delay for an ASIC to work correctly if the system
delivers legal but different signals.  It's fair to require more
similarity of signal edges from the system than the current spec
requires, but how can you express that? 

If the "shall detect" level were specified at 175 mV and "may detect" 
were specified at 0 mV, the ASICs would probably all work fine no matter
how the signals look getting to 175 mV.  However, this puts an extra 
burden on the system board/interconnect.   

Is there some level (e.g. 115 mV) where the pad variation is
well within the 1.25 ns setup time, so no additional signal
correlation requirements on the interconnect need to be created?
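
One way to answer that from the sweep in 3): tabulate the delay 
spread (max minus min receiver delay over all legal waveforms) at 
each candidate "shall detect" level and see where it fits inside 
the budget.  The numbers below are invented placeholders:

    # Hypothetical spread (ns) of receiver delay vs. "shall detect" level (mV).
    spread = {60: 1.6, 90: 1.1, 115: 0.7, 145: 0.4, 175: 0.2}
    budget = 1.25  # ns

    ok = [level for level in sorted(spread) if spread[level] < budget]
    print(f"levels whose delay spread fits within {budget} ns: {ok}")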

-- 
Rob Elliott   UNIX mailto:relliott at hobbit.eng.hou.compaq.com    
Houston, TX     PC mailto:Robert.Elliott at compaq.com
*
* For T10 Reflector information, send a message with
* 'info t10' (no quotes) in the message body to majordomo at symbios.com




