• Datasheets and probability

  • From John S@21:1/5 to All on Sat Dec 14 19:50:39 2024
    Hi, men -

    There are sometimes 3 columns on a datasheet which may contain min, nom,
    and max values. Like Vds for example.

    Is there any probability tacitly assigned to the values? I know that
    nominal is the average, and I assume that the value lies between the two
    middle one standard deviation points.

    1. Is that a valid assumption?

    2. Is the min between one and two standard deviations down from the mean?

    Thanks and I apologize for asking such a basic question in this
    professional group.

    Cheers,
    John

  • From Don Y@21:1/5 to John S on Sat Dec 14 23:39:26 2024
    On 12/14/2024 6:50 PM, John S wrote:
    There are sometimes 3 columns on a datasheet which may contain min, nom, and max values. Like Vds for example.

    Is there any probability tacitly assigned to the values? I know that nominal is
    the average, and I assume that the value lies between the two middle one standard deviation points.

    1. Is that a valid assumption?

    2. Is the min between one and two standard deviations down from the mean?

    Min and Max are "guaranteed" limits (as much as anything can be "guaranteed"). Typ(ical) is where the *process* tends to produce the most yield.

    That can change, over time. But, it will always be constrained by [min,max].
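
    A quick sketch of the practical upshot, with invented numbers (not from
    any real datasheet): worst-case design works from the max column, while
    typ only tells you what most parts will do.

        # Hypothetical worst-case conduction loss in a MOSFET switch.
        # All numbers are made up for illustration, not from a datasheet.
        i_load = 10.0          # load current, A
        rds_on_typ = 0.025     # "typ" on-resistance, ohms
        rds_on_max = 0.040     # guaranteed "max" on-resistance, ohms

        p_typ = i_load**2 * rds_on_typ   # what most parts dissipate: 2.5 W
        p_max = i_load**2 * rds_on_max   # what the thermal design must handle: 4.0 W
        print(f"typ loss {p_typ:.1f} W, worst-case loss {p_max:.1f} W")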

  • From Bill Sloman@21:1/5 to John S on Sun Dec 15 17:33:05 2024
    On 15/12/2024 12:50 pm, John S wrote:
    Hi, men -

    There are sometimes 3 columns on a datasheet which may contain min, nom,
    and max values. Like Vds for example.

    Is there any probability tacitly assigned to the values? I know that
    nominal is the average, and I assume that the value lies between the two middle one standard deviation points.

    1. Is that a valid assumption?

    2. Is the min between one and two standard deviations down from the mean?

    Thanks and I apologize for asking such a basic question in this
    professional group.

    I doubt very much that this has ever been spelled out, but a span of
    two standard deviations above and below the mean would include about
    95% of a normally distributed variable, which would mean that one
    customer in twenty might complain about getting an out-of-spec
    component.

    I haven't had to complain that often, which suggests that the industry
    has adopted something more like a three standard deviation span, which
    would include about 99.7% of a normally distributed variable.

    I have bitched about duff components enough to fit that, but most of
    them were fine at room temperature and only fell out of spec after
    they'd warmed up, or to be more precise, after they'd been stuck in a
    box in an enclosed rack and had had time to warm up a bit.
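
    Those coverage figures (about 95% inside two sigma, about 99.7% inside
    three) are easy to check, assuming a normal distribution; a minimal
    Python sketch:

        # Fraction of a normal distribution inside +/- k standard deviations.
        from math import erf, sqrt

        for k in (1, 2, 3):
            inside = erf(k / sqrt(2))
            print(f"+/-{k} sigma: {inside:.2%} inside, {1 - inside:.2%} outside")
        # +/-2 sigma -> ~95.45% inside; +/-3 sigma -> ~99.73% inside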

    --
    Bill Sloman, Sydney

  • From Uwe Bonnes@21:1/5 to John S on Sun Dec 15 12:11:10 2024
    John S <Sophi.2@invalid.org> wrote:
    Hi, men -

    There are sometimes 3 columns on a datasheet which may contain min, nom,
    and max values. Like Vds for example.

    Is there any probability tacitly assigned to the values? I know that
    nominal is the average, and I assume that the value lies between the two middle one standard deviation points.

    1. Is that a valid assumption?

    2. Is the min between one and two standard deviations down from the mean?

    Thanks and I apologize for asking such a basic question in this
    professional group.

    Parameters often have a Gaussian distribution. You can then expect the
    typ value to sit at the center of the bell curve, with the min/max
    values acting as cut-offs. With a cut-off at 1 sigma, about 32% of the
    parts would be out of tolerance, so the cut-off is placed at a higher
    sigma value, but manufacturers will not tell you at what value it is.
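
    A rough Monte Carlo sketch of that screening picture, assuming a
    Gaussian parameter with limits placed at 3 sigma; the parameter, its
    sigma and the limits are all invented for illustration:

        # Draw parts from a Gaussian, then screen them against min/max limits.
        import random

        typ, sigma = 2.00, 0.05     # hypothetical parameter: typ 2.00 V, sigma 50 mV
        vmin, vmax = 1.85, 2.15     # datasheet limits placed at +/- 3 sigma

        parts = [random.gauss(typ, sigma) for _ in range(100_000)]
        shipped = [p for p in parts if vmin <= p <= vmax]
        print(f"yield: {100 * len(shipped) / len(parts):.2f}%")   # ~99.7%
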
    --
    Uwe Bonnes bon@elektron.ikp.physik.tu-darmstadt.de

    Institut fuer Kernphysik Schlossgartenstrasse 9 64289 Darmstadt
    --------- Tel. 06151 1623569 ------- Fax. 06151 1623305 ---------

  • From Martin Brown@21:1/5 to John S on Sun Dec 15 13:08:02 2024
    On 15/12/2024 01:50, John S wrote:
    Hi, men -

    There are sometimes 3 columns on a datasheet which may contain min, nom,
    and max values. Like Vds for example.

    Is there any probability tacitly assigned to the values? I know that
    nominal is the average, and I assume that the value lies between the two middle one standard deviation points.

    1. Is that a valid assumption?

    2. Is the min between one and two standard deviations down from the mean?

    Thanks and I apologize for asking such a basic question in this
    professional group.

    It is actually a rather deep and difficult-to-answer question in
    general, because for some components the answer can be "it depends".

    Semiconductors I generally take it to mean ~3 sigma either side of the
    mean but design with a bit of extra margin so the 0.5% tail doesn't
    cause trouble.

    But for some components like resistors and capacitors that may be
    obtained in 10%, 5%, 2%, 1%, 0.1% tolerances you can find that the
    frequency distribution of the components in the wider tolerance bins
    consists of values that are almost *never* inside the narrower ones.
    IOW you are guaranteed at least 2% error in the 5% parts.
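
    A toy simulation of that selection effect, assuming the old-style flow
    where one production run is sorted into tolerance bins tightest-first
    (all numbers invented):

        # Pull the tighter-tolerance parts out first; see what is left for the 5% bin.
        import random

        nominal = 1000.0        # hypothetical 1 kohm resistor line
        parts = [random.gauss(nominal, 0.02 * nominal) for _ in range(100_000)]

        tight = [r for r in parts if abs(r - nominal) <= 0.02 * nominal]
        loose = [r for r in parts
                 if 0.02 * nominal < abs(r - nominal) <= 0.05 * nominal]

        print(f"{len(tight)} parts binned at 2% or better")
        print(f"{len(loose)} parts left for the 5% bin -- every one has >2% error")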

    It is a bit better today than it used to be when they made batches and
    then selected from the process output. These days it is all a lot more reproducible and laser trimmed for precision parts.

    --
    Martin Brown

  • From Phil Hobbs@21:1/5 to Martin Brown on Sun Dec 15 13:59:15 2024
    Martin Brown <'''newspam'''@nonad.co.uk> wrote:
    On 15/12/2024 01:50, John S wrote:
    Hi, men -

    There are sometimes 3 columns on a datasheet which may contain min, nom,
    and max values. Like Vds for example.

    Is there any probability tacitly assigned to the values? I know that
    nominal is the average, and I assume that the value lies between the two
    middle one standard deviation points.

    1. Is that a valid assumption?

    2. Is the min between one and two standard deviations down from the mean?

    Thanks and I apologize for asking such a basic question in this
    professional group.

    It is actually a rather deep and difficult to answer question in general because for some components the answer can be "it depends".

    Semiconductors I generally take it to mean ~3 sigma either side of the
    mean but design with a bit of extra margin so the 0.5% tail doesn't
    cause trouble.

    Nah, all parts nowadays follow their spice models exactly.

    Cheers

    Phil “alias Dr Tung In Chic” Hobbs


    --
    Dr Philip C D Hobbs
    Principal Consultant
    ElectroOptical Innovations LLC / Hobbs ElectroOptics
    Optics, Electro-optics, Photonics, Analog Electronics

  • From john larkin@21:1/5 to '''newspam'''@nonad.co.uk on Sun Dec 15 07:56:06 2024
    On Sun, 15 Dec 2024 13:08:02 +0000, Martin Brown
    <'''newspam'''@nonad.co.uk> wrote:

    On 15/12/2024 01:50, John S wrote:
    Hi, men -

    There are sometimes 3 columns on a datasheet which may contain min, nom,
    and max values. Like Vds for example.

    Is there any probability tacitly assigned to the values? I know that
    nominal is the average, and I assume that the value lies between the two
    middle one standard deviation points.

    1. Is that a valid assumption?

    2. Is the min between one and two standard deviations down from the mean?

    Thanks and I apologize for asking such a basic question in this
    professional group.

    It is actually a rather deep and difficult to answer question in general
    because for some components the answer can be "it depends".

    Semiconductors I generally take it to mean ~3 sigma either side of the
    mean but design with a bit of extra margin so the 0.5% tail doesn't
    cause trouble.


    I can imagine that production test and trim create a nonsymmetric
    distribution of some spec, and chop off the tails. And make a hole in
    the middle where premium parts have been binned away to sell at a
    higher price.


    But for some components like resistors and capacitors that may be
    obtained in 10%, 5%, 2%, 1%, 0.1% tolerances you can find that the
    frequency distribution of the components in the wider tolerance bins
    consists of values that are almost *never* inside the narrower ones.
    IOW you are guaranteed at least 2% error in the 5% parts.

    That doesn't seem to be the pattern nowadays. We buy ultra-cheap 1%
    resistors by the reel, and they are really 0.1% resistors. In a
    production line, with laser trimming, it seems like all the resistors
    are right on.

    There's no reason to buy surface-mount resistors that are spec'd worse
    than 1%. It's easier to design knowing all the various Rs in stock are
    better than 1%.

    What we pay extra for is low tempco. And those are usually way better
    than their spec. Tempco is a nuisance to test for at the PPM level.
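
    Back-of-the-envelope on why tempco can matter more than the nominal
    tolerance; the 25 ppm/K part and the 50 K warm-up are assumptions for
    the sketch:

        # Drift of a hypothetical 25 ppm/K resistor over a 50 K temperature rise.
        tempco_ppm_per_k = 25      # assumed temperature coefficient
        delta_t_k = 50             # assumed rise inside a warm enclosure

        drift_percent = tempco_ppm_per_k * delta_t_k * 1e-6 * 100
        print(f"drift: {drift_percent:.3f}%")   # 0.125%, more than a 0.1% tolerance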

    Caps are not usually laser trimmed so are not as good. And they have
    plenty of other issues that work against accuracy.




    It is a bit better today than it used to be when they made batches and
    then selected from the process output. These days it is all a lot more
    reproducible and laser trimmed for precision parts.

    Some part data sheets have histograms, which can hint at what you'll
    see in real life.

    Abs max voltage ratings are especially interesting. Most silicon
    mosfets seem to zener reliably at 1.2x abs max drain voltage. Some
    have gate zeners, some don't. Some fets and RF parts are reliable at
    2x abs max data sheet voltage.

    RF Schottky detector diodes can have silly ratings like 2 volts max
    reverse. If you need their absurdly low drop and capacitance, you've
    got to test them and break the rules.

  • From Chris Jones@21:1/5 to John R Walliker on Mon Dec 16 21:07:25 2024
    On 15/12/2024 11:44 pm, John R Walliker wrote:
    On 15/12/2024 12:11, Uwe Bonnes wrote:
    John S <Sophi.2@invalid.org> wrote:
    Hi, men -

    There are sometimes 3 columns on a datasheet which may contain min, nom,
    and max values. Like Vds for example.

    Is there any probability tacitly assigned to the values? I know that
    nominal is the average, and I assume that the value lies between the two
    middle one standard deviation points.

    1. Is that a valid assumption?

    2. Is the min between one and two standard deviations down from the
    mean?

    Thanks and I apologize for asking such a basic question in this
    professional group.

    Parameters often have a Gaussian distribution. You can then expect the
    typ value to sit at the center of the bell curve, with the min/max
    values acting as cut-offs. With a cut-off at 1 sigma, about 32% of the
    parts would be out of tolerance, so the cut-off is placed at a higher
    sigma value, but manufacturers will not tell you at what value it is.

    In the case of some TI op-amps there are different grades for parameters
    like offset voltage.  They mark the package of the higher spec versions before testing them and then throw away any that don't meet that higher spec.  I was told this by a TI applications engineer.
    Apparently it is cheaper to throw away a few op-amps than to have
    branches in the production line to cope with different grades.
    This does suggest that most devices are much closer to the typical
    values than one might expect from the limiting values.
    For parameters that take a long time to test the typical values may
    be much better than the limit values.
    John


    Another approach that I have heard of is to test all of the parts to the
    tightest limits (which they would have been designed to pass), keep only
    one pile of stock, and then mark the parts with whichever grade the
    customers order and pay for. I don't know whether this actually happens
    much. If it does, then it would not suit the manufacturer for customers
    to know.

  • From Chris Jones@21:1/5 to John S on Mon Dec 16 21:23:48 2024
    On 15/12/2024 12:50 pm, John S wrote:
    Hi, men -

    There are sometimes 3 columns on a datasheet which may contain min, nom,
    and max values. Like Vds for example.

    Is there any probability tacitly assigned to the values? I know that
    nominal is the average, and I assume that the value lies between the two middle one standard deviation points.

    1. Is that a valid assumption?

    2. Is the min between one and two standard deviations down from the mean?

    Thanks and I apologize for asking such a basic question in this
    professional group.

    Cheers,
    John

    For anything that has been trimmed (whether or not you know that it has
    been trimmed, especially digitally trimmed), assuming that parts will
    have a gaussian distribution in parameters is a mistake. If the part can auto-calibrate itself in use, even more so.

    Also, batch-to-batch variation often exceeds part-to-part variation
    within a batch. They might not know the batch-to-batch variation at the
    time when they are writing the datasheet, as there is probably only one
    batch. They can run "skew lots" where they ask the fab to deliberately
    adjust the process parameters of some wafers to the upper and lower
    limits of some parameters, but this is never exhaustive, and so the
    limits in the datasheet will likely be sand-bagged (overly cautious) to
    some degree.

    Also, specs that are not a major selling point would likely be chosen to
    be very easy to meet, because they really do not want to be throwing
    away parts because of a spec that nobody cares about. For example, if
    you buy a low-noise amplifier, you might reasonably expect that the
    noise figure specification (which involves a trade-off with power and/or
    chip area) is chosen such that they can meet it on say 99% of the
    untested parts, so some of them will be only just passing by a margin
    equal to the uncertainty of the tester calibration. On the other hand,
    if there is a CMOS logic enable pin of the low-noise amplifier which
    typically has a leakage of a couple of femtoamps, it might be specced
    with a maximum leakage of 10 microamps, just because it would be stupid
    to throw out one of these amplifiers if the package was slightly dirty
    and leaked a picoamp instead of a femtoamp, and unreasonably expensive
    to configure the tester to be able to tell the difference.
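
    A minimal sketch of that guard-banding arithmetic, assuming a roughly
    normal parameter; the noise-figure numbers are invented, not from any
    real LNA datasheet:

        # Where does the spec limit sit if ~99% of untested parts must pass?
        from statistics import NormalDist

        nf_typ, nf_sigma = 0.9, 0.15     # hypothetical noise figure, dB
        target_pass = 0.99

        z = NormalDist().inv_cdf(target_pass)     # ~2.33 sigma of guard band
        nf_max = nf_typ + z * nf_sigma
        print(f"spec limit ~ {nf_max:.2f} dB for a {target_pass:.0%} pass rate")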

  • From Klaus Kragelund@21:1/5 to Chris Jones on Mon Dec 30 14:37:59 2024
    On 16-12-2024 11:23, Chris Jones wrote:
    On 15/12/2024 12:50 pm, John S wrote:
    Hi, men -

    There are sometimes 3 columns on a datasheet which may contain min,
    nom, and max values. Like Vds for example.

    Is there any probability tacitly assigned to the values? I know that
    nominal is the average, and I assume that the value lies between the
    two middle one standard deviation points.

    1. Is that a valid assumption?

    2. Is the min between one and two standard deviations down from the mean?

    Thanks and I apologize for asking such a basic question in this
    professional group.

    Cheers,
    John

    For anything that has been trimmed (whether or not you know that it has
    been trimmed, especially digitally trimmed), assuming that parts will
    have a gaussian distribution in parameters is a mistake. If the part can auto-calibrate itself in use, even more so.

    Also, batch-to-batch variation often exceeds part-to-part variation
    within a batch. They might not know the batch-to-batch variation at the
    time when they are writing the datasheet, as there is probably only one batch. They can run "skew lots" where they ask the fab to deliberately
    adjust the process parameters of some wafers to the upper and lower
    limits of some parameters, but this is never exhaustive, and so the
    limits in the datasheet will likely be sand-bagged (overly cautious) to
    some degree.

    Also, specs that are not a major selling point would likely be chosen to
    be very easy to meet, because they really do not want to be throwing
    away parts because of a spec that nobody cares about. For example, if
    you buy a low-noise amplifier, you might reasonably expect that the
    noise figure specification (which involves a trade-off with power and/or
    chip area) is chosen such that they can meet it on say 99% of the
    untested parts, so some of them will be only just passing by a margin
    equal to the uncertainty of the tester calibration. On the other hand,
    if there is a CMOS logic enable pin of the low-noise amplifier which typically has a leakage of a couple of femtoamps, it might be specced
    with a maximum leakage of 10 microamps, just because it would be stupid
    to throw out one of these amplifiers if the package was slightly dirty
    and leaked a picoamp instead of a femtoamp, and unreasonably expensive
    to configure the tester to be able to tell the difference.


    A guy on the EEVblog forum has done a lot of measurements on resistors:

    https://www.eevblog.com/forum/projects/smd-resistor-distributions/
