
Measurement precision increases with source current

heyday33
Posts: 1
Joined: December 7th, 2020, 11:41 am
Country: United States

Measurement precision increases with source current

Post by heyday33 » December 7th, 2020, 11:47 am

Hi everyone!

First post here. I made an observation with the Keithley 2636 while doing a 4-point (Kelvin) resistance measurement, and wanted to ask whether there is an explanation for the behavior I am seeing. Understanding it will help me get better measurements from the 2636. (I have also been seeing the same trend doing resistance measurements with my parameter analyzer.)

At a source current of 1 mA, I get a resistance measurement that is stable to 0.0XXXX Ohms, i.e. the reading varies on the order of tens of milliohms. With an increased source current of 10 mA, the precision of my measurement seems to improve, and the reading is stable to the third decimal place (variation on the order of milliohms).

Is the variation I see noise? Could you help me understand why it decreases as the source current increases? If it is indeed noise, I would expect it to be constant regardless of source current, or to increase with it. I also wanted to know if this happens because I am reducing the instrument's sensitivity to voltage.

The instrument's voltage resolution is 1-2 microvolts, so shouldn't I be getting a measurement precise to 0.001 millivolts whenever the source current is in the milliamp range, regardless of whether it is 1 mA or 10 mA?

(In this whole experiment I do not specify the measurement range of the instrument; I only specify the limits and the source current.)

Andrea C
Keithley Applications
Posts: 1548
Joined: October 15th, 2010, 10:35 am
Country: United States

Re: Measurement precision increases with source current

Post by Andrea C » December 9th, 2020, 6:24 am

Is it just that the signal-to-noise ratio improves with increasing current, and therefore with the increased signal level?
For example, if the DUT is 1 Ω, at 1 mA you have only 1 mV of response to measure; at 10 mA, you have 10 mV.

Using the 200 mV measure range on the 2636B, the errors compute out this way:
[Attachment: 2636B_200mV_spec.PNG (error budget computed for the 200 mV range)]
The 10 mV signal level has lower relative error, lower standard deviation across N samples, etc.
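
If it helps to see the arithmetic, here is a minimal Python sketch of that error-budget calculation. The gain and offset constants below are placeholder values I am assuming for illustration, not the actual 2636B datasheet numbers, so substitute the specs from the data sheet before relying on the result.

[code]
# Error budget for a DC voltage reading: +/-(gain_error * reading + offset).
# ASSUMED placeholder constants -- replace with the 2636B 200 mV range specs.
GAIN_ERROR = 0.00015   # assumed 0.015% of reading
OFFSET_V   = 225e-6    # assumed 225 uV fixed offset

def error_budget(v_reading):
    """Return (absolute error in volts, relative error) for one reading."""
    abs_err = GAIN_ERROR * v_reading + OFFSET_V
    return abs_err, abs_err / v_reading

# 1 mV (1 mA through 1 ohm) vs. 10 mV (10 mA through 1 ohm)
for v in (1e-3, 10e-3):
    abs_err, rel_err = error_budget(v)
    print(f"{v * 1e3:5.1f} mV reading: +/-{abs_err * 1e6:6.1f} uV ({rel_err:.2%})")
[/code]

With these assumed constants, the fixed offset term dominates at a 1 mV reading (about 22% relative error) but contributes only a couple of percent at 10 mV, which is why the larger source current gives more stable resistance readings.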
