Tektronix Technical Forums are maintained by community involvement. Feel free to post questions or respond to questions by other members. Should you require a time-sensitive answer, please contact your local Tektronix support center here.
Desktop PC <-- (GPIB) --> 2651A <-- TSP Link --> 2651A
No external hardware trigger used
Both a current source and a voltage source are used, and we measure both parameters with the 2651A.
Keithley Series 2600/2600A/2600B Native LabVIEW 2009 Instrument Driver version 2.5.0
https://jp.tek.com/source-measure-units ... strument-d
I want to know how much jitter there is when the device starts executing a script sent from the LabVIEW software.
For example, suppose the following situation:
1. The operator launches the LabVIEW application.
2. The operator presses the start button and the script is sent (timer starts at 0:00).
3. The device receives the script (0:xx).
4. The device executes the first line of the script (0:xx).
How much time elapses between steps 3 and 4?
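One way to bound that delay from the host side is to send a minimal script whose first line echoes something back (e.g. a `print()`), and time the round trip repeatedly. A minimal sketch of such a harness, assuming the real send-and-confirm call is wrapped in a callable (the GPIB details and the `send_and_wait` callable are hypothetical, not part of the Keithley driver):

```python
import time
import statistics

def measure_start_latency(send_and_wait, trials=20):
    """Time each send-and-confirm round trip and report the spread.

    send_and_wait() is a hypothetical callable that sends the script
    over GPIB and blocks until the device confirms the first line
    executed (e.g. by reading back a print() echo).
    """
    samples = []
    for _ in range(trials):
        t0 = time.perf_counter()
        send_and_wait()
        samples.append(time.perf_counter() - t0)
    return min(samples), max(samples), statistics.mean(samples)

# Stand-in for the real GPIB round trip, just to show the call shape:
lo, hi, avg = measure_start_latency(lambda: time.sleep(0.001), trials=5)
print(f"peak-to-peak jitter: {(hi - lo) * 1e3:.2f} ms")
```

The round-trip time includes the GPIB transfer and the host-side driver overhead, so it is an upper bound on the step 3 to step 4 delay, but the trial-to-trial spread still shows how much jitter the whole path contributes.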
The reason I am asking is that, in general, our application works fine.
The application uses another SMU, which sources the voltage, while the 2651A performs the measurement.
When the SMU sources 5 V, the 2651A usually reads that value, but occasionally it reads 0 V.
We dug into this problem and understood a little: it seems the measurement timing has a jitter of 40 ms to 80 ms.
Why is this jitter there?
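If the occasional 0 V reading happens because the 2651A measures before the other SMU's output has settled, one host-side workaround (a sketch only, not part of the Keithley driver; `read_fn` and the threshold are hypothetical) is to retry a reading that lands in the implausible window:

```python
import time

def read_with_settle(read_fn, plausible_min=0.5, retries=3, delay_s=0.05):
    """Re-read when a value looks like the source had not settled yet.

    read_fn       -- hypothetical callable returning one voltage reading
    plausible_min -- readings below this (e.g. near 0 V when 5 V is
                     expected) are treated as "too early" and retried
    delay_s       -- wait between retries, sized to cover the observed
                     40-80 ms jitter window
    """
    value = read_fn()
    for _ in range(retries):
        if value >= plausible_min:
            break
        time.sleep(delay_s)
        value = read_fn()
    return value

# Simulated instrument: first read lands too early (0 V), next one settles.
readings = iter([0.0, 5.002])
print(read_with_settle(lambda: next(readings)))
```

A more robust fix, if the hardware allows it, is to synchronize the two instruments with a hardware trigger or TSP-Link trigger line instead of relying on software timing, since the script-start jitter then drops out of the measurement entirely.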