2017-06-11, 10:49
Can someone tell me why the system graph of my detector always shows an offset of +50/60 mV from zero? Can I adjust this anywhere? In the web interface, I can't find any adjustment for this. It seems there is a wrong value in the ADC.
ST ID 1846
- Sampling per Input: Resolution: 12 bits, Period: 666 ns = 500.0 kSPS, Time: 2000 ns
- ADC1
- Input1 Amp-Channel 3 , Offset 26mV, no Trigger
- Input2 Amp-Channel 4 , Offset 30mV, Trigger -90mV/90mV
- ADC2
- Input1 Amp-Channel 2 , Offset 30mV, Trigger -95mV/95mV
- Input2 Amp-Channel 3 , Offset 25mV, no Trigger
- ADC3
- Input1 Amp-Channel 1 , Offset 39mV, Trigger -95mV/95mV
- Input2 Amp-Channel 2 , Offset -151mV, no Trigger
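Since the web interface apparently exposes no offset adjustment, one workaround is to subtract the reported per-channel DC offsets from the samples in software after readout. The sketch below is purely illustrative, assuming you can get the raw samples in mV; the offset table is copied from the configuration dump above, and the function names are hypothetical, not part of the detector's real API.

```python
# Per-channel DC offsets (in mV) as reported in the configuration dump.
OFFSETS_MV = {
    ("ADC1", "Input1"): 26,
    ("ADC1", "Input2"): 30,
    ("ADC2", "Input1"): 30,
    ("ADC2", "Input2"): 25,
    ("ADC3", "Input1"): 39,
    ("ADC3", "Input2"): -151,
}

def remove_offset(samples_mv, adc, channel):
    """Subtract the known DC offset from a list of raw samples (mV)."""
    offset = OFFSETS_MV[(adc, channel)]
    return [s - offset for s in samples_mv]

# Example: raw samples sitting around +26 mV on ADC1 / Input1
print(remove_offset([26, 27, 25], "ADC1", "Input1"))  # → [0, 1, -1]
```

This only hides the symptom in post-processing; if the offset comes from a miscalibrated ADC reference, fixing the calibration in the device itself would be the proper solution.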
Thanks for your help!