Calibrating Device Input Levels

Calibrating Input Voltage

  • Warning: Before connecting anything to your audio device, make sure the signal voltages are not too high for your device. Most audio input devices should handle sinusoidal (single-frequency) signal levels of up to 1 Vrms, although a headset mic input may only accommodate much lower input voltages. It's best to start low and increase input levels as necessary to avoid damaging your device.
  • If you already know the voltage sensitivity of your input device (in V/FS), you can enter it directly in the Input Sensitivity text box of the device calibration view and skip the rest of the calibration procedure.
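Since reference levels are specified in rms volts, it helps to remember that a pure sine tone swings well above its rms value: the peak amplitude is the rms value times √2. The following Python sketch (for illustration only; SignalScope performs such conversions internally) shows the relationship:

```python
import math

def sine_peak(rms_volts):
    """Peak voltage of a pure sinusoid with the given rms value."""
    return rms_volts * math.sqrt(2)

def sine_rms(peak_volts):
    """rms value of a pure sinusoid with the given peak voltage."""
    return peak_volts / math.sqrt(2)

# A 1 Vrms sine actually swings to about +/-1.414 V peak.
print(round(sine_peak(1.0), 3))  # 1.414
```

So a "1 Vrms" reference tone places roughly ±1.4 V at the input terminals, which is worth keeping in mind against the warning above.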
  1. Connect the output of an AC voltage reference generator to the first available input channel of the current input device (headset or Lightning input). Remember, we're talking about low AC voltages (e.g., 1 Vrms or less).
    • If you have a calibrated AC rms voltmeter handy, and you have the Advanced or Pro Tool Set subscription, you can produce the reference signal with SignalScope's signal generator. In that case, you'll need to measure the output voltage with your voltmeter, so you know what voltage you are applying to the input.
    • Make sure the signal generator is turned on (in the Sig Gen tab) before proceeding to the next step.
    • It might be a good idea to check the input signal in the Oscilloscope display to be sure the signal looks good and that there is no visible distortion of the waveform.
  2. Open the I/O Device Configuration menu.
    • The I/O Device Configuration menu is directly accessible from the left side of the toolbar by tapping the microphone icon.
  3. In the top row of the Input Options table, make sure the desired input device is selected (its name will be shown in parentheses).
  4. Make sure the Device Units selection is set to V (Volts).
  5. If you're using SignalScope's signal generator for a reference source, make sure Audio Play Through is switched off.
  6. Tap Calibrate.
  7. The Calibration view displays the current measured input voltage both as a numeric value and in a horizontal bar meter. If the current measured input voltage agrees with the reference voltage you have applied to the input (and there's no indication of input clipping), then you're done; your input device already has the correct voltage sensitivity.
  8. Make sure your input is not clipping (you should not see the word "clip" displayed with a red box around it in the upper left-hand corner of the screen). If it is, you will need to decrease the voltage of your reference input signal. When the input signal is no longer clipping, the red box around the word "clip" will disappear.
  9. Enter the rms voltage of the reference input signal into the Ref. Input Level text box.
  10. Wait a few seconds for the measured level to stabilize.
  11. Tap the Calibrate button.
  12. Confirm that you would like to Calibrate.
  13. Now check to see that the measured input level closely matches the level of the reference signal. If it does, your device is properly calibrated.
    • Tapping the Calibrate button causes SignalScope to automatically calculate a proper sensitivity (in volts per full-scale input, or V/FS) for the input device, based on the current measured input level and the Ref. Input Level value.
  • It is possible to adjust the input gain of some external input devices. By default, the device's existing input gain (as set before SignalScope was launched) is preserved. Once you change the gain in SignalScope, it will keep that gain setting until you change it again. Changing the gain setting will generally require adjusting or re-calibrating the input voltage sensitivity.
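The sensitivity calculation described in the steps above can be sketched as follows. This is an illustrative Python sketch with hypothetical names; the computation inside SignalScope may differ in detail:

```python
def calibrated_sensitivity(ref_vrms, measured_rms_fs):
    """
    Compute an input sensitivity in volts per full scale (V/FS).

    ref_vrms        -- known rms voltage of the reference signal
                       (the Ref. Input Level value)
    measured_rms_fs -- rms level the device actually measured, as a
                       fraction of digital full scale (0.0 .. 1.0)
    """
    if not 0.0 < measured_rms_fs <= 1.0:
        # A clipped or silent input can't yield a valid calibration.
        raise ValueError("input is clipping or silent; fix the signal first")
    return ref_vrms / measured_rms_fs

# Example: a 1.0 Vrms reference reads 0.25 of full scale,
# so digital full scale corresponds to 4.0 V.
print(calibrated_sensitivity(1.0, 0.25))  # 4.0
```

Once this sensitivity is stored, any measured full-scale fraction can be multiplied by it to recover the input voltage, which is why the measured level should closely match the reference level after calibration.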

    The Gain setting can be used to adjust the measurement range. To measure higher signal levels, choose a lower gain setting.
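As a rough illustration of why a lower gain setting extends the measurement range, consider how gain (in dB) scales the voltage that reaches digital full scale. The numbers below are hypothetical; consult your device's specifications for real values:

```python
def full_scale_volts(base_sensitivity_v_fs, gain_db):
    """
    Approximate full-scale input voltage at a given gain setting.

    base_sensitivity_v_fs -- hypothetical sensitivity (V/FS) at 0 dB gain
    gain_db               -- gain applied before the converter, in dB

    Lowering the gain by about 6 dB roughly doubles the voltage
    needed to reach full scale, extending the measurement range.
    """
    return base_sensitivity_v_fs * 10 ** (-gain_db / 20)

base = 1.0  # hypothetical: 1 V reaches full scale at 0 dB gain
print(full_scale_volts(base, 0))    # 1.0
print(full_scale_volts(base, -6))   # ~2.0 (lower gain -> higher range)
```

This is also why changing the gain invalidates a previous calibration: the same input voltage now lands at a different fraction of full scale.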