A standard for interpreting the control signal input to a VCO or VCF. In the volts/Hz standard, a given change in the input voltage produces a proportional change in the frequency, or cycles/second value (Hz), of the circuit. For example, if a 1V input results in a 1000 Hz setting, then 2V yields 2000 Hz, 3V yields 3000 Hz, 4V yields 4000 Hz, and so on. The volts/Hz method is easier to implement in hardware than the volts/octave method, but it has two main faults. Since the scaling doesn't correspond to musical octaves, to be useful each source of control voltage has to be capable of producing varying increments: for instance, raising a pitch by an octave means doubling the frequency, and therefore doubling the voltage, so the amount by which you have to increase the control voltage depends on its present value. Also, if large intervals are used, the volts/Hz method can require rather high voltages to represent the higher frequencies. This can be dangerous, and it also causes difficulties with power supply design and circuit implementation (the upper limit for what most integrated circuits used in synths will tolerate is about ±20V). Compare with volts/octave.
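The arithmetic behind the two scalings can be sketched as follows. This is a minimal illustration, not actual synth firmware; the 1V = 1000 Hz scaling and the 1000 Hz volts/octave reference are hypothetical values chosen to match the example above:

```python
def hz_from_volts_per_hz(v, hz_per_volt=1000.0):
    """Volts/Hz: frequency is directly proportional to voltage."""
    return v * hz_per_volt

def hz_from_volts_per_octave(v, base_hz=1000.0):
    """Volts/octave: each additional volt doubles the frequency."""
    return base_hz * 2.0 ** v

# Raising the pitch one octave under volts/Hz: the voltage must double,
# so the required increase depends on the current value (1 V at 1 V, 4 V at 4 V).
assert hz_from_volts_per_hz(2.0) == 2 * hz_from_volts_per_hz(1.0)
assert hz_from_volts_per_hz(8.0) == 2 * hz_from_volts_per_hz(4.0)

# Under volts/octave, adding a fixed 1 V always raises the pitch one octave.
assert hz_from_volts_per_octave(3.0) == 2 * hz_from_volts_per_octave(2.0)

# Five octaves above 1000 Hz (32000 Hz) needs 32 V under volts/Hz,
# beyond the roughly +/-20 V tolerance of typical synth ICs,
# but only 5 V under volts/octave.
```

The second flaw in the definition follows directly: because voltage grows linearly with frequency in volts/Hz, the voltage needed for a wide frequency range grows just as fast, whereas in volts/octave it grows only with the number of octaves.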