
Microphone Sensitivity

Sensitivity, the ratio of the analog output voltage or digital output value to the input pressure, is a key metric for any microphone. For a known acoustic input, sensitivity determines how acoustic-domain units map to electrical-domain units, and therefore the magnitude of the microphone's output signal. This article discusses the differences in sensitivity specifications between analog and digital microphones, how to choose the best microphone for your application, and why adding a bit (or more) of digital gain can enhance the microphone signal.
Analog and Digital Microphones
Microphone sensitivity is typically measured with a 1 kHz sine wave at a sound pressure level (SPL) of 94 dB, which corresponds to a pressure of 1 pascal (Pa). The magnitude of the microphone's analog or digital output signal under this input excitation is a measure of its sensitivity. This reference point is only one characteristic of the microphone and does not represent the entirety of its performance.
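The relationship between dB SPL and pascals above can be checked numerically. The sketch below (function name is illustrative) uses the standard acoustic reference pressure of 20 µPa to show that 94 dB SPL is very nearly 1 Pa:

```python
import math

P_REF = 20e-6  # 20 µPa, the standard reference pressure for dB SPL


def spl_to_pascal(spl_db: float) -> float:
    """Convert a sound pressure level in dB SPL to pressure in pascals."""
    return P_REF * 10 ** (spl_db / 20.0)


# 94 dB SPL comes out to roughly 1 Pa (about 1.002 Pa)
print(round(spl_to_pascal(94.0), 3))
```

This is why 94 dB SPL, rather than a round 100 dB, is the conventional sensitivity test level: it is the SPL that corresponds to 1 Pa.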
The sensitivity of an analog microphone is straightforward to understand. It is generally specified in the logarithmic unit dBV (decibels relative to 1 V) and indicates the output voltage the microphone produces at the reference SPL. For analog microphones, sensitivity expressed in linear units (mV/Pa) can be converted to a logarithmic decibel value:

Sensitivity (dBV) = 20 × log10( Sensitivity (mV/Pa) / 1000 mV/Pa )

where 1000 mV/Pa (1 V/Pa) is the reference output level for 0 dBV.
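The mV/Pa-to-dBV conversion can be sketched in a few lines (the function name and the 25.1 mV/Pa sample value are illustrative, not from a specific datasheet):

```python
import math


def sensitivity_dbv(sensitivity_mv_per_pa: float) -> float:
    """Convert analog microphone sensitivity from mV/Pa to dBV.

    The reference is 1000 mV/Pa (1 V/Pa), i.e. 0 dBV output at 94 dB SPL.
    """
    return 20.0 * math.log10(sensitivity_mv_per_pa / 1000.0)


# e.g. a microphone rated at 25.1 mV/Pa is about -32 dBV
print(round(sensitivity_dbv(25.1), 1))
```

Note that analog microphone sensitivities come out negative in dBV, since real microphones produce far less than 1 V of output at 94 dB SPL.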
With this information and the correct preamp gain, it is easy to match the microphone signal level to the target input level of the circuit or the rest of the system. Figure 1 shows how to match the microphone's peak output voltage (VMAX) to the ADC's full-scale input voltage (VIN) with a gain of VIN/VMAX. For example, with a gain of 4 (12 dB), an ADMP504 with a maximum output voltage of 0.25 V can be matched to an ADC with a full-scale peak input voltage of 1.0 V.
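The gain calculation above can be sketched as a small helper (the function name is illustrative) that returns both the linear gain VIN/VMAX and its decibel equivalent:

```python
import math


def preamp_gain(adc_full_scale_v: float, mic_peak_v: float):
    """Return (linear gain, gain in dB) needed to map the microphone's
    peak output voltage to the ADC's full-scale input voltage."""
    gain = adc_full_scale_v / mic_peak_v
    gain_db = 20.0 * math.log10(gain)
    return gain, gain_db


# The example from the text: 0.25 V mic peak into a 1.0 V full-scale ADC
gain, gain_db = preamp_gain(1.0, 0.25)
print(gain, round(gain_db, 1))
```

Running this reproduces the figures in the text: a linear gain of 4, or about 12 dB.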


Post time: Aug-11-2022