WarpedWatch
Hi all,
Through my tears, I'm trying to compare two acoustic emission sensors. The first sensor has a specification that says its sensitivity is −150 dBre 1V/μPa. The other sensor says its sensitivity is -70 dB ref 1V/μbar. (Note the different units.)
Wikipedia tells me that 1 Pa = 10^−5 bar, so 1 μbar = 0.1 Pa.
Okay, so it's been a long, long time since I used dB for anything. Frankly, using dB to compare things has never made any sense to me. And with that negative sign thrown in there, I'm really confused. Are they telling me that the sensor provides negative voltage output, or is it a negative sign that gets plugged into the "20 log rule"? (When I plug that negative into the 20 log rule, I get insanely small numbers.) :uhh:
For "20 log rule", I'm looking at this on Wikipedia: en.wikipedia.org/wiki/Decibel
If I ignore the negative and just plug and chug, I get something like this for the first sensor: pressure needed to get 1 volt = 10^(150/20) μPa = 3.16×10^7 μPa = 31.6 Pa,
and this for the second sensor: pressure needed to get 1 volt = 10^(70/20) μbar = 3162 μbar = 316 Pa.
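To sanity-check that plug-and-chug, here's a small sketch (my own, not from either datasheet) that converts each spec to V/Pa and back to "pressure for 1 V". The negative sign stays in the exponent; it just means the output is far below 1 V per reference-pressure unit, not that the voltage itself is negative:

```python
def sens_v_per_pa(db_re_1v, ref_pa):
    """Convert a sensitivity in dB re 1 V per (ref_pa pascals) to V/Pa."""
    return 10 ** (db_re_1v / 20.0) / ref_pa

# Sensor 1: -150 dB re 1 V/uPa  (1 uPa = 1e-6 Pa)
s1 = sens_v_per_pa(-150, 1e-6)   # ~0.0316 V/Pa
# Sensor 2: -70 dB re 1 V/ubar  (1 ubar = 0.1 Pa)
s2 = sens_v_per_pa(-70, 0.1)     # ~0.00316 V/Pa

print(s1, 1 / s1)  # ~0.0316 V/Pa  -> ~31.6 Pa needed for 1 V
print(s2, 1 / s2)  # ~0.00316 V/Pa -> ~316 Pa needed for 1 V
```

Keeping the negative sign gives the same answer either way: you get a small V/Pa number, and its reciprocal is the pressure for 1 V. Note the first sensor needs less pressure per volt, so it's the more sensitive of the two.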
They seem to be off by an order of magnitude, but ... I noticed other sensors, which happen to have integral pre-amps, have a sensitivity rated at −24 dB re 1V/μbar. So what to do? Does anybody understand how this sort of specification is used? I'm trying to figure out how much voltage I can expect the sensors to output for some input pressure. I expect the output to be in the millivolt range, but...
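For the voltage-vs-pressure question, this little sketch works the spec forward instead of backward. The 1 Pa input is just an illustrative value I picked, not anything from the datasheets:

```python
def output_mv(db_re_1v, ref_pa, p_pa):
    """Expected output in mV for a pressure p_pa (in Pa),
    given a sensitivity in dB re 1 V per (ref_pa pascals)."""
    return 10 ** (db_re_1v / 20.0) * (p_pa / ref_pa) * 1e3

print(output_mv(-150, 1e-6, 1.0))  # sensor 1: ~31.6 mV at 1 Pa
print(output_mv(-70, 0.1, 1.0))    # sensor 2: ~3.2 mV at 1 Pa
```

So at a 1 Pa input, both specs do land in the millivolt range, which at least matches my expectation.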
many thanks,
Mark