Biomedical Engineering > GATE 2023 > Measurement and Instrumentation
A 20 mV DC signal is superimposed with 10 mV RMS band-limited Gaussian noise having a flat spectrum up to 5 kHz. If an integrating voltmeter is used to measure this DC signal, what is the minimum averaging time (in seconds) required to yield a 99% accurate result with 95% certainty?
A. 0.1
B. 1.0
C. 5.0
D. 10.0

Correct Answer: B (1.0 s)

Worked solution: 99% accuracy allows an error of 1% of 20 mV, i.e. 0.2 mV. For 95% certainty this error budget must cover about two standard deviations of the residual noise, so the averaged noise RMS must be reduced to 0.1 mV. Averaging noise of bandwidth B over a time T reduces its RMS by a factor of √(2BT) (there are 2BT independent samples at the Nyquist rate), so 2BT = (10 mV / 0.1 mV)² = 10⁴, giving T = 10⁴ / (2 × 5000 Hz) = 1.0 s.
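The arithmetic above can be checked with a short script. It assumes the standard relation for averaging band-limited white noise, σ_avg = σ/√(2BT), and a 2σ bound for the 95% confidence requirement; both are conventional textbook choices, not stated explicitly in the question.

```python
import math

# Given values from the question
V_dc = 20e-3    # DC signal, volts
sigma = 10e-3   # noise RMS, volts
B = 5e3         # noise bandwidth, Hz

# 99% accuracy -> allowed error is 1% of the DC value (0.2 mV)
allowed_error = 0.01 * V_dc

# 95% certainty -> error bound ~ 2 standard deviations of the
# averaged noise (two-sided Gaussian 95% interval)
k = 2.0
sigma_required = allowed_error / k  # 0.1 mV

# Averaging over time T leaves an RMS of sigma / sqrt(2*B*T)
# (2*B*T independent samples at the Nyquist rate); solve
# sigma / sqrt(2*B*T) = sigma_required for T.
T = (sigma / sigma_required) ** 2 / (2 * B)
print(T)  # -> 1.0 (seconds)
```

Using the exact 95% factor k = 1.96 instead of 2 gives T ≈ 0.96 s, which still rounds to option B among the choices offered.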

Similar Questions

- Which one of the following bridges CANNOT be used for measuring inductance? (#93, MCQ)
- The resistance of a thermistor is 1 kΩ at 25°C and 500 Ω at 50°C. Find the temperature coefficient of resistance (in units of °C⁻¹) at 3... (#126, NAT)
- A metallic strain gauge with negligible piezoresistive effect is subjected to a strain of 50 × 10⁻⁶. For the metal, Young's Modulus = 80 GPa and Poisson's Ratio... (#133, NAT)

