Step-by-Step Solution
Step 1: Understand the Problem
We have a bulb rated at 220 volts and 100 watts. The supply voltage to this bulb decreases by 2.5% of its rated value (i.e., 2.5% of 220 V). We need to find the resulting percentage decrease in the bulb’s power output relative to its rated power.
Step 2: Recall the Power-Voltage Relationship for a Constant Resistance
If the resistance of the bulb is constant, the power consumed by the bulb is given by
$P = \frac{V^2}{R}$.
Here, $P$ is the power in watts, $V$ is the potential difference across the bulb, and $R$ is its resistance.
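For concreteness, the rated values fix the bulb's (assumed constant) resistance:
$ R = \frac{V^2}{P} = \frac{(220\ \text{V})^2}{100\ \text{W}} = 484\ \Omega. $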
Step 3: Expressing the Fractional Change in Power
For small changes, the percentage (or fractional) change in power can be derived from the differential relationship:
$ \frac{\Delta P}{P} \approx 2 \times \frac{\Delta V}{V}\,(\text{if }R\text{ is constant}).$
Since $R$ does not change, $\Delta R = 0$, and because $P$ varies as $V^2$, a small fractional change in voltage produces twice that fractional change in power.
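The factor of 2 can be made explicit by taking logarithms of $P = \frac{V^2}{R}$ and differentiating:
$ \ln P = 2\ln V - \ln R \quad\Longrightarrow\quad \frac{\Delta P}{P} \approx 2\,\frac{\Delta V}{V} - \frac{\Delta R}{R}. $
Setting $\Delta R = 0$ leaves only the doubled voltage term.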
Step 4: Substitute the Given Percentage
Given that the voltage decreases by 2.5% of its rated value:
$ \frac{|\Delta V|}{V} = 2.5\%. $
Thus,
$ \frac{|\Delta P|}{P} \approx 2 \times 2.5\% = 5\%. $
Step 5: Conclude the Result
To first order, then, the power output drops by 5% of its rated value when the supply voltage drops by 2.5% of its rated value.
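As a quick numerical check, here is a minimal Python sketch (the variable names are illustrative, not from the original problem) comparing the exact power drop at constant resistance with the 5% first-order estimate:

```python
# Rated values of the bulb
V_rated = 220.0   # volts
P_rated = 100.0   # watts

# Resistance implied by the rating, assumed constant
R = V_rated**2 / P_rated          # 484 ohms

# Supply voltage after a 2.5% drop
V_new = V_rated * (1 - 0.025)     # 214.5 V

# Exact power at the reduced voltage
P_new = V_new**2 / R              # ~95.06 W

exact_drop = (P_rated - P_new) / P_rated * 100   # percentage drop
linear_drop = 2 * 2.5                            # first-order estimate

print(f"Exact drop:  {exact_drop:.4f} %")   # 4.9375 %
print(f"Linear drop: {linear_drop:.1f} %")  # 5.0 %
```

The exact drop, $1 - 0.975^2 = 4.9375\%$, confirms that the 5% linear estimate is accurate for a small voltage change.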