Step-by-Step Solution
1. Understanding the Problem
We have a 200 W bulb designed to operate at 100 V. Now, the supply voltage is 200 V, but we still want the bulb to deliver its rated power (200 W). We must add an external resistor in series so that the voltage across the bulb remains at 100 V, ensuring it operates at exactly 200 W.
2. Known Data and Relevant Formulae
Power of the bulb, P = 200\,\text{W}
Rated voltage for the bulb, V_B = 100\,\text{V}
Supply voltage, V_S = 200\,\text{V}
Power formula for a resistor (or bulb): P = \frac{V^2}{R}
3. Determining the Bulb’s Resistance
The bulb’s resistance R_B when operating at its rated voltage and power can be found using the relation:
R_B = \frac{V_B^2}{P} = \frac{100^2}{200} = \frac{10{,}000}{200} = 50\,\Omega.
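As a quick numerical check, here is a minimal Python sketch of this step; the variable names (P_rated, V_bulb, R_bulb) are chosen for illustration and are not part of the original solution.

```python
# Rated power and rated voltage of the bulb (from the problem statement)
P_rated = 200.0   # W
V_bulb = 100.0    # V

# Bulb resistance at its rated operating point: R = V^2 / P
R_bulb = V_bulb**2 / P_rated
print(R_bulb)     # 50.0 ohms
```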
4. Designing the Series Resistor
We still want 100 V across the bulb even though the total supply is 200 V. Therefore, the remaining 100 V (since 200 - 100 = 100) must drop across the series resistor R.
Since the same current flows through both the bulb and the series resistor (series circuit), we have:
I = \frac{V_B}{R_B} = \frac{100}{50} = 2\,\text{A}.
Using Ohm’s law for the resistor R:
V_R = I \times R \quad \Longrightarrow \quad 100 = 2 \times R \quad \Longrightarrow \quad R = \frac{100}{2} = 50\,\Omega.
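Continuing the sketch from step 3, a short Python check of the series-resistor sizing (again with illustrative variable names):

```python
# Supply voltage and values carried over from step 3
V_supply = 200.0          # V
R_bulb = 50.0             # ohms (bulb resistance from step 3)
V_bulb = 100.0            # V (rated voltage we want across the bulb)

# Series circuit: the same current flows through bulb and resistor
I = V_bulb / R_bulb       # 100 / 50 = 2.0 A

# The series resistor must drop the remaining supply voltage
V_R = V_supply - V_bulb   # 200 - 100 = 100.0 V
R_series = V_R / I        # 100 / 2 = 50.0 ohms
print(I, V_R, R_series)
```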
5. Final Answer
Hence, the resistance that must be put in series with the bulb so that it still delivers 200 W at 100 V is:
R = 50\,\Omega.
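As a final sanity check (a sketch, not part of the original solution), we can confirm that a 50 Ω resistor in series with the 50 Ω bulb across the 200 V supply does put 100 V and 200 W on the bulb:

```python
# Verify rated operation with the designed series resistor
R_bulb, R_series, V_supply = 50.0, 50.0, 200.0

I = V_supply / (R_bulb + R_series)   # 200 / 100 = 2.0 A
V_bulb = I * R_bulb                  # 2.0 * 50 = 100.0 V
P_bulb = I**2 * R_bulb               # 2^2 * 50 = 200.0 W
print(V_bulb, P_bulb)                # 100.0 V and 200.0 W, as required
```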